lightning-ai / lightning-habana
Lightning support for Intel Habana accelerators.
License: Apache License 2.0
pytorch_lightning
Users who install this package get lightning installed automatically:
pip install pytorch_lightning
pip install lightning-habana
pip list | grep lightning
lightning-habana should use the existing lightning implementation rather than installing another one. This means it requires that lightning be installed first.
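A quick way to see which lightning distributions actually ended up installed, using only the standard library (a hypothetical check for illustration, not part of this repo):

```python
# List all installed distributions whose name mentions "lightning",
# to spot a duplicate lightning install next to pytorch_lightning.
from importlib import metadata

lightning_dists = sorted(
    dist.metadata["Name"]
    for dist in metadata.distributions()
    if "lightning" in (dist.metadata["Name"] or "").lower()
)
print(lightning_dists)  # e.g. ['lightning', 'lightning-habana', 'pytorch-lightning']
```

If both `lightning` and `pytorch-lightning` appear here after installing only `pytorch_lightning` and `lightning-habana`, the plugin pulled in a second implementation.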
cc @Borda
Running multiple agents on a single bare-metal machine. As most of the tests are not compute-intensive, this should not limit throughput or increase the execution time...
Boost the CI by adding new agents to process the queue while preserving the cost.
We would change the multi-card scenario so that we don't run on all 8 cards, assuming that 2-card validation is sufficient to preserve the package quality.
HPU allows setting card visibility, so for each agent/flow we can restrict visibility to particular card indexes, see: https://docs.habana.ai/en/latest/PyTorch/PT_Multiple_Tenants_on_HPU/index.html
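A sketch of how two agents could split the cards, assuming the `HABANA_VISIBLE_MODULES` variable described in the linked Multiple Tenants guide (the card indexes here are illustrative):

```shell
# Agent 1: restrict this flow to cards 0 and 1
export HABANA_VISIBLE_MODULES=0,1

# Agent 2 would run in a separate shell with e.g.:
#   export HABANA_VISIBLE_MODULES=2,3

# Processes launched from this shell now see only the listed modules
echo "visible modules: $HABANA_VISIBLE_MODULES"
```

With 8 cards and 2-card validation, up to four such agents could drain the queue in parallel on one machine.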
LightningCLI support for external accelerators
LightningCLI helps avoid boilerplate code for command-line tools. The current implementation does not seem to support external accelerators; it only accepts the accelerators present in the lightning source.
Extend support for external accelerators in LightningCLI.
First mentioned in #54
from lightning.pytorch import Trainer
from lightning.pytorch.cli import LightningCLI
from lightning.pytorch.demos.boring_classes import BoringModel

from lightning_habana import HPUAccelerator


class BMAccelerator(BoringModel):
    def on_fit_start(self):
        assert isinstance(self.trainer.accelerator, HPUAccelerator), self.trainer.accelerator


model = BMAccelerator
accelerator = HPUAccelerator()

if __name__ == "__main__":
    # Method 1: passing a supported accelerator class instance from an external library
    cli = LightningCLI(model, trainer_defaults={"accelerator": accelerator})
    # Method 2: passing the accelerator as a string
    cli = LightningCLI(model, trainer_defaults={"accelerator": "hpu"})
Running these gives the following tracebacks:
Traceback (most recent call last):
File "temp.py", line 34, in <module>
cli = LightningCLI(model, trainer_defaults={'accelerator': HPUAccelerator()})
File "/home/agola/anaconda3/envs/plt_3.8/lib/python3.8/site-packages/lightning/pytorch/cli.py", line 353, in __init__
self._run_subcommand(self.subcommand)
File "/home/agola/anaconda3/envs/plt_3.8/lib/python3.8/site-packages/lightning/pytorch/cli.py", line 642, in _run_subcommand
fn(**fn_kwargs)
File "/home/agola/anaconda3/envs/plt_3.8/lib/python3.8/site-packages/lightning/pytorch/trainer/trainer.py", line 520, in fit
call._call_and_handle_interrupt(
File "/home/agola/anaconda3/envs/plt_3.8/lib/python3.8/site-packages/lightning/pytorch/trainer/call.py", line 44, in _call_and_handle_interrupt
return trainer_fn(*args, **kwargs)
File "/home/agola/anaconda3/envs/plt_3.8/lib/python3.8/site-packages/lightning/pytorch/trainer/trainer.py", line 559, in _fit_impl
self._run(model, ckpt_path=ckpt_path)
File "/home/agola/anaconda3/envs/plt_3.8/lib/python3.8/site-packages/lightning/pytorch/trainer/trainer.py", line 893, in _run
self.strategy.setup_environment()
File "/home/agola/anaconda3/envs/plt_3.8/lib/python3.8/site-packages/lightning/pytorch/strategies/strategy.py", line 127, in setup_environment
self.accelerator.setup_device(self.root_device)
File "/home/agola/lightning-habana-fork/src/lightning_habana/pytorch/accelerator.py", line 50, in setup_device
raise MisconfigurationException(f"Device should be HPU, got {device} instead.")
lightning.fabric.utilities.exceptions.MisconfigurationException: Device should be HPU, got cpu instead.
Traceback (most recent call last):
File "temp.py", line 33, in <module>
cli = LightningCLI(model, trainer_defaults={'accelerator': "hpu"})
File "/home/agola/anaconda3/envs/plt_3.8/lib/python3.8/site-packages/lightning/pytorch/cli.py", line 353, in __init__
self._run_subcommand(self.subcommand)
File "/home/agola/anaconda3/envs/plt_3.8/lib/python3.8/site-packages/lightning/pytorch/cli.py", line 642, in _run_subcommand
fn(**fn_kwargs)
File "/home/agola/anaconda3/envs/plt_3.8/lib/python3.8/site-packages/lightning/pytorch/trainer/trainer.py", line 520, in fit
call._call_and_handle_interrupt(
File "/home/agola/anaconda3/envs/plt_3.8/lib/python3.8/site-packages/lightning/pytorch/trainer/call.py", line 44, in _call_and_handle_interrupt
return trainer_fn(*args, **kwargs)
File "/home/agola/anaconda3/envs/plt_3.8/lib/python3.8/site-packages/lightning/pytorch/trainer/trainer.py", line 559, in _fit_impl
self._run(model, ckpt_path=ckpt_path)
File "/home/agola/anaconda3/envs/plt_3.8/lib/python3.8/site-packages/lightning/pytorch/trainer/trainer.py", line 916, in _run
call._call_lightning_module_hook(self, "on_fit_start")
File "/home/agola/anaconda3/envs/plt_3.8/lib/python3.8/site-packages/lightning/pytorch/trainer/call.py", line 142, in _call_lightning_module_hook
output = fn(*args, **kwargs)
File "temp.py", line 15, in on_fit_start
assert isinstance(self.trainer.accelerator,
AssertionError: <lightning.pytorch.accelerators.hpu.HPUAccelerator object at 0x7f37f62917c0>
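The second failure looks like a class-identity mismatch rather than a missing device: the assertion receives `lightning.pytorch.accelerators.hpu.HPUAccelerator` (resolved from the string "hpu") while the script imported `lightning_habana.HPUAccelerator`. A minimal sketch of why `isinstance` fails for two same-named classes from different modules (plain Python, class names are stand-ins):

```python
# isinstance() compares class objects by identity, not by name,
# so a built-in HPUAccelerator and an external one never match.

class BuiltinHPUAccelerator:   # stands in for lightning.pytorch.accelerators.hpu.HPUAccelerator
    pass

class ExternalHPUAccelerator:  # stands in for lightning_habana.HPUAccelerator
    pass

# What trainer_defaults={"accelerator": "hpu"} resolves to:
accelerator = BuiltinHPUAccelerator()

# The user's assert checks against the external class, so it fails:
print(isinstance(accelerator, ExternalHPUAccelerator))  # False
print(isinstance(accelerator, BuiltinHPUAccelerator))   # True
```

This suggests the fix needs LightningCLI (or the Trainer's string lookup) to resolve "hpu" to the externally registered accelerator class, not a bundled one.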
Environment:
lightning            2.0.0
lightning-fabric     2.0.3
lightning-habana     1.0.0
lightning-utilities  0.9.0
pytorch-lightning    2.0.5