
lightning-habana's People

Contributors

ankitgola005, borda, dependabot[bot], jerome-habana, jyothisambolu, pre-commit-ci[bot]

lightning-habana's Issues

Forced lightning install

๐Ÿ› Bug

pytorch_lightning users who install this package get lightning installed automatically as well.

To Reproduce

pip install pytorch_lightning
pip install lightning-habana
pip list | grep lightning

Expected behavior

lightning-habana should use the existing lightning implementation rather than installing one of its own.
This means a lightning package would have to be installed first.
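
A minimal sketch of a runtime check the integration could use to pick up whichever Lightning flavour is already installed instead of hard-requiring lightning (illustrative only; the helper name resolve_lightning is hypothetical and not part of the package):

import importlib.util

def resolve_lightning():
    # Prefer whichever Lightning flavour the user already has installed.
    if importlib.util.find_spec("lightning") is not None:
        import lightning.pytorch as pl
    elif importlib.util.find_spec("pytorch_lightning") is not None:
        import pytorch_lightning as pl
    else:
        raise ImportError(
            "Install either 'lightning' or 'pytorch_lightning' before lightning-habana."
        )
    return pl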

cc @Borda

extend CI with multiple agents on single machine

🚀 Feature

Run multiple agents on a single bare-metal machine. Since most of the tests are not compute-intensive, this should not noticeably increase their execution time.

Motivation

Speed up CI by adding agents to process the queue while keeping the cost unchanged.

Limitations

We would change the multi-card scenario so that tests no longer run on all 8 cards, on the assumption that validating on 2 cards is sufficient to preserve package quality.

Additional context

HPU allows restricting card visibility, so we can limit each agent/flow to particular card indexes; see: https://docs.habana.ai/en/latest/PyTorch/PT_Multiple_Tenants_on_HPU/index.html
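
A minimal sketch of how each agent could pin itself to two cards before launching its tests, assuming the HABANA_VISIBLE_MODULES environment variable described in the linked multi-tenant docs (the agent-to-card mapping is illustrative):

import os
import subprocess

# Illustrative assignment: each CI agent owns a disjoint pair of HPU indexes.
AGENT_CARDS = {"agent-0": "0,1", "agent-1": "2,3", "agent-2": "4,5", "agent-3": "6,7"}

def run_tests(agent_name: str) -> int:
    env = os.environ.copy()
    # Restrict HPU visibility so this agent only sees its own two cards.
    env["HABANA_VISIBLE_MODULES"] = AGENT_CARDS[agent_name]
    return subprocess.call(["python", "-m", "pytest", "tests/"], env=env)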

LightningCLI support for external accelerators

🚀 Feature

LightningCLI support for external accelerators

Motivation

LightningCLI helps avoid boilerplate code for command-line tools. The current implementation does not seem to support external accelerators; it only accepts the accelerators defined in the lightning source.

Pitch

Extend support for external accelerators in LightningCLI.

Alternatives

Additional context

First mentioned in #54

To reproduce:

from lightning.pytorch.cli import LightningCLI
from lightning.pytorch.demos.boring_classes import BoringModel
from lightning_habana import HPUAccelerator

class BMAccelerator(BoringModel):
    def on_fit_start(self):
        # Verify the external HPUAccelerator is the one the Trainer actually selected.
        assert isinstance(self.trainer.accelerator, HPUAccelerator), self.trainer.accelerator

model = BMAccelerator
accelerator = HPUAccelerator()

if __name__ == "__main__":
    # Run one method at a time; both were tried separately.

    # Method 1: passing an accelerator class instance from the external library
    cli = LightningCLI(model, trainer_defaults={'accelerator': accelerator})

    # Method 2: passing the accelerator as a string
    cli = LightningCLI(model, trainer_defaults={'accelerator': 'hpu'})

Gives the following tracebacks:

Method 1, passing supported accelerator class instance from an external library

Traceback (most recent call last):
  File "temp.py", line 34, in <module>
    cli = LightningCLI(model, trainer_defaults={'accelerator': HPUAccelerator()})
  File "/home/agola/anaconda3/envs/plt_3.8/lib/python3.8/site-packages/lightning/pytorch/cli.py", line 353, in __init__
    self._run_subcommand(self.subcommand)
  File "/home/agola/anaconda3/envs/plt_3.8/lib/python3.8/site-packages/lightning/pytorch/cli.py", line 642, in _run_subcommand
    fn(**fn_kwargs)
  File "/home/agola/anaconda3/envs/plt_3.8/lib/python3.8/site-packages/lightning/pytorch/trainer/trainer.py", line 520, in fit
    call._call_and_handle_interrupt(
  File "/home/agola/anaconda3/envs/plt_3.8/lib/python3.8/site-packages/lightning/pytorch/trainer/call.py", line 44, in _call_and_handle_interrupt
    return trainer_fn(*args, **kwargs)
  File "/home/agola/anaconda3/envs/plt_3.8/lib/python3.8/site-packages/lightning/pytorch/trainer/trainer.py", line 559, in _fit_impl
    self._run(model, ckpt_path=ckpt_path)
  File "/home/agola/anaconda3/envs/plt_3.8/lib/python3.8/site-packages/lightning/pytorch/trainer/trainer.py", line 893, in _run
    self.strategy.setup_environment()
  File "/home/agola/anaconda3/envs/plt_3.8/lib/python3.8/site-packages/lightning/pytorch/strategies/strategy.py", line 127, in setup_environment
    self.accelerator.setup_device(self.root_device)
  File "/home/agola/lightning-habana-fork/src/lightning_habana/pytorch/accelerator.py", line 50, in setup_device
    raise MisconfigurationException(f"Device should be HPU, got {device} instead.")
lightning.fabric.utilities.exceptions.MisconfigurationException: Device should be HPU, got cpu instead.

Method 2, passing accelerator as string

Traceback (most recent call last):
  File "temp.py", line 33, in <module>
    cli = LightningCLI(model, trainer_defaults={'accelerator': "hpu"})
  File "/home/agola/anaconda3/envs/plt_3.8/lib/python3.8/site-packages/lightning/pytorch/cli.py", line 353, in __init__
    self._run_subcommand(self.subcommand)
  File "/home/agola/anaconda3/envs/plt_3.8/lib/python3.8/site-packages/lightning/pytorch/cli.py", line 642, in _run_subcommand
    fn(**fn_kwargs)
  File "/home/agola/anaconda3/envs/plt_3.8/lib/python3.8/site-packages/lightning/pytorch/trainer/trainer.py", line 520, in fit
    call._call_and_handle_interrupt(
  File "/home/agola/anaconda3/envs/plt_3.8/lib/python3.8/site-packages/lightning/pytorch/trainer/call.py", line 44, in _call_and_handle_interrupt
    return trainer_fn(*args, **kwargs)
  File "/home/agola/anaconda3/envs/plt_3.8/lib/python3.8/site-packages/lightning/pytorch/trainer/trainer.py", line 559, in _fit_impl
    self._run(model, ckpt_path=ckpt_path)
  File "/home/agola/anaconda3/envs/plt_3.8/lib/python3.8/site-packages/lightning/pytorch/trainer/trainer.py", line 916, in _run
    call._call_lightning_module_hook(self, "on_fit_start")
  File "/home/agola/anaconda3/envs/plt_3.8/lib/python3.8/site-packages/lightning/pytorch/trainer/call.py", line 142, in _call_lightning_module_hook
    output = fn(*args, **kwargs)
  File "temp.py", line 15, in on_fit_start
    assert isinstance(self.trainer.accelerator,
AssertionError: <lightning.pytorch.accelerators.hpu.HPUAccelerator object at 0x7f37f62917c0>

Env

lightning                     2.0.0
lightning-fabric              2.0.3
lightning-habana              1.0.0
lightning-utilities           0.9.0
pytorch-lightning             2.0.5
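
For comparison, outside of LightningCLI the external accelerator is selected by passing instances directly to the Trainer. A minimal sketch of that non-CLI path, assuming SingleHPUStrategy is importable from lightning_habana.pytorch.strategies as in the package documentation:

from lightning.pytorch import Trainer
from lightning.pytorch.demos.boring_classes import BoringModel
from lightning_habana import HPUAccelerator
from lightning_habana.pytorch.strategies import SingleHPUStrategy  # assumed export path

# Passing explicit instances bypasses the string-based accelerator resolution
# that LightningCLI relies on.
trainer = Trainer(accelerator=HPUAccelerator(), strategy=SingleHPUStrategy(), devices=1)
trainer.fit(BoringModel())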
