Comments (4)
As a callback, you proposed this function, which updates the lr argument inside the partial wrapper:
import functools

def partial_setattr(fn: functools.partial, key: str, value: float) -> None:
    *_, (f, args, kwargs, n) = fn.__reduce__()
    kwargs[key] = value
    fn.__setstate__((f, args, kwargs, n))
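For reference, a minimal standalone sketch of what this does: it reaches into the partial's pickling state, mutates the keywords dict, and writes the state back onto the same object (the scale function below is just a hypothetical stand-in):

```python
import functools

def partial_setattr(fn: functools.partial, key: str, value: float) -> None:
    # __reduce__ exposes the partial's state tuple: (func, args, keywords, namespace)
    *_, (f, args, kwargs, n) = fn.__reduce__()
    kwargs[key] = value
    # __setstate__ writes the modified state back onto the same partial object
    fn.__setstate__((f, args, kwargs, n))

def scale(x, factor=1.0):  # hypothetical stand-in for an optimizer class
    return x * factor

fn = functools.partial(scale, factor=2.0)
partial_setattr(fn, "factor", 3.0)
print(fn(2))        # 6.0
print(fn.keywords)  # {'factor': 3.0}
```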
My question is: why couldn't you define your configure_optimizers() this way (pseudocode)?
class LitModel:
    def __init__(self, optimizer_cls):
        self.optimizer_cls = optimizer_cls
        self.learning_rate = None  # or a default value

    def configure_optimizers(self):
        if self.learning_rate is not None:
            partial_setattr(self.optimizer_cls, "lr", self.learning_rate)
        return self.optimizer_cls(self.parameters())
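To illustrate the suggested flow without pulling in Lightning or torch, here is a self-contained sketch with stand-in classes; FakeOptim and the manual assignment to model.learning_rate (simulating what the Tuner would do) are hypothetical, not real library APIs:

```python
import functools

def partial_setattr(fn, key, value):
    # Rewrite one keyword argument of an existing functools.partial in place.
    *_, (f, args, kwargs, n) = fn.__reduce__()
    kwargs[key] = value
    fn.__setstate__((f, args, kwargs, n))

class FakeOptim:
    # Stand-in for torch.optim.Adam: just records its params and lr.
    def __init__(self, params, lr=0.001):
        self.params, self.lr = params, lr

class LitModel:
    def __init__(self, optimizer_cls):
        self.optimizer_cls = optimizer_cls
        self.learning_rate = None

    def parameters(self):
        return []  # stand-in for nn.Module.parameters()

    def configure_optimizers(self):
        if self.learning_rate is not None:
            partial_setattr(self.optimizer_cls, "lr", self.learning_rate)
        return self.optimizer_cls(self.parameters())

model = LitModel(functools.partial(FakeOptim, lr=0.001))
model.learning_rate = 0.01  # what the tuner would write after its search
opt = model.configure_optimizers()
print(opt.lr)  # 0.01
```

The point of the sketch: the tuner only ever touches the plain learning_rate attribute, and configure_optimizers() is the single place that pushes it into the partial before instantiation.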
from pytorch-lightning.
@arthurdjn Thanks for the feature request.
Currently I don't understand the use case 100%. My question is: since the contract between the Trainer and the LightningModule is that the configure_optimizers() hook returns the optimizer, what would that implementation look like in your case? This is where you'd normally configure the learning rate. In other words, why couldn't that special callable you have be applied there?
The attribute the tuner saves the new learning rate to is only a temporary holder in that sense. Ultimately, it needs to be set in configure_optimizers().
The use case is that I don't want to pass the learning rate as an attribute to the lightning module, instead I want to pass the partially instantiated optimizer. With this approach, it is not possible to use the Tuner or LearningRateFinder.
import functools
from torch.optim import Adam
from pytorch_lightning import Trainer
from pytorch_lightning.tuner import Tuner

# Using the custom LitModel with a partial optimizer instead of an attr-defined learning rate
optimizer = functools.partial(Adam, lr=0.001)
model = LitModel(optimizer)
trainer = Trainer()
tuner = Tuner(trainer)

# This does not work
lr_finder = tuner.lr_find(model, attr_name=???)  # Not possible to access the learning rate
Maybe I am missing something, but I think this use case is a current limitation of the API thus this feature proposal.
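To make the limitation concrete: with this setup, the learning rate lives inside the partial's keywords dict, not as a model attribute that an attr_name string could name (fake_adam below is a hypothetical stand-in for an optimizer constructor):

```python
import functools

def fake_adam(params, lr=0.001):  # hypothetical stand-in for torch.optim.Adam
    return {"params": params, "lr": lr}

optimizer = functools.partial(fake_adam, lr=0.001)

# The learning rate is only reachable through the partial object itself...
print(optimizer.keywords["lr"])  # 0.001
# ...not via getattr(model, "<attr_name>"), which is all a string attr_name can express.
```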
Well, I agree that your solution is pretty simple and clean. I didn't want to have an extra learning_rate attribute in the model, to avoid storing the same value in different places; I thought that providing the setter function directly to the Tuner might be a clearer alternative. Still, your suggestion is a good one.
What do you think?