[FEAT] Add example of modifying the default configure_optimizers() behavior (use of ReduceLROnPlateau scheduler) #1015
Conversation
Hey @JQGoh, thanks a lot for trying this approach! Although I think it's fine to add more arguments for now, we can give this a shot when we work towards v2.0. |
In that case, I shall put this on hold then. |
Force-pushed 3924876 to eca5d9b
nbdev_clean --clear_all remove unnecessary changes
Force-pushed eca5d9b to 2e3d7c6
@jmoralez @elephaint @marcopeix |
This breaking change should fix #1096
lightning by passing the specified function call
@jmoralez I have modified it based on your suggestion. |
Thanks for your tips on using a subclass approach to customize the default optimizer behavior. Moving forward, I can see a few practical ways to support NeuralForecast users here.
Options 2 & 3 would introduce a breaking change and require us to bump NeuralForecast to version 3.0.0. Personally, I prefer option 1, but I would like to hear feedback from the Nixtla team on this. |
Thanks. Option 1 sounds good. |
This reverts commit c5f20a1.
@elephaint @jmoralez Would you mind helping to review this? It provides an example of how to address #1096 |
Rationale
- Adds an example of modifying the default `configure_optimizers()` behavior and of how we can use the `ReduceLROnPlateau` scheduler during optimization.
- Instead of adding arguments to all the models and passing them from the NeuralForecast class, we introduce a `set_configure_optimizers` function for the BaseModel class so that an individual model can overwrite the default `configure_optimizers()` behavior.
- Note that this requires users to specify both `optimizer` and `scheduler` to have an effective optimization configuration; otherwise it falls back to the default option.
- This should deprecate the earlier work introduced in #998.
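As a rough sketch of the idea described above, the callable passed to `set_configure_optimizers` could build the optimizer together with a `ReduceLROnPlateau` scheduler and return them in the dictionary shape PyTorch Lightning expects. The function name `my_configure_optimizers` and the monitored metric `"train_loss"` are illustrative assumptions, not part of the PR itself:

```python
import torch

def my_configure_optimizers(model):
    # Hypothetical override: both optimizer and scheduler are specified,
    # as required for the custom configuration to take effect.
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
        optimizer, mode="min", factor=0.5, patience=2
    )
    # Lightning expects a dict with a "monitor" key when the scheduler
    # steps on a monitored metric (metric name assumed here).
    return {
        "optimizer": optimizer,
        "lr_scheduler": {"scheduler": scheduler, "monitor": "train_loss"},
    }

# Toy module standing in for a NeuralForecast model
model = torch.nn.Linear(4, 1)
config = my_configure_optimizers(model)
```

A model instance could then register this callable via `set_configure_optimizers` so that Lightning picks up the custom optimizer/scheduler pair instead of the default.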