PyTorch Lightning: overriding optimizer_step() prevents training_step() from running #11358
-
Hello, I ran into a problem where training_step() was not running at all. I eventually found the cause was my optimizer_step() override: training_step() started running as soon as I commented optimizer_step() out. Other users have hit the same issue, as described here: https://stackoverflow.com/questions/66756245/training-step-not-executing-in-pytorch-lightning My question is: now that training_step() runs, my train_loss explodes because the learning rate scheduler is gone, so what can I implement to re-enable my learning rate scheduler? Here's my chunk of code:
Thank you!
-
hey @tsuijenk

optimizer_closure must be passed to optimizer.step(), since the closure is what runs training_step() and the backward call. You can check the docstrings and examples here: https://pytorch-lightning.readthedocs.io/en/latest/common/lightning_module.html#optimizer-step
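
For illustration, here's a minimal sketch of an optimizer_step() override that forwards the closure. The argument list below is assumed from Lightning ~1.5; the hook's signature has changed across releases, so check the docstring for the version you're on:

```python
import pytorch_lightning as pl


class MyModel(pl.LightningModule):
    # NOTE: this signature is assumed from Lightning ~1.5.x; it differs
    # across versions, so confirm it against your release's docstring.
    def optimizer_step(
        self,
        epoch,
        batch_idx,
        optimizer,
        optimizer_idx=0,
        optimizer_closure=None,
        on_tpu=False,
        using_native_amp=False,
        using_lbfgs=False,
    ):
        # The closure is what actually runs training_step() and backward();
        # calling optimizer.step() without it is why training_step() never ran.
        optimizer.step(closure=optimizer_closure)
```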
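
And to get the learning rate scheduler back: unless you need custom per-step logic, you can drop the optimizer_step() override entirely and let Lightning drive a scheduler returned from configure_optimizers(). A minimal sketch, with Adam, StepLR, and all hyperparameters chosen purely for illustration:

```python
import torch
import pytorch_lightning as pl


class MyModel(pl.LightningModule):
    def configure_optimizers(self):
        # Placeholder optimizer/scheduler choices, not a recommendation.
        optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
        scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.1)
        return {
            "optimizer": optimizer,
            "lr_scheduler": {
                "scheduler": scheduler,
                "interval": "epoch",  # step the scheduler once per epoch
            },
        }
```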