Updating optimizer during training from callback with DDP #13220
Unanswered · fColangelo asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Hi,
I am implementing a callback for progressive resizing, i.e. updating the resolution of the images served by the datamodule, together with the optimizers, once certain pre-defined epoch thresholds are reached. Currently I am doing this from the `on_train_epoch_start` hook, roughly as sketched below.
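Here is a rough sketch of the callback (`set_resolution` is a helper on my own datamodule, not a Lightning API, and the milestone values are just examples):

```python
import pytorch_lightning as pl


class ProgressiveResizingCallback(pl.Callback):
    """Sketch: bump image resolution (and, ideally, the optimizer)
    at pre-defined epoch thresholds."""

    def __init__(self, milestones):
        # maps epoch -> target image resolution, e.g. {10: 160, 20: 224}
        self.milestones = milestones

    def on_train_epoch_start(self, trainer, pl_module):
        resolution = self.milestones.get(trainer.current_epoch)
        if resolution is not None:
            # resize the images served by my datamodule
            trainer.datamodule.set_resolution(resolution)
            # ...this is also where I would like to rebuild the optimizer
```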
However, I am not sure how the optimizer (and scheduler) should be updated so that things like DDP do not break.
I found this issue that suggested simply replacing the trainer's optimizers and schedulers manually, like so:
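(Reconstructed from memory, so the exact attribute names may differ between Lightning versions; `new_lr` and the `SGD`/`StepLR` choices are just placeholders for my actual setup:)

```python
import torch


def replace_optimizer(trainer, pl_module, new_lr):
    # build a fresh optimizer/scheduler over the module's parameters
    new_optimizer = torch.optim.SGD(pl_module.parameters(), lr=new_lr)
    new_scheduler = torch.optim.lr_scheduler.StepLR(new_optimizer, step_size=10)

    # overwrite what the trainer holds; this is exactly the part
    # I am unsure is safe under DDP
    trainer.optimizers = [new_optimizer]
    # swap only the scheduler inside the existing entry so the other
    # fields (interval, frequency, ...) are kept
    trainer.lr_schedulers[0]["scheduler"] = new_scheduler
```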
Is this compatible with DDP? Is there a recommended way to do this?
Thanks!