How to load the parameters of optimizer and schedulers dynamically? #13371
Unanswered
marsggbo asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 1 comment
- You can utilise PyTorch Lightning's checkpoints, as they include the states of the optimizers and learning rate schedulers too: `trainer.save_checkpoint("example.ckpt")`, then `checkpoint = torch.load("example.ckpt")`. See the documentation page for details: https://pytorch-lightning.readthedocs.io/en/stable/common/checkpointing.html#checkpoint-contents
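  A minimal sketch of what that can look like, assuming the checkpoint key names described on the documentation page above (`state_dict`, `optimizer_states`, `lr_schedulers`); `trainer1`, `model2`, `optimizer2` and `scheduler2` are placeholders for your own objects, not part of the original reply:

  ```python
  import torch

  # Save a checkpoint from the first run; besides the model weights it also
  # contains the optimizer states and the LR scheduler states.
  trainer1.save_checkpoint("example.ckpt")

  # Load the raw checkpoint dictionary and inspect its contents.
  checkpoint = torch.load("example.ckpt", map_location="cpu")
  print(checkpoint.keys())  # e.g. "state_dict", "optimizer_states", "lr_schedulers", ...

  # Restore the states into a freshly configured model / optimizer / scheduler.
  model2.load_state_dict(checkpoint["state_dict"])
  optimizer2.load_state_dict(checkpoint["optimizer_states"][0])  # list, one entry per optimizer
  scheduler2.load_state_dict(checkpoint["lr_schedulers"][0])     # list, one entry per scheduler
  ```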
My experiment setting is like this: I first use `trainer1` to train `model1`. Then there are two new pairs, (`trainer2-1`, `model2-1`) and (`trainer2-2`, `model2-2`). Both need to inherit the model/optimizer/scheduler weights from `trainer1` and `model1`, and are trained separately. `model.load_from_checkpoint` or `model.load_state_dict` only loads the model weights; the optimizer and scheduler parameters are ignored. I wonder how to load the parameters of the optimizers and schedulers dynamically?