Override hyperparameters when continuing to train from checkpoints #7155
Unanswered
LeCongThuong asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
I cannot find any way to override hyperparameters when continuing to train with pl.Trainer(resume_from_checkpoint='path_to_checkpoint').

In the configure_optimizers(self) method of my model, I set up a head_lr_scheduler variable, roughly as in the sketch below.
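The original code block was lost from the page. The following is a minimal sketch of the setup described above; the SGD optimizer, the MultiStepLR scheduler, and all numeric values other than the milestones are assumptions, since only the milestones appear in the question.

```python
import torch
import pytorch_lightning as pl

class MyModel(pl.LightningModule):
    # ... layers, training_step, etc. omitted ...

    def configure_optimizers(self):
        optimizer = torch.optim.SGD(self.parameters(), lr=0.1)  # assumed optimizer
        # MultiStepLR is assumed because the question describes a list of
        # epoch milestones; the original scheduler class was not shown.
        head_lr_scheduler = torch.optim.lr_scheduler.MultiStepLR(
            optimizer, milestones=[20, 40, 81, 120], gamma=0.1
        )
        return [optimizer], [head_lr_scheduler]
```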
When continuing to train, I want to change the milestones from [20, 40, 81, 120] to [20, 40, 81, 120, 140, 180], but the schedule does not change; I noticed this by using the lr_monitor callback.

In my main function:
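The code block that followed here was also lost. A plausible reconstruction of the resume call described above, keeping the placeholder path from the question and omitting any other Trainer arguments:

```python
model = MyModel()
trainer = pl.Trainer(resume_from_checkpoint='path_to_checkpoint')
trainer.fit(model)
```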
Any help!!!

Replies: 1 comment
Hi! Can you try this? (Make milestones a model attribute and reference it in configure_optimizers.)
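A minimal sketch of that suggestion, under the same assumptions as the sketches above (SGD, MultiStepLR, and the attribute name milestones are illustrative): store the milestones on the module so they can be overridden on the model object before resuming.

```python
class MyModel(pl.LightningModule):
    def __init__(self, milestones=(20, 40, 81, 120)):
        super().__init__()
        self.milestones = list(milestones)

    def configure_optimizers(self):
        optimizer = torch.optim.SGD(self.parameters(), lr=0.1)
        # The scheduler now reads the milestones from the model attribute
        # instead of a hard-coded list.
        head_lr_scheduler = torch.optim.lr_scheduler.MultiStepLR(
            optimizer, milestones=self.milestones, gamma=0.1
        )
        return [optimizer], [head_lr_scheduler]

# Override the attribute before resuming training:
model = MyModel()
model.milestones = [20, 40, 81, 120, 140, 180]
trainer = pl.Trainer(resume_from_checkpoint='path_to_checkpoint')
trainer.fit(model)
```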