How to change/modify the learning rate scheduler when using BaseFinetuning? #12993
Unanswered
ZeguanXiao
asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 0 comments
When doing transfer learning, we not only want to change the learning rate between phases but also to reinitialize the lr_scheduler. While this comment gives an example of introducing a new lr_scheduler, trainer.new_schedule() does not appear to be a valid API. Ideally, the solution would also remain compatible with the LearningRateMonitor callback.
Is there any workaround for this?
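
A possible workaround, sketched below under assumptions: on PyTorch Lightning versions (>= 1.6) that expose `trainer.lr_scheduler_configs`, you can replace the `.scheduler` attribute of the existing config from inside `finetune_function`. This is not an official or stable API, just a direct mutation of the trainer's internal scheduler list; the `backbone` attribute, the unfreeze epoch, and the choice of `CosineAnnealingLR` are all hypothetical placeholders for your own setup.

```python
import torch
from pytorch_lightning.callbacks import BaseFinetuning


class FinetuningWithSchedulerReset(BaseFinetuning):
    """Unfreezes a backbone at a given epoch and swaps in a fresh scheduler.

    Sketch only: mutating ``trainer.lr_scheduler_configs`` is not a stable
    contract, but it works on recent Lightning versions and keeps
    LearningRateMonitor logging, since that callback reads from the same
    scheduler configs.
    """

    def __init__(self, unfreeze_at_epoch=10, backbone_lr=1e-4):
        super().__init__()
        self._unfreeze_at_epoch = unfreeze_at_epoch
        self._backbone_lr = backbone_lr

    def freeze_before_training(self, pl_module):
        # Assumes the LightningModule exposes a ``backbone`` submodule.
        self.freeze(pl_module.backbone)

    def finetune_function(self, pl_module, current_epoch, optimizer, optimizer_idx):
        if current_epoch == self._unfreeze_at_epoch:
            # Phase 2: unfreeze the backbone with its own learning rate ...
            self.unfreeze_and_add_param_group(
                modules=pl_module.backbone,
                optimizer=optimizer,
                lr=self._backbone_lr,
            )
            # ... and reinitialize the scheduler around the same optimizer
            # so the new phase starts from a fresh schedule.
            trainer = pl_module.trainer
            new_scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(
                optimizer, T_max=trainer.max_epochs - current_epoch
            )
            # Replacing only ``.scheduler`` inside the existing config keeps
            # the original interval/frequency settings intact.
            trainer.lr_scheduler_configs[0].scheduler = new_scheduler
```

If your Lightning version predates `trainer.lr_scheduler_configs`, the older `trainer.lr_schedulers` list (of dicts with a `"scheduler"` key) served the same role; the general approach of rebuilding the scheduler around the live optimizer should carry over.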