How to freely change LR scheduler during training #7645
Unanswered
LLNLanLeN asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
I have a question regarding Trainer and LightningModule. I want to be able to freely change the LR scheduler (while keeping the optimizer the same) during training. This may have been answered before, but I haven't been able to find a working solution so far. The thread that relates most closely to my question is the one below, but unfortunately I could not get it to work:
#3095
My original idea was similar to what was suggested in that issue: create a simple callback and swap out the old LR scheduler for a new one (just the LR scheduler; I want to keep the optimizer the same). But I didn't have much success with that approach.
Note: the PyTorch Lightning version that I'm using is 1.1.8, not the latest, which is 1.4.0. Some of the custom packages that I'm using are more compatible with 1.1.8 than with 1.4.0, so I opted for the older version instead.

This is my implementation:
I ran into this error with my implementation:
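For context, here is a minimal sketch of the callback-based scheduler swap described above (not the asker's original code). It assumes PyTorch Lightning 1.1.x internals, where the configured schedulers live in `trainer.lr_schedulers` as a list of dicts with a `'scheduler'` key, and the optimizers in `trainer.optimizers`. The swap epoch and the choice of `CosineAnnealingLR` are illustrative assumptions:

```python
import torch
from pytorch_lightning.callbacks import Callback


class SwapSchedulerCallback(Callback):
    """Replace the LR scheduler at a chosen epoch while keeping the optimizer.

    Sketch for PyTorch Lightning 1.1.x, where configured schedulers are stored
    in ``trainer.lr_schedulers`` as a list of dicts (keys such as 'scheduler',
    'interval', 'frequency').
    """

    def __init__(self, swap_at_epoch=10):  # illustrative default
        self.swap_at_epoch = swap_at_epoch

    def on_train_epoch_start(self, trainer, pl_module):
        if trainer.current_epoch == self.swap_at_epoch:
            # Reuse the existing optimizer so its state (momentum buffers, etc.)
            # is preserved across the swap.
            optimizer = trainer.optimizers[0]
            new_scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(
                optimizer, T_max=50  # illustrative value
            )
            # Swap only the 'scheduler' object; leave 'interval', 'frequency',
            # and the other configured keys untouched.
            trainer.lr_schedulers[0]["scheduler"] = new_scheduler
```

The callback would then be attached in the usual way, e.g. `Trainer(callbacks=[SwapSchedulerCallback(swap_at_epoch=10)])`.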