SAM with lr scheduler results in MisconfigurationException #10789
Unanswered
maxmatical asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 2 comments · 6 replies
-
I modified it, and that seems to work, albeit a bit hacky.
-
What if you set your scheduler like this:

```python
scheduler = OneCycleLR(
    optimizer,
    max_lr=self.lr,
    pct_start=0.3,
    total_steps=self.total_steps,
)
```

Won't that work?
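Spelled out as a full `configure_optimizers`, the suggestion above might look like the following sketch. This is an illustration, not the original poster's code: it assumes the module defines `self.lr` and `self.total_steps`, and a plain SGD optimizer stands in for SAM. Returning the scheduler with `"interval": "step"` tells Lightning to call `scheduler.step()` after every optimizer step rather than once per epoch, which is what `OneCycleLR` expects when it is given `total_steps`:

```python
import torch
from torch.optim.lr_scheduler import OneCycleLR

# Sketch only: a Lightning-style configure_optimizers, assuming the module
# exposes self.lr and self.total_steps. Plain SGD stands in for SAM here.
def configure_optimizers(self):
    optimizer = torch.optim.SGD(self.parameters(), lr=self.lr, momentum=0.9)
    scheduler = OneCycleLR(
        optimizer,
        max_lr=self.lr,
        pct_start=0.3,
        total_steps=self.total_steps,  # total optimizer steps, not epochs
    )
    return {
        "optimizer": optimizer,
        # "interval": "step" -> Lightning steps the scheduler every batch
        "lr_scheduler": {"scheduler": scheduler, "interval": "step"},
    }
```

Note that `total_steps` replaces the `epochs`/`steps_per_epoch` pair; passing both forms at once raises a `ValueError` from `OneCycleLR`.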
-
This is what my configure_optimizers() looks like, trying to use the SAM optimizer with PyTorch Lightning, but when I try to fit the model I get the following error.
Has anyone been successful in using SAM with lr schedulers in PyTorch Lightning?