Using steps rather than epochs for learning rate scheduler #12518
Unanswered
cmlakhan
asked this question in
Lightning Trainer API: Trainer, LightningModule, LightningDataModule
I'd like to use the transformers.get_constant_schedule_with_warmup learning rate scheduler from HuggingFace, but I get the impression that when I set the num_warmup_steps value it corresponds to epochs rather than steps. What is the proper way to change this from epochs to steps? This is the way I have it coded:
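The snippet itself did not survive on this page; a plausible reconstruction, inferred from the version quoted in the reply below, is a configure_optimizers that returns the HuggingFace scheduler directly, with no interval setting:

```python
def configure_optimizers(self):
    optimizer = torch.optim.AdamW(self.parameters(), lr=2e-3, weight_decay=0.000001)
    warmup_scheduler = transformers.get_constant_schedule_with_warmup(optimizer, num_warmup_steps=5000)
    # Returning the scheduler directly: Lightning defaults to stepping it once per epoch,
    # so num_warmup_steps is effectively counted in epochs here.
    optimizer_dict = {'optimizer': optimizer, 'lr_scheduler': warmup_scheduler}
    return optimizer_dict
```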
Replies: 1 comment

Hi @cmlakhan! You can have the LR scheduler step every training step rather than every epoch by returning a scheduler config dict with 'interval': 'step' from configure_optimizers:

```diff
  def configure_optimizers(self):
      optimizer = torch.optim.AdamW(self.parameters(), lr=2e-3, weight_decay=0.000001)
      warmup_scheduler = transformers.get_constant_schedule_with_warmup(optimizer, num_warmup_steps=5000)
+     lr_scheduler_config = {'scheduler': warmup_scheduler, 'interval': 'step'}
+     optimizer_dict = {'optimizer': optimizer, 'lr_scheduler': lr_scheduler_config}
-     optimizer_dict = {'optimizer': optimizer, 'lr_scheduler': warmup_scheduler}
      return optimizer_dict
```

Also, we have a few examples in our documentation at: https://pytorch-lightning.readthedocs.io/en/stable/common/lightning_module.html#configure-optimizers
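For reference, a minimal self-contained sketch of the suggested setup (the class name, layer sizes, and training_step are illustrative and not from the original post):

```python
import torch
import transformers
import pytorch_lightning as pl


class WarmupExample(pl.LightningModule):
    """Toy module showing a warmup schedule that advances every training step."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.cross_entropy(self.layer(x), y)

    def configure_optimizers(self):
        optimizer = torch.optim.AdamW(self.parameters(), lr=2e-3, weight_decay=0.000001)
        warmup_scheduler = transformers.get_constant_schedule_with_warmup(
            optimizer, num_warmup_steps=5000
        )
        # 'interval': 'step' tells Lightning to call scheduler.step() after every
        # optimizer step; the default 'epoch' would step it only once per epoch.
        lr_scheduler_config = {"scheduler": warmup_scheduler, "interval": "step"}
        return {"optimizer": optimizer, "lr_scheduler": lr_scheduler_config}
```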