What is the relationship between accumulate_grad_batches and lr_scheduler? #10651
-
I wrote the following code:

```python
def configure_optimizers(self):
    ......  # optimizer and scheduler construction omitted in the original post
    return [
        {
            'optimizer': optimizer,
            'lr_scheduler': {
                'scheduler': scheduler,
                'interval': 'step',
                'frequency': 1
            }
        }
    ]
```

I chose `interval: 'step'`, and I also set `accumulate_grad_batches` > 1 on the Trainer. In my opinion, 'step' should mean an optimization step rather than a single training batch, but I'm not sure: is `accumulate_grad_batches` taken into account when the scheduler is called?
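For reference, a minimal runnable sketch of what such a module could look like with the elided part filled in; the `ToyModule`, the Adam optimizer, and the StepLR scheduler below are illustrative placeholders, not the ones from the original post:

```python
import torch
import pytorch_lightning as pl


class ToyModule(pl.LightningModule):
    """Hypothetical module; only the scheduler wiring matters here."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(8, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.mse_loss(self.layer(x), y)

    def configure_optimizers(self):
        # Placeholder optimizer/scheduler standing in for the elided part of the post.
        optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
        scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.9)
        return [
            {
                'optimizer': optimizer,
                'lr_scheduler': {
                    'scheduler': scheduler,
                    'interval': 'step',  # call scheduler.step() per optimization step, not per epoch
                    'frequency': 1,      # on every such step
                },
            }
        ]
```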
-
Yes, `step` means optimization step, and `accumulate_grad_batches` will be taken into consideration when the lr_scheduler is called.

Ref code:
https://github.com/PyTorchLightning/pytorch-lightning/blob/8ea39d2c8f68cc33273c3431a310a262e2240cf9/pytorch_lightning/loops/epoch/training_epoch_loop.py#L434-L437
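To see the relationship concretely, here is a small sketch reusing the hypothetical `ToyModule` from the question above (the dataset sizes and Trainer flags are made up for illustration, not taken from the thread). With `accumulate_grad_batches=4` and `interval: 'step'`, the scheduler is stepped once per optimization step, i.e. once every 4 training batches:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl
from pytorch_lightning.callbacks import LearningRateMonitor

# 64 samples with batch_size=8 -> 8 training batches per epoch.
data = TensorDataset(torch.randn(64, 8), torch.randn(64, 1))

trainer = pl.Trainer(
    max_epochs=1,
    accumulate_grad_batches=4,  # 4 batches are accumulated into one optimization step
    log_every_n_steps=1,
    callbacks=[LearningRateMonitor(logging_interval="step")],
)

# 8 batches / 4 accumulated batches = 2 optimization steps per epoch, so with
# interval='step' the StepLR scheduler is stepped twice, not once per batch.
trainer.fit(ToyModule(), DataLoader(data, batch_size=8))
```

The `LearningRateMonitor` callback logs the learning rate once per optimization step, so you can verify that it changes only after every 4th batch rather than after every batch.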