
What is the relationship between accumulate_grad_batches and lr_scheduler? #10651


Yes: `step` refers to an optimization step, and `accumulate_grad_batches` is taken into account when the lr_scheduler is called. With gradient accumulation, the optimizer steps once every `accumulate_grad_batches` batches, so a scheduler configured with `interval="step"` is also stepped once per optimizer step, not once per batch.
Ref code:
https://github.com/PyTorchLightning/pytorch-lightning/blob/8ea39d2c8f68cc33273c3431a310a262e2240cf9/pytorch_lightning/loops/epoch/training_epoch_loop.py#L434-L437
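A minimal sketch of the loop logic the linked code implements (plain Python, no Lightning dependency; the function name and structure here are illustrative, not Lightning's actual API): with `accumulate_grad_batches=N`, the optimizer, and therefore a scheduler configured with `interval="step"`, steps once every N batches.

```python
def simulate_training(num_batches: int, accumulate_grad_batches: int):
    """Count optimizer and scheduler steps over a simulated epoch.

    Mirrors the behavior described above: gradients accumulate
    across batches, and both optimizer.step() and the step-interval
    lr_scheduler.step() run only on every Nth batch.
    """
    optimizer_steps = 0
    scheduler_steps = 0
    for batch_idx in range(num_batches):
        # Gradients are accumulated for every batch, but the
        # optimizer only steps once per accumulation window.
        if (batch_idx + 1) % accumulate_grad_batches == 0:
            optimizer_steps += 1
            # Lightning calls the scheduler after the optimizer
            # step when interval="step", so the counts match.
            scheduler_steps += 1
    return optimizer_steps, scheduler_steps

# 100 batches with accumulation over 4 batches -> 25 steps each
print(simulate_training(100, 4))  # (25, 25)
```

So with 100 batches per epoch and `accumulate_grad_batches=4`, a step-interval scheduler advances 25 times per epoch, not 100.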

Answer selected by yc1999