Why is on_before_optimizer_step incompatible with accumulate_grad_batches ? #11331
Asked by NathanGodey in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
The documentation about the `on_before_optimizer_step` hook suggests it is incompatible with `accumulate_grad_batches`. However, in theory, combining both seems possible to me.
Answered by rohitgr7 on Jan 5, 2022
Hey @NathanGodey! It doesn't mean the hook won't be called when `accumulate_grad_batches > 1`; it is only called once accumulation is done and it's time to update the weights with `optimizer.step()`.
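
To make the behaviour concrete, here is a minimal sketch (not from the discussion; the model, dataset, and print statement are illustrative assumptions). With `accumulate_grad_batches=4` and 8 training batches per epoch, `on_before_optimizer_step` should fire only twice per epoch, once per actual `optimizer.step()`, rather than on every batch. Note that older 1.x releases of Lightning also pass an `optimizer_idx` argument to this hook.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(32, 2)  # illustrative toy model

    def training_step(self, batch, batch_idx):
        x, y = batch
        return nn.functional.cross_entropy(self.layer(x), y)

    def on_before_optimizer_step(self, optimizer):
        # Runs only when the accumulated gradients are about to be applied,
        # i.e. right before optimizer.step(), not on every training batch.
        print(f"optimizer.step() about to run at global_step={self.global_step}")

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)


# 64 random samples with batch_size=8 -> 8 batches per epoch.
train_data = TensorDataset(torch.randn(64, 32), torch.randint(0, 2, (64,)))
trainer = pl.Trainer(
    max_epochs=1,
    accumulate_grad_batches=4,  # accumulate over 4 batches before stepping
    logger=False,
    enable_checkpointing=False,
)
trainer.fit(LitModel(), DataLoader(train_data, batch_size=8))
# Expected: the hook prints roughly 8 / 4 = 2 times for the epoch,
# once per optimizer.step(), which matches the answer above.
```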