set_to_none=True and accumulate_grad_batches #6703
denix56 asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Is it possible to use automatic optimization with `accumulate_grad_batches` and the performance trick `zero_grad(set_to_none=True)`?
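For background, `zero_grad(set_to_none=True)` frees the `.grad` tensors instead of overwriting them with zeros, saving memory and a write pass (PyTorch reallocates the gradients on the next backward). A minimal PyTorch sketch of the effect, with a toy parameter chosen purely for illustration:

```python
import torch

p = torch.nn.Parameter(torch.ones(3))
(p * 2).sum().backward()
print(p.grad)  # tensor([2., 2., 2.])

opt = torch.optim.SGD([p], lr=0.1)
opt.zero_grad(set_to_none=True)
print(p.grad)  # None: the gradient tensor was freed, not zeroed
```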
Answered by awaelchli on Apr 4, 2021
Replies: 1 comment, 1 reply
Yes, you can override the `optimizer_zero_grad` hook in `LightningModule`:

```python
from torch.optim import Optimizer

def optimizer_zero_grad(self, epoch: int, batch_idx: int, optimizer: Optimizer, optimizer_idx: int):
    # set_to_none=True releases the gradient tensors instead of zeroing them in place
    optimizer.zero_grad(set_to_none=True)
```
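To see how the override composes with gradient accumulation, here is a minimal, self-contained sketch using the Lightning 1.x hook signature from the answer above; the `LitModel` class, its toy layer, and the accumulation factor are illustrative, not from the original thread:

```python
import torch
from torch import nn
from torch.optim import Optimizer
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(8, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return nn.functional.mse_loss(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)

    def optimizer_zero_grad(self, epoch: int, batch_idx: int,
                            optimizer: Optimizer, optimizer_idx: int):
        # Clear gradients by deallocating them rather than writing zeros.
        optimizer.zero_grad(set_to_none=True)


# Accumulation is configured on the Trainer as usual. With automatic
# optimization, Lightning calls optimizer_zero_grad only when the
# optimizer actually steps, so gradients accumulate untouched across
# the intermediate batches; the override changes how gradients are
# cleared, not when.
trainer = pl.Trainer(accumulate_grad_batches=4, max_epochs=1)
```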
Answer selected by carmocca