Performing multiple optimizer steps for each training step? #14764
Unanswered
stwerner97 asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Hi, I am working with sequence data of considerable length and use truncated backpropagation through time (TBPTT) to train an LSTM model. Given the large sequence lengths, I want to perform intermediate updates to the network. So far, each training step processes a batch of sequences split into TBPTT chunks.

Is it possible to perform multiple gradient updates in `training_step`?
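For illustration, here is a minimal sketch of one way to do this with Lightning's manual optimization (`self.automatic_optimization = False`), which hands control of `zero_grad`/`backward`/`step` to `training_step` so each TBPTT chunk can trigger its own update. The module layout, sizes, chunking helper, and loss below are assumptions for the example, not from the thread:

```python
import torch
import torch.nn.functional as F
import pytorch_lightning as pl


class MultiStepLSTM(pl.LightningModule):
    def __init__(self, input_size=32, hidden_size=64, chunk_len=50):
        super().__init__()
        # Take full control of zero_grad/backward/step so that one
        # training_step can run several optimizer updates.
        self.automatic_optimization = False
        self.chunk_len = chunk_len
        self.lstm = torch.nn.LSTM(input_size, hidden_size, batch_first=True)
        self.head = torch.nn.Linear(hidden_size, input_size)

    def training_step(self, batch, batch_idx):
        opt = self.optimizers()
        x, y = batch  # assumed shape: (batch, seq_len, features)
        hiddens = None
        # One gradient update per TBPTT chunk of the sequence.
        for start in range(0, x.size(1), self.chunk_len):
            x_chunk = x[:, start : start + self.chunk_len]
            y_chunk = y[:, start : start + self.chunk_len]
            out, hiddens = self.lstm(x_chunk, hiddens)
            loss = F.mse_loss(self.head(out), y_chunk)
            opt.zero_grad()
            self.manual_backward(loss)
            opt.step()
            # Detach the carried state so gradients stay within a chunk.
            hiddens = tuple(h.detach() for h in hiddens)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)
```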
Replies: 1 comment

- Here's the doc page: https://pytorch-lightning.readthedocs.io/en/1.7.7/guides/data.html?highlight=tbptt#truncated-backpropagation-through-time-tbptt
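The linked page covers Lightning's built-in TBPTT support: setting `truncated_bptt_steps` on the LightningModule makes Lightning split each batch along the time dimension and call `training_step` once per chunk, passing the hidden state along. A rough sketch under that assumption (the model and loss are placeholders, not verbatim from the docs):

```python
import torch
import torch.nn.functional as F
import pytorch_lightning as pl


class TBPTTLSTM(pl.LightningModule):
    def __init__(self, input_size=32, hidden_size=64):
        super().__init__()
        # Lightning splits each batch along the time dimension into
        # chunks of this many steps and calls training_step per chunk.
        self.truncated_bptt_steps = 50
        self.lstm = torch.nn.LSTM(input_size, hidden_size, batch_first=True)
        self.head = torch.nn.Linear(hidden_size, input_size)

    def training_step(self, batch, batch_idx, hiddens):
        x, y = batch  # already a single TBPTT chunk here
        out, hiddens = self.lstm(x, hiddens)
        loss = F.mse_loss(self.head(out), y)
        # Return the hidden state so Lightning can carry it into the
        # next chunk of the same sequence.
        return {"loss": loss, "hiddens": hiddens}

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)
```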