How to access the scaler if using AMP in training_step() with manual optimization? #11290
Unanswered
maxmatical asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
I'm trying to use a non-standard optimizer, so in the training step I can't just call optimizer.step(); the plain PyTorch code (using AMP) routes the backward pass and the update through a GradScaler (see the sketch below). If I want to translate this to Lightning, how do I access the scaler? I'm thinking the training_step (using manual optimization) would look much the same. What would be the right way to access the scaler within the LightningModule?
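A minimal sketch of the kind of plain-PyTorch AMP loop described above. The model, data, and optimizer here are placeholders (the original snippet is not reproduced in this thread): the loss is scaled before backward, the gradients are unscaled so the custom stepping logic sees their true values, and the scale factor is updated afterwards.

```python
import torch
from torch import nn

model = nn.Linear(10, 1).cuda()
# Stand-in for the non-standard optimizer; its step is called manually below.
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scaler = torch.cuda.amp.GradScaler()

for _ in range(10):
    x = torch.randn(8, 10, device="cuda")
    optimizer.zero_grad()
    with torch.cuda.amp.autocast():
        loss = model(x).pow(2).mean()
    # Backward on the scaled loss.
    scaler.scale(loss).backward()
    # Unscale in place so any custom update logic sees the real gradients.
    scaler.unscale_(optimizer)
    # ... non-standard stepping logic would go here instead of a plain step ...
    optimizer.step()
    # Adjust the scale factor for the next iteration.
    scaler.update()
```

Note that stepping the optimizer directly instead of going through scaler.step() skips the built-in inf/NaN check, so a manual check on the unscaled gradients may be worth adding.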
Replies: 1 comment

I think it can be accessed using: self.trainer.precision_plugin.scaler
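For illustration, a rough sketch of how the reply's suggestion could be used inside training_step with manual optimization. This is an assumption-heavy sketch, not code from the thread: the attribute path self.trainer.precision_plugin.scaler comes from the reply above and depends on the Lightning version, it assumes the Trainer was created with native AMP (precision=16) on GPU, and the model and loss are placeholders. Depending on the version, self.manual_backward(loss) may already apply the scaling for you.

```python
import torch
from torch import nn
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.automatic_optimization = False  # manual optimization
        self.layer = nn.Linear(10, 1)

    def training_step(self, batch, batch_idx):
        opt = self.optimizers()
        opt.zero_grad()
        loss = self.layer(batch).pow(2).mean()

        # The GradScaler held by the native-AMP precision plugin
        # (assumes Trainer(precision=16, ...) so a scaler actually exists).
        scaler = self.trainer.precision_plugin.scaler

        scaler.scale(loss).backward()
        # Unscale before any non-standard stepping logic; opt.optimizer is
        # the underlying torch optimizer behind Lightning's wrapper.
        scaler.unscale_(opt.optimizer)
        opt.optimizer.step()
        scaler.update()

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)
```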