How to pass losses to optimizer_step? #13075
Unanswered
AnkushMalaker asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 1 comment
-
I'd rather suggest using manual optimization; see https://pytorch-lightning.readthedocs.io/en/stable/common/optimization.html#manual-optimization
-
I'm trying to use this multi-term Adam optimizer, which requires me to pass an array of losses to optimizer.step(); loss.backward() is also called within the optimizer code. Here's what my attempt looks like:
1. Override backward() in the model code with a simple pass, to disable the loss.backward() call made by Lightning.
2. Override optimizer_step() to call the optimizer with the losses.
In step 2, I need to pass the losses to the optimizer_step() function. How can I do that, or is there a better way to achieve the same?
Thank you.
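The two steps above can be sketched in plain PyTorch. `MultiTermAdam` below is a hypothetical stand-in for the multi-term optimizer in question, not its real implementation; only its `step(losses)` signature and internal `backward()` calls follow the description above. With Lightning's manual optimization enabled, `training_step` should then be able to call `opt.step(losses=[...])` directly instead of overriding `backward()` and `optimizer_step()`:

```python
import torch


class MultiTermAdam(torch.optim.Adam):
    """Hypothetical stand-in for the multi-term optimizer: step() takes a
    list of losses and calls backward() on each one internally."""

    def step(self, losses):
        for loss in losses:
            loss.backward()
        super().step()


# Plain-PyTorch version of the loop that training_step would drive
# under manual optimization (self.automatic_optimization = False):
model = torch.nn.Linear(3, 1)
opt = MultiTermAdam(model.parameters(), lr=0.1)

x = torch.randn(8, 3)
loss_a = model(x).pow(2).mean()   # first loss term
loss_b = model(x).abs().mean()    # second loss term

w_before = model.weight.detach().clone()
opt.zero_grad()
opt.step(losses=[loss_a, loss_b])  # the optimizer runs both backward passes
```

Each loss here comes from its own forward pass, so each `backward()` call accumulates gradients into the shared parameters before the single Adam update.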