self.manual_backward() vs. loss.backward() when optimizing manually #11318
Answered by rohitgr7
MGheini asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
According to the manual_backward() documentation, it takes care of scaling when using mixed precision. In that case, is it correct to assume one can simply and safely use loss.backward() during manual optimization if not using mixed precision?
Answered by rohitgr7 on Jan 4, 2022
hey @MGheini
It's not just mixed precision: manual_backward() is the common entry point that supports the other strategies (e.g. DeepSpeed, DDP), and hooks such as on_after_backward are still called through it. So manual_backward() is recommended to make sure no code change is required if, for example, the user later switches strategies.
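For context, a minimal sketch of what manual optimization looks like in a LightningModule (the model, loss, and optimizer here are illustrative, not from the thread):

```python
import torch
import torch.nn as nn
import pytorch_lightning as pl

class ManualOptModule(pl.LightningModule):
    def __init__(self):
        super().__init__()
        # opt out of Lightning's automatic optimization
        self.automatic_optimization = False
        self.model = nn.Linear(32, 1)  # placeholder model

    def training_step(self, batch, batch_idx):
        x, y = batch
        opt = self.optimizers()
        opt.zero_grad()
        loss = nn.functional.mse_loss(self.model(x), y)
        # self.manual_backward(loss) rather than loss.backward():
        # it routes the backward pass through the active strategy and
        # precision plugin (AMP scaling, DeepSpeed, DDP, ...) and still
        # triggers hooks such as on_after_backward.
        self.manual_backward(loss)
        opt.step()
        return loss

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)
```

With plain single-device fp32 training, loss.backward() would behave the same here, but using manual_backward() keeps training_step unchanged if the strategy or precision setting is later swapped.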
Answer selected by MGheini