How to train two optimizers with one loss? #6818
-
I have read the GAN demo, but it is for two losses. Suppose I have two modules A and B that together produce a single loss in training_step.
How can I train A and B with different optimizers or learning rates?
-
The best approach is to use manual optimization:
https://pytorch-lightning.readthedocs.io/en/latest/common/optimizers.html#manual-optimization
Then you have full control over each optimizer step.
You can get the optimizers with self.optimizers(), and for the backward pass you simply replace loss.backward() with self.manual_backward(loss).
Make sure you use the latest version of Lightning for optimal support. A sketch is shown below.
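A minimal sketch of what this could look like, assuming two hypothetical submodules `net_a` and `net_b` that jointly produce one loss; the layer sizes, optimizer choices, and learning rates are purely illustrative:

```python
import torch
import pytorch_lightning as pl


class TwoOptimizerModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        # Disable automatic optimization so we drive the optimizers ourselves.
        self.automatic_optimization = False
        self.net_a = torch.nn.Linear(32, 16)
        self.net_b = torch.nn.Linear(16, 1)

    def training_step(self, batch, batch_idx):
        # Fetch both optimizers configured in configure_optimizers().
        opt_a, opt_b = self.optimizers()

        x, y = batch
        loss = torch.nn.functional.mse_loss(self.net_b(self.net_a(x)), y)

        # Manually zero, backprop, and step each optimizer.
        opt_a.zero_grad()
        opt_b.zero_grad()
        self.manual_backward(loss)  # instead of loss.backward()
        opt_a.step()
        opt_b.step()

        self.log("train_loss", loss)

    def configure_optimizers(self):
        # Separate optimizers (and learning rates) for each module.
        opt_a = torch.optim.Adam(self.net_a.parameters(), lr=1e-3)
        opt_b = torch.optim.SGD(self.net_b.parameters(), lr=1e-2)
        return opt_a, opt_b
```

With automatic optimization disabled, Lightning will not call zero_grad(), backward(), or step() for you, so every optimizer update happens exactly where you place it in training_step.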
Hope this helps