DC GAN multiple optimizer loss not being seen in the train_epoch_end
function
#8207
Unanswered
mnswdhw
asked this question in
Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 0 comments
Hi everyone!
I have written a module to train a DCGAN with PyTorch Lightning, and I am facing an issue in the `training_epoch_end` hook. When I received the outputs of an epoch (with `fast_dev_run` enabled), I expected the losses from both optimizers to be present in the outputs; however, only the generator's loss was present, repeated twice. I saw that this issue was raised earlier as well, but since I installed the latest version of PyTorch Lightning via pip, I expected it to be fixed. I am also attaching my code for reference. Please have a look. Thanks!
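For context on the symptom described above: with two optimizers (generator and discriminator), one would expect the `outputs` argument of `training_epoch_end` to contain one step result per optimizer per batch, not the same loss twice. The following is a minimal stdlib-only sketch of that expected nesting; it does not use `pytorch_lightning` at all, and the names `training_step_stub` and `simulate_epoch_outputs` are hypothetical helpers, not Lightning APIs. The actual nesting Lightning produces may differ by version, which is part of what this question is about.

```python
# Hypothetical sketch (no pytorch_lightning import): illustrates the
# expected grouping of per-step outputs when a GAN uses two optimizers
# (index 0 = generator, index 1 = discriminator).

def training_step_stub(batch_idx, optimizer_idx):
    # Return a distinct dummy loss per optimizer so the grouping is
    # visible; a real training_step would compute these from the batch.
    loss = 0.5 if optimizer_idx == 0 else 0.9
    return {"loss": loss, "optimizer_idx": optimizer_idx}

def simulate_epoch_outputs(num_batches, num_optimizers):
    # Assumed shape: outputs[batch][optimizer] -> step output dict.
    return [
        [training_step_stub(b, opt) for opt in range(num_optimizers)]
        for b in range(num_batches)
    ]

outputs = simulate_epoch_outputs(num_batches=2, num_optimizers=2)

# Each batch entry should hold one result per optimizer; the bug
# described above is that the generator loss appears in both slots.
gen_losses = [step[0]["loss"] for step in outputs]
disc_losses = [step[1]["loss"] for step in outputs]
print(gen_losses, disc_losses)
```

If the discriminator's loss is missing from `outputs`, the per-optimizer results are not being collected as sketched here, which matches the behavior reported in this question.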