How to make each optimizer in a LightningModule take a unique number of training_step? #14369
Unanswered
mkarikom
asked this question in
Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 2 comments
-
Simply looping over the optimizers inside `training_step` might work. Admittedly, I don't fully understand the limitations of this approach, so any comments are definitely welcome.
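As a rough illustration of what a per-batch loop over optimizers could look like: the helper below picks which optimizer should step on a given batch under a k:1 alternation. The function name and the discriminator/generator labels are my own sketch for this thread, not part of the Lightning API.

```python
def active_optimizer(batch_idx: int, k: int) -> str:
    """Pick which optimizer steps on this batch under a k:1 schedule:
    k discriminator batches, then one generator batch, repeating.
    (Hypothetical helper, not a Lightning API.)"""
    return "discriminator" if batch_idx % (k + 1) < k else "generator"
```

With k = 5, batches 0 through 4 go to the discriminator and batch 5 to the generator, after which the cycle repeats.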
-
Have you checked out the examples here: https://pytorch-lightning.readthedocs.io/en/stable/common/lightning_module.html?highlight=configure_optimizers#configure-optimizers — especially the last one?
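The last example on that docs page uses the dict form of `configure_optimizers`, where each optimizer carries a `"frequency"` key and Lightning alternates between them automatically. The sketch below shows only the shape of that return value; `StubOptimizer` is a stand-in of mine so the snippet runs without torch installed.

```python
class StubOptimizer:
    """Stand-in for a torch.optim optimizer (assumption: real code
    would use e.g. torch.optim.Adam on each submodule's parameters)."""
    def __init__(self, name):
        self.name = name

def configure_optimizers():
    # With these frequencies, Lightning routes 5 consecutive
    # training_step calls to the discriminator's optimizer for
    # every 1 call routed to the generator's optimizer.
    opt_d, opt_g = StubOptimizer("D"), StubOptimizer("G")
    return [
        {"optimizer": opt_d, "frequency": 5},
        {"optimizer": opt_g, "frequency": 1},
    ]
```

This keeps automatic optimization, so no manual `backward` calls are needed; the alternation is driven entirely by the `"frequency"` values.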
Question
Is there a way to parameterize the Trainer, or to code the LightningModule, such that each optimizer defined in the LightningModule can take a unique number of steps?
Background Research
In the GAN LightningModule example, there does not seem to be a way to call `backward` within `training_step` such that the conditional over optimizer indices can be used to define a unique number of steps for each optimizer.
Motivation
In Algorithm 1 of the GAN paper, the discriminator runs for k steps (unique samples from the generator) for each generator step. This heuristic is designed to promote more efficient mapping of the data space by the generator.
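The step-count structure of that heuristic can be sketched in plain Python: each outer iteration performs k discriminator updates, then a single generator update. The counters stand in for the actual `optimizer.step()` calls.

```python
def algorithm1_step_counts(n_outer: int, k: int):
    """Count optimizer updates under Algorithm 1 of the GAN paper:
    k discriminator updates per outer iteration, then one generator
    update. Returns (discriminator_steps, generator_steps)."""
    d_steps = g_steps = 0
    for _ in range(n_outer):
        for _ in range(k):
            d_steps += 1  # discriminator update on a fresh minibatch
        g_steps += 1      # single generator update
    return d_steps, g_steps
```

So over n outer iterations the discriminator takes n * k steps while the generator takes n, which is exactly the asymmetry the question asks Lightning to express.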