Training two models at once with differing schedules #11847
Unanswered
TheSeparatrix asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Hi again!
I am currently trying out a sort of adversarial autoencoder, where a classifier is trained on the autoencoder's latent space while the autoencoder itself also trains as usual.
I am wondering whether there is a way in PyTorch Lightning to run some epochs of classifier training before each epoch of autoencoder training.
Something like:
For each training epoch:
1. Train the classifier for 20 epochs on the current latent space.
2. Then train the autoencoder for 1 epoch.
Can I somehow specify this elegantly in the model (maybe in `training_step` or in the `Trainer`)?
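For context, one common way to express this schedule is to run the classifier inner loop in the `on_train_epoch_start` hook of the `LightningModule` (training the classifier on detached latents so its updates never touch the encoder), and keep `training_step` for the autoencoder loss only. The schedule itself, sketched in plain PyTorch so it is self-contained (all module sizes, epoch counts, and data shapes here are made-up illustration values, not from the question), looks like this:

```python
import torch
from torch import nn

torch.manual_seed(0)

# Toy data: 64 samples, 8 features, binary labels (hypothetical shapes).
x = torch.randn(64, 8)
y = torch.randint(0, 2, (64,))

# Minimal autoencoder with a 3-dim latent space, plus a latent-space classifier.
encoder = nn.Linear(8, 3)
decoder = nn.Linear(3, 8)
classifier = nn.Linear(3, 2)

ae_opt = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder.parameters()), lr=1e-2
)
clf_opt = torch.optim.Adam(classifier.parameters(), lr=1e-2)

CLF_EPOCHS_PER_AE_EPOCH = 20
schedule_log = []  # records the update order, just to illustrate the schedule

for ae_epoch in range(3):
    # 1) Train the classifier for N epochs on the *current* latent space.
    #    Latents are computed under no_grad, so classifier updates
    #    cannot backpropagate into the encoder.
    with torch.no_grad():
        z = encoder(x)
    for _ in range(CLF_EPOCHS_PER_AE_EPOCH):
        clf_opt.zero_grad()
        loss = nn.functional.cross_entropy(classifier(z), y)
        loss.backward()
        clf_opt.step()
        schedule_log.append("clf")

    # 2) Then one regular autoencoder epoch.
    ae_opt.zero_grad()
    recon_loss = nn.functional.mse_loss(decoder(encoder(x)), x)
    recon_loss.backward()
    ae_opt.step()
    schedule_log.append("ae")
```

In Lightning terms, the block under "1)" would live in `on_train_epoch_start` and the block under "2)" would be the normal `training_step`; alternatively, with `self.automatic_optimization = False` you can drive both optimizers yourself. Either way, the nesting logic is exactly the loop above.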