Correct way to set up multi-task training? #6785
Unanswered
turian
asked this question in
Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 0 comments
We have a NeurIPS competition in submission, and I would like to use pytorch-lightning for the dev kit.
I'm trying to figure out the most elegant pattern for using PyTorch Lightning to learn an embedding in a multi-task learning scenario:
Is there an elegant pattern for interleaving mini-batch training of multiple tasks simultaneously, all of which share a common embedding module?
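One pattern worth sketching (an assumption about the intended setup, not the only option): keep the shared embedding in one module, give each task its own head and its own dataloader, and interleave batches round-robin so every optimizer step updates the shared embedding through one task head. The interleaving itself is framework-agnostic; the `task_a`/`task_b` loaders below are hypothetical stand-ins for real per-task `DataLoader`s:

```python
from itertools import cycle

def interleave(loaders):
    """Round-robin over several per-task batch iterators.

    Yields (task_name, batch) pairs so a single training loop can
    route each batch to the matching task head while every step
    still backpropagates through the shared embedding module.
    Tasks that run out of batches early are simply skipped.
    """
    iters = {name: iter(loader) for name, loader in loaders.items()}
    active = set(iters)
    for name in cycle(iters):
        if not active:
            break
        if name not in active:
            continue
        try:
            yield name, next(iters[name])
        except StopIteration:
            active.discard(name)

# Hypothetical per-task "dataloaders" (plain lists stand in for DataLoaders).
loaders = {"task_a": [1, 2, 3], "task_b": [10, 20]}
schedule = list(interleave(loaders))
# schedule == [("task_a", 1), ("task_b", 10), ("task_a", 2),
#              ("task_b", 20), ("task_a", 3)]
```

Inside Lightning, the same idea maps onto returning multiple dataloaders (e.g. a dict of per-task `DataLoader`s) from `train_dataloader`, so `training_step` receives a batch per task and can dispatch to the right head; Lightning's `CombinedLoader` utility also supports cycling modes for uneven task lengths. The exact API surface varies by version, so check the docs for your release.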