Train different models in parallel #12548
Unanswered · 0 comments
aliutkus asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Hello, thanks for the awesome work!

To give a bit of context: I would like to train several models in parallel. They would ideally share the same dataloader, but each would have its own (non-shared) parameters and its own loss. This could perhaps be implemented by passing a list of models to a Trainer somehow.

Is there a way to do that?

Thanks
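Not an official answer, but since `Trainer.fit` takes a single `LightningModule`, one common workaround is to wrap all the models in one module and use manual optimization, so each model keeps its own optimizer and loss while every model sees the same batch from the shared dataloader. A minimal sketch of that pattern follows; the `(x, y)` batch format, the `mse_loss` criterion, and the Adam optimizers are placeholder assumptions to adapt to your setup:

```python
import torch
from torch import nn
import pytorch_lightning as pl


class ParallelModels(pl.LightningModule):
    """Wraps several independent models so one Trainer and one dataloader drive them all."""

    def __init__(self, models):
        super().__init__()
        # ModuleList registers every model's (non-shared) parameters.
        self.models = nn.ModuleList(models)
        # Manual optimization lets each model have its own optimizer and loss.
        self.automatic_optimization = False

    def training_step(self, batch, batch_idx):
        x, y = batch  # assumed (input, target) batches; adapt to your data
        # With manual optimization, self.optimizers() returns the full list.
        for i, (model, opt) in enumerate(zip(self.models, self.optimizers())):
            loss = nn.functional.mse_loss(model(x), y)  # per-model loss; swap in your own
            opt.zero_grad()
            self.manual_backward(loss)
            opt.step()
            self.log(f"loss_{i}", loss)

    def configure_optimizers(self):
        # One optimizer per model, so the parameter sets stay fully independent.
        return [torch.optim.Adam(m.parameters()) for m in self.models]
```

Usage would then look something like `trainer.fit(ParallelModels([nn.Linear(10, 1) for _ in range(3)]), train_dataloaders=shared_dataloader)`, where `shared_dataloader` is the single dataloader all models consume.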