Multiple dataloaders in training #8410
Answered by tchaton
Asked by YuShen1116 in: Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Hi, I'm trying to integrate PyTorch Lightning into my current pipeline, but I'm having some difficulty using multiple dataloaders. In my use case I have 10 dataloaders in the pipeline, but each training step only samples data from 5 of them. Is that doable in PyTorch Lightning? I could sample from all 10 dataloaders every step, but that would waste system IO and GPU memory. Thanks for any help!
Answered by tchaton on Jul 19, 2021
-
Hey @YuShen1116, here is the pseudo code to get it working:
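The snippet itself is not reproduced in this capture of the thread, so the following is only a minimal sketch of one possible approach, not tchaton's actual answer. The `SubsetSamplingLoader` class and its parameter names are hypothetical: it wraps a list of dataloaders in a custom iterable that, at each step, draws batches from a randomly chosen subset, so the unsampled loaders never fetch data.

```python
import random
from typing import Iterator, List

import torch
from torch.utils.data import DataLoader, TensorDataset


class SubsetSamplingLoader:
    """Hypothetical helper (not part of PyTorch Lightning): at every
    step, draw batches from a random subset of the wrapped dataloaders
    instead of all of them, saving IO and GPU memory."""

    def __init__(self, loaders: List[DataLoader], k: int, steps_per_epoch: int):
        self.loaders = loaders
        self.k = k  # how many loaders to sample each step
        self.steps_per_epoch = steps_per_epoch

    def __iter__(self) -> Iterator[dict]:
        # Iterators are created lazily, so only loaders that are
        # actually sampled ever pull data from disk.
        iterators = {}
        for _ in range(self.steps_per_epoch):
            chosen = random.sample(range(len(self.loaders)), self.k)
            batch = {}
            for idx in chosen:
                if idx not in iterators:
                    iterators[idx] = iter(self.loaders[idx])
                try:
                    batch[idx] = next(iterators[idx])
                except StopIteration:
                    # Restart an exhausted loader and take its first batch.
                    iterators[idx] = iter(self.loaders[idx])
                    batch[idx] = next(iterators[idx])
            yield batch

    def __len__(self) -> int:
        return self.steps_per_epoch


# 10 toy dataloaders; each step samples batches from only 5 of them.
loaders = [
    DataLoader(TensorDataset(torch.full((8, 2), float(i))), batch_size=4)
    for i in range(10)
]
train_loader = SubsetSamplingLoader(loaders, k=5, steps_per_epoch=3)

first_step = next(iter(train_loader))
print(len(first_step))  # 5 dataloaders sampled this step
```

In a `LightningModule` you would return an object like this from `train_dataloader()` and index the yielded dict by loader id inside `training_step`; recent Lightning versions accept arbitrary iterables as training dataloaders, but check this against the Lightning version you are running.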
Answer selected by YuShen1116