Does LightningDataModule support multiple train dataloaders like LightningModule? #11024
Answered by tchaton · kleinzcy asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Can I define multiple training dataloaders in a LightningDataModule?
Answered by tchaton · Dec 10, 2021
yes
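In practice, that looks like the sketch below; the `MultiTrainDataModule` name and the toy `TensorDataset`s are illustrative, not from the thread. Returning a dict (or list) of loaders from `train_dataloader` is enough:

```python
import torch
import pytorch_lightning as pl
from torch.utils.data import DataLoader, TensorDataset

class MultiTrainDataModule(pl.LightningDataModule):
    def setup(self, stage=None):
        # Two toy datasets standing in for real ones.
        self.ds_a = TensorDataset(torch.randn(100, 3))
        self.ds_b = TensorDataset(torch.randn(200, 3))

    def train_dataloader(self):
        # Returning a dict (a list also works) makes Lightning
        # combine the loaders: each training batch then contains
        # one sub-batch from "a" and one from "b".
        return {
            "a": DataLoader(self.ds_a, batch_size=8),
            "b": DataLoader(self.ds_b, batch_size=8),
        }
```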
PyTorch Lightning supports multiple training dataloaders, but they are sampled at the same time: each training step receives a batch composed of one batch from each of the dataloaders returned by the train_dataloader method. Alternatively, if you want to make this sequential, you can implement a wrapper that samples from one dataloader and then the next:

```python
from collections.abc import Iterable

class SequentialLoader(Iterable):
    # Subclass Iterable rather than Iterator: Iterator declares an
    # abstract __next__, which this generator-based class never defines.
    def __init__(self, *dataloaders):
        self.dataloaders = dataloaders

    def __len__(self):
        # Total number of batches across all dataloaders.
        return sum(len(dl) for dl in self.dataloaders)

    def __iter__(self):
        # Exhaust each dataloader in turn.
        for dl in self.dataloaders:
            for batch in dl:
                yield batch
```
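The wrapper can then be returned from train_dataloader in place of a plain DataLoader. A sketch, reusing the hypothetical `ds_a`/`ds_b` datasets from the earlier example:

```python
    # Inside the LightningDataModule from the sketch above:
    def train_dataloader(self):
        # One epoch now runs through all of ds_a, then all of ds_b;
        # __len__ on the wrapper lets Lightning infer the epoch length.
        return SequentialLoader(
            DataLoader(self.ds_a, batch_size=8),
            DataLoader(self.ds_b, batch_size=8),
        )
```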
Answer selected by kleinzcy