
Does LightningDataModule support multiple train dataloaders like LightningModule? #11024


@kleinzcy @tshu-w,

PyTorch Lightning supports returning multiple dataloaders from the train_dataloader method. They are sampled at the same time: each training step receives a combined batch made up of one batch from every dataloader.
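To illustrate that simultaneous-sampling behavior, here is a minimal pure-Python sketch; the plain lists `loader_a` and `loader_b` are hypothetical stand-ins for real DataLoader instances, and `zip` mimics how the batches line up:

```python
# Two stand-in "dataloaders" (hypothetical; real code would use
# torch.utils.data.DataLoader instances).
loader_a = [[1, 2], [3, 4]]   # batches from dataset A
loader_b = [["x"], ["y"]]     # batches from dataset B

# One batch is drawn from every loader at the same time, so each
# training step sees a tuple of per-loader batches.
combined = list(zip(loader_a, loader_b))
print(combined)  # [([1, 2], ['x']), ([3, 4], ['y'])]
```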

Alternatively, if you want the dataloaders to be consumed sequentially rather than in parallel, you can implement a wrapper that exhausts one dataloader before moving on to the next.

class SequentialLoader:
    """Iterates over several dataloaders, one after the other."""

    def __init__(self, *dataloaders):
        self.dataloaders = dataloaders

    def __len__(self):
        # Total number of batches across all dataloaders.
        return sum(len(dl) for dl in self.dataloaders)

    def __iter__(self):
        # Exhaust each dataloader in turn before moving to the next.
        for dl in self.dataloaders:
            for batch in dl:
                yield batch
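A self-contained usage sketch follows (the class is repeated so the snippet runs standalone); the lists `train_a` and `train_b` are hypothetical stand-ins for real DataLoader objects, since any iterable with `__len__` works:

```python
class SequentialLoader:
    """Iterates over several dataloaders, one after the other."""

    def __init__(self, *dataloaders):
        self.dataloaders = dataloaders

    def __len__(self):
        return sum(len(dl) for dl in self.dataloaders)

    def __iter__(self):
        for dl in self.dataloaders:
            for batch in dl:
                yield batch


train_a = [[0, 1], [2, 3]]   # batches from the first loader
train_b = [[10, 11]]         # batches from the second loader

loader = SequentialLoader(train_a, train_b)
print(len(loader))           # 3 batches in total
print(list(loader))          # all of train_a first, then train_b
```

In a LightningDataModule, you would return such a wrapper from train_dataloader, so the trainer sees a single loader whose batches come from each underlying dataloader in sequence.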

Answer selected by kleinzcy