Clarification on reload_dataloaders_every_epoch #6635
Asked by thingsofleon in Lightning Trainer API: Trainer, LightningModule, LightningDataModule. Answered by carmocca.
I have a PyTorch Lightning DataModule instance that defines train_dataloader, val_dataloader, and test_dataloader. I am currently using a custom callback to reload the train_dataloader so that it resamples the data. I saw that there is a Trainer flag called reload_dataloaders_every_epoch, soon to be replaced by reload_dataloaders_every_n_epochs. Do these flags reload just the train_dataloader, or all three?
Answered by carmocca, Apr 20, 2021:
Only the train and validation dataloaders:
https://github.com/PyTorchLightning/pytorch-lightning/blob/e4f3a8d3dd534d4ec2fe094280272513e652fba9/pytorch_lightning/trainer/training_loop.py#L168-L170
https://github.com/PyTorchLightning/pytorch-lightning/blob/e4f3a8d3dd534d4ec2fe094280272513e652fba9/pytorch_lightning/trainer/training_loop.py#L203-L207
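In practice this means that with `Trainer(reload_dataloaders_every_epoch=True)`, Lightning calls your DataModule's `train_dataloader()` again at the start of each epoch, so a dataloader that resamples on construction will see fresh data every epoch. A minimal pure-Python sketch of that reload behavior, assuming nothing about Lightning internals (the class and function names below are illustrative, not the real Lightning API):

```python
import random

# Hypothetical stand-in for a LightningDataModule whose train_dataloader
# resamples the data each time it is called.
class ResamplingDataModule:
    def __init__(self, pool):
        self.pool = pool
        self.reload_count = 0

    def train_dataloader(self):
        # Each call draws a fresh sample, so reloading changes the epoch's data.
        self.reload_count += 1
        return random.sample(self.pool, k=3)

def run_epochs(dm, n_epochs, reload_every_epoch):
    # Sketch of what the flag implies: when it is set, train_dataloader()
    # is re-invoked once per epoch; otherwise it is called only once up front.
    loader = dm.train_dataloader()
    for epoch in range(n_epochs):
        if reload_every_epoch and epoch > 0:
            loader = dm.train_dataloader()
        # ... training steps over `loader` would go here ...

dm = ResamplingDataModule(list(range(10)))
run_epochs(dm, n_epochs=4, reload_every_epoch=True)
print(dm.reload_count)  # 4 reloads: one per epoch
```

With the flag off, the same loop would call `train_dataloader()` only once, which is why a resampling dataloader otherwise needs a custom callback like the one described in the question.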