How to reload data loader inside an epoch. #7977
Unanswered · jfkback asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 1 comment
I think you can bypass it by reloading the dataloader at validation end and setting the Trainer's argument […]
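A minimal sketch of the suggestion above: rebuild the data in a validation-end hook, so that running validation mid-epoch (e.g. with the real Trainer argument `val_check_interval`) also refreshes the data mid-epoch. The class below is a plain-Python illustration whose method name mirrors Lightning's `on_validation_epoch_end` hook; it is not the real API, and `fetch_fn` is a hypothetical user-supplied function.

```python
class RefreshingModule:
    """Illustration of refreshing data whenever validation finishes.

    In Lightning, `on_validation_epoch_end` is a real LightningModule
    hook; with Trainer(val_check_interval=0.25), validation (and thus
    this refresh) would run four times per training epoch.
    """

    def __init__(self, fetch_fn):
        self.fetch_fn = fetch_fn        # hypothetical: returns fresh samples
        self.data = fetch_fn()          # initial data

    def on_validation_epoch_end(self):
        # Re-fetch the data each time validation completes.
        self.data = self.fetch_fn()
```

Whether the training loop picks up the new data mid-epoch still depends on how the dataloader iterates over it, so this is a sketch of the idea rather than a drop-in solution.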
Original question: I am using PL with a PyTorch DataLoader that I pass to the Trainer. I would like the dataloader to be refreshed during training so that new data is used. I would just use the reload_dataloaders_every_epoch option, but I want the reload to happen in the middle of an epoch: one epoch is very long, and I want the data to update faster than an epoch takes to complete. Is there any way to do this? I understand it might be difficult, since the dataloader is being iterated over during training, so swapping it out could cause problems. Another possibility would be to shorten the epochs so that each epoch covers only a small part of the total data. Is there a way to partition a dataloader like this?
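The partitioning idea at the end of the question can be sketched in plain Python. The helper below is hypothetical (not part of Lightning); the intent is that each shard of indices could back a short "epoch", e.g. by wrapping the dataset in the real `torch.utils.data.Subset(dataset, shard)` inside `train_dataloader()` and setting `reload_dataloaders_every_epoch=True` so the loader is rebuilt over the next shard each epoch.

```python
def shard_indices(n_items, n_shards):
    """Split range(n_items) into n_shards contiguous index lists.

    Earlier shards get one extra item when n_items does not divide
    evenly, so every index appears exactly once across all shards.
    """
    base, rem = divmod(n_items, n_shards)
    shards, start = [], 0
    for i in range(n_shards):
        size = base + (1 if i < rem else 0)
        shards.append(list(range(start, start + size)))
        start += size
    return shards
```

With this approach, `max_epochs` would need to be scaled up by the number of shards so the same total amount of data is seen. Alternatively, the Trainer's `limit_train_batches` argument caps how many batches count as one epoch, which is another way to shorten epochs without touching the dataset.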