Data augmentation and reload_dataloaders_every_epoch #8504
-
Hi! I'm training my neural network with PyTorch Lightning and MONAI (a PyTorch-based framework for deep learning in healthcare imaging). Because my training dataset is small, I need to perform data augmentation using random transforms.

Context: I use MONAI's random transforms in my `Dataset`.

My questions: Did Lightning add a cache mechanism that loads the data only once? Must I use the `reload_dataloaders_every_epoch` flag so that the random transforms are re-applied every epoch?

Thanks in advance for your explanation :)
Replies: 1 comment 1 reply
-
No, the flag just means that we call `LightningModule.train_dataloader()` every epoch if enabled, thus creating a new `DataLoader` instance.

And if I understand correctly, no, there is no caching: the transformations are applied directly in the `Dataset`, so every time an item is consumed from it, the random transforms should be applied regardless of whether the `DataLoader` has or hasn't been recreated.
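To illustrate why recreating the `DataLoader` is irrelevant here, below is a minimal pure-Python sketch of the pattern (no torch or MONAI imports; the class and the noise transform are hypothetical stand-ins for a map-style `Dataset` with a random transform in `__getitem__`):

```python
import random

# Stand-in for a map-style Dataset: the random "transform" runs inside
# __getitem__, so augmentation happens freshly on every item access.
class AugmentingDataset:
    def __init__(self, samples):
        self.samples = samples  # base (un-augmented) data

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        x = self.samples[idx]
        # Hypothetical random transform: additive noise drawn per access.
        return x + random.uniform(-0.1, 0.1)

ds = AugmentingDataset([1.0, 2.0, 3.0])

# Two "epochs" over the SAME dataset object, with nothing recreated:
epoch1 = [ds[i] for i in range(len(ds))]
epoch2 = [ds[i] for i in range(len(ds))]

# The augmented values differ between epochs because the transform is
# re-drawn on every __getitem__ call, independent of any loader object.
print(epoch1 != epoch2)
```

The same logic applies when a real `DataLoader` iterates the dataset: it ultimately calls `__getitem__`, so the random transform runs each epoch whether or not `reload_dataloaders_every_epoch` is set.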