Reload train dataloader when using limit_train_batches #13001
-
Hi, I am using an IterableDataset that generates infinite samples, and I want it to change its behaviour as training progresses. To do so, I need to access information about the current training step from inside the Dataset. Thanks in advance.
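For context, here is a minimal sketch of the kind of setup described: an infinite stream whose sampling behaviour depends on the current training step. The class name `CurriculumStream` and the shared `step_ref` dict are hypothetical, and plain Python is used in place of `torch.utils.data.IterableDataset` to keep the sketch self-contained; the training loop would be responsible for updating the shared counter.

```python
import itertools

class CurriculumStream:
    """Hypothetical infinite sample stream whose behaviour depends on
    the current training step, read from a shared mutable reference."""

    def __init__(self, step_ref):
        # step_ref is e.g. {"step": 0}, updated by the training loop
        self.step_ref = step_ref

    def __iter__(self):
        for i in itertools.count():
            # difficulty grows with training progress, capped at 10
            difficulty = min(self.step_ref["step"] // 100, 10)
            yield {"sample_id": i, "difficulty": difficulty}

step_ref = {"step": 0}
stream = iter(CurriculumStream(step_ref))
first = next(stream)       # difficulty 0 at step 0
step_ref["step"] = 500     # training loop advances the counter
later = next(stream)       # difficulty 5 after 500 steps
```

This avoids the dataloader-reload question entirely: the dataset reads the live counter on every sample instead of being re-created.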
Replies: 2 comments
-
This should not happen. I just tried a simple example using BoringModel, and it recalls the train dataloader hook correctly even when using `limit_train_batches`.
-
I finally could solve it. The problem was that I passed the train and validation dataloaders as arguments to `trainer.fit()`, so `train_dataloader()` wasn't called. I ended up implementing `__len__()` in the IterableDataset instead of using `limit_train_batches`.
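The fix described in the reply can be sketched as follows: an otherwise unbounded stream that reports a finite length via `__len__()`, so a consumer (such as a Trainer) can treat each epoch as finite without needing `limit_train_batches`. The class name `BoundedStream` and the `steps_per_epoch` parameter are hypothetical, and plain Python stands in for `torch.utils.data.IterableDataset` to keep the example self-contained.

```python
import itertools

class BoundedStream:
    """Hypothetical stream that caps each epoch via __len__,
    as described in the reply above (sketch, no Lightning required)."""

    def __init__(self, steps_per_epoch):
        self.steps_per_epoch = steps_per_epoch

    def __len__(self):
        # A consumer can use this to decide how long an "epoch" is
        return self.steps_per_epoch

    def __iter__(self):
        # Yield exactly one epoch's worth of samples per pass
        return itertools.islice(itertools.count(), self.steps_per_epoch)

ds = BoundedStream(steps_per_epoch=4)
epoch_samples = list(iter(ds))  # one bounded epoch: [0, 1, 2, 3]
```

Note that for this to take effect in Lightning, the dataloader must come from the `train_dataloader()` hook rather than be passed directly to `trainer.fit()`, as the reply points out.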