Custom dataloader behaviour #7812
Unanswered
juneskiafc asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 1 comment 1 reply
-
@juneskiafc can you implement the dataloader update/replacement in the …
-
Hi,
I'm looking to change the dataloader every epoch.
The context is multiple instance learning, where the top k patches are selected from each large image, and these patches are different across epochs.
To do this, I first need to do one complete pass over all patches in an image. I do this with a separate dataloader, in the `on_epoch_start` hook. Then, after this inference pass, I want to use its results to reload the dataloader inside the `LightningModule` so that the regular `training_step`, `validation_step`, and `test_step` will read the top k patches only. However, I don't know of an easy way to do this with pytorch-lightning. Does anyone have any ideas? It would be greatly appreciated.
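For anyone landing here, a minimal sketch of one possible approach (not an official recipe from this thread): run the full scoring pass in `on_train_epoch_start` and let the Trainer rebuild the training dataloader each epoch via `reload_dataloaders_every_n_epochs=1` (older releases exposed this as the boolean `reload_dataloaders_every_epoch`), which re-invokes `train_dataloader()` every epoch instead of caching it once. `TopKPatchDataset`, `MILModule`, the scoring model, and the flat top-k selection (per-image grouping is glossed over) are hypothetical placeholders. Also note that, depending on the Lightning version, the dataloader reload may happen before `on_train_epoch_start` runs, so a selection computed there can take effect only in the following epoch; computing it in `on_train_epoch_end` instead avoids that off-by-one.

```python
import torch
import pytorch_lightning as pl
from torch.utils.data import DataLoader, Dataset


class TopKPatchDataset(Dataset):
    """Serves only the patches selected for the current epoch (hypothetical helper)."""

    def __init__(self, all_patches, selected_indices):
        self.all_patches = all_patches
        self.selected_indices = selected_indices

    def __len__(self):
        return len(self.selected_indices)

    def __getitem__(self, idx):
        return self.all_patches[self.selected_indices[idx]]


class MILModule(pl.LightningModule):
    def __init__(self, model, all_patches, k=16):
        super().__init__()
        self.model = model                 # patch-scoring / classification network
        self.all_patches = all_patches     # map-style Dataset of every patch
        self.k = k
        self.selected = list(range(min(k, len(all_patches))))  # fallback for epoch 0

    def on_train_epoch_start(self):
        # Full inference pass over all patches, using a separate dataloader.
        loader = DataLoader(self.all_patches, batch_size=64, shuffle=False)
        scores = []
        self.model.eval()
        with torch.no_grad():
            for batch in loader:
                scores.append(self.model(batch.to(self.device)).squeeze(-1).cpu())
        self.model.train()
        scores = torch.cat(scores)
        # Keep only the top-k patch indices for this epoch's training loader.
        self.selected = torch.topk(scores, self.k).indices.tolist()

    def train_dataloader(self):
        # Re-invoked every epoch because of reload_dataloaders_every_n_epochs=1,
        # so it always reflects the most recent top-k selection.
        return DataLoader(TopKPatchDataset(self.all_patches, self.selected),
                          batch_size=8, shuffle=True)

    def training_step(self, batch, batch_idx):
        # Placeholder objective; substitute the real MIL loss here.
        loss = self.model(batch).mean()
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.model.parameters(), lr=1e-4)


# reload_dataloaders_every_n_epochs=1 makes the Trainer call train_dataloader()
# again each epoch instead of caching it once at the start of fit().
trainer = pl.Trainer(max_epochs=10, reload_dataloaders_every_n_epochs=1)
```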