
How to switch data loaders between epochs while using multiple optimizer #14420

One simple solution is to pass reload_dataloaders_every_n_epochs=1 to the Trainer and branch inside training_step based on current_epoch.

def training_step(self, batch, batch_idx, optimizer_idx):
    # Train optimizer 0 on even epochs and optimizer 1 on odd epochs.
    if self.current_epoch % 2 == optimizer_idx:
        ...  # compute the loss for this optimizer's objective
        return loss
    # Falling through returns None, so the other optimizer is skipped.

In the other case, training_step returns None, which skips the update for the other optimizer. You might get a warning about the skipped step, but that's fine.
Your solution works as well, but it loads data from both dataloaders every epoch, even though only one of them is needed.
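
For completeness, here is a minimal sketch of the full setup, assuming the pre-2.0 Lightning API (where automatic optimization passes optimizer_idx into training_step when multiple optimizers are configured). The datasets, models, and hyperparameters below are hypothetical placeholders; the point is that with reload_dataloaders_every_n_epochs=1, Lightning calls train_dataloader() again at the start of each epoch, so it can return a different loader depending on current_epoch:

import torch
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl

class AlternatingModule(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.model_a = torch.nn.Linear(10, 1)
        self.model_b = torch.nn.Linear(10, 1)
        # Hypothetical placeholder datasets; substitute your own.
        self.dataset_a = TensorDataset(torch.randn(64, 10), torch.randn(64, 1))
        self.dataset_b = TensorDataset(torch.randn(64, 10), torch.randn(64, 1))

    def train_dataloader(self):
        # Re-invoked every epoch because of reload_dataloaders_every_n_epochs=1,
        # so the returned loader can alternate with current_epoch.
        if self.current_epoch % 2 == 0:
            return DataLoader(self.dataset_a, batch_size=16)
        return DataLoader(self.dataset_b, batch_size=16)

    def training_step(self, batch, batch_idx, optimizer_idx):
        x, y = batch
        # Optimizer 0 trains on even epochs (dataset_a), optimizer 1 on odd
        # epochs (dataset_b); the other call returns None and is skipped.
        if self.current_epoch % 2 == optimizer_idx:
            model = self.model_a if optimizer_idx == 0 else self.model_b
            return F.mse_loss(model(x), y)

    def configure_optimizers(self):
        # Returning two optimizers is what makes Lightning pass
        # optimizer_idx into training_step.
        return [
            torch.optim.Adam(self.model_a.parameters(), lr=1e-3),
            torch.optim.Adam(self.model_b.parameters(), lr=1e-3),
        ]

trainer = pl.Trainer(max_epochs=4, reload_dataloaders_every_n_epochs=1)
trainer.fit(AlternatingModule())

This way each epoch only loads the one dataloader it actually trains on, instead of pulling batches from both.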

Answer selected by icedpanda