
How to change optimizer and lr scheduler in the middle of training #11508

@icedpanda Lightning supports multiple optimizers. You can define multiple optimizers and LR schedulers in LightningModule.configure_optimizers(), for example:

import torch
from pytorch_lightning import LightningModule

class yourModule(LightningModule):
    def __init__(self):
        super().__init__()
        ...

    def configure_optimizers(self):
        # lr, step_size and gamma values here are illustrative placeholders
        optimizer1 = torch.optim.Adam(self.parameters(), lr=1e-3)
        optimizer2 = torch.optim.Adam(self.parameters(), lr=1e-3)
        lr_scheduler1 = torch.optim.lr_scheduler.StepLR(optimizer1, step_size=10)
        lr_scheduler2 = torch.optim.lr_scheduler.ExponentialLR(optimizer2, gamma=0.95)
        return [optimizer1, optimizer2], [lr_scheduler1, lr_scheduler2]

    def training_step(self, batch, batch_idx, optimizer_idx):
        if optimizer_idx == 0:
            # do training_step when …
            ...
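
For example, to switch optimizers in the middle of training you can branch on self.current_epoch inside training_step and return None for the optimizer you want to skip (under automatic optimization, returning None skips the backward/step for that optimizer on that batch). This is only a minimal sketch assuming Lightning 1.x automatic optimization with multiple optimizers; the switch_epoch threshold and the _compute_loss helper are hypothetical placeholders:

class yourModule(LightningModule):
    def __init__(self, switch_epoch=10):
        super().__init__()
        # hypothetical hyperparameter: epoch at which to switch optimizers
        self.switch_epoch = switch_epoch

    def training_step(self, batch, batch_idx, optimizer_idx):
        # Use optimizer1 (and lr_scheduler1) before switch_epoch, optimizer2 afterwards.
        if optimizer_idx == 0 and self.current_epoch < self.switch_epoch:
            return self._compute_loss(batch)  # _compute_loss is a placeholder for your loss computation
        if optimizer_idx == 1 and self.current_epoch >= self.switch_epoch:
            return self._compute_loss(batch)
        # Returning None tells Lightning to skip the step for this optimizer on this batch.
        return None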

Answer selected by icedpanda