
LightningModule -> configure_optimizers() return type #20937

@jmoerk123

Description

Bug description

The documentation for configure_optimizers says:

Returns:
Any of these 6 options.

  • …
  • Dictionary, with an "optimizer" key, and (optionally) a "lr_scheduler" key whose value is a single LR scheduler or lr_scheduler_config.
  • None - Fit will run without any optimizer.

When I annotate the return type as the documented one (OptimizerLRScheduler), I get this mypy error:

error: Incompatible types (expression has type "dict[str, object]", TypedDict item "lr_scheduler" has type "LRScheduler | ReduceLROnPlateau | LRSchedulerConfigType")  [typeddict-item]

A possible solution could be to add something like Mapping[Literal["optimizer", "lr_scheduler"], Union[Optimizer, LRSchedulerTypeUnion, LRSchedulerConfig]] to the OptimizerLRScheduler type.
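
For illustration, here is a rough sketch of what that widened alias could look like, using the names from the error message above. This is not the actual Lightning source, only the shape of the suggestion:

from typing import Literal, Mapping, Union

from lightning.pytorch.utilities.types import (
    LRSchedulerConfigType,
    OptimizerLRScheduler,
)
from torch.optim import Optimizer
from torch.optim.lr_scheduler import LRScheduler, ReduceLROnPlateau

# Union of plain scheduler types, mirroring the error message above.
LRSchedulerTypeUnion = Union[LRScheduler, ReduceLROnPlateau]

# The suggested extra member: a Mapping keyed by the two documented dict keys.
ProposedMapping = Mapping[
    Literal["optimizer", "lr_scheduler"],
    Union[Optimizer, LRSchedulerTypeUnion, LRSchedulerConfigType],
]

# Today's OptimizerLRScheduler, widened with the suggested Mapping option.
ProposedOptimizerLRScheduler = Union[OptimizerLRScheduler, ProposedMapping]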

What version are you seeing the problem on?

v2.5

Reproduced in studio

No response

How to reproduce the bug

# Imports reconstructed for a self-contained repro; Aurora and Batch are
# assumed to come from Microsoft's `aurora` package.
from aurora import Aurora, Batch

from lightning.pytorch import LightningModule
from lightning.pytorch.utilities.types import OptimizerLRScheduler
from torch import Tensor, nn
from torch.optim import AdamW
from torch.optim.lr_scheduler import LinearLR


class Model(LightningModule):
    def __init__(
        self,
        model: Aurora,
        loss_fn: nn.Module,
        lr: float,
        warmup_len: int = 1000,
    ) -> None:
        super().__init__()
        self.model = model
        self.loss_fn = loss_fn
        self.lr = lr
        self.warmup_len = warmup_len

    def forward(self, x: Batch) -> Batch:
        return self.model(x)

    def training_step(self, batch: dict[str, Batch]) -> Tensor:
        input, target = batch["input"], batch["target"]
        output = self.forward(input)
        loss = self.loss_fn(output, target)
        self.log(
            "train_loss",
            loss,
            on_step=True,
            on_epoch=True,
            prog_bar=True,
            # sync_dist=True,
        )
        return loss

    def configure_optimizers(self) -> OptimizerLRScheduler:
        optimizer = AdamW(self.parameters(), lr=self.lr, weight_decay=0.0)

        warmup_scheduler = LinearLR(
            optimizer,
            start_factor=1e-8,  # start near zero
            end_factor=1.0,
            total_iters=self.warmup_len,
        )

        lr_scheduler_config = {
            "scheduler": warmup_scheduler,
            "interval": "step",
            "frequency": 1,
        }

        return {"optimizer": optimizer, "lr_scheduler": lr_scheduler_config}

Error messages and logs

error: Incompatible types (expression has type "dict[str, object]", TypedDict item "lr_scheduler" has type "LRScheduler | ReduceLROnPlateau | LRSchedulerConfigType")  [typeddict-item]

Environment

Current environment
#- PyTorch Lightning Version: 2.5.1.post0
#- PyTorch Version: 2.7.1+cu128
#- Python version: 3.13.1
#- OS: Linux
#- CUDA/cuDNN version: 12.8
#- How you installed Lightning: pip

More info

If I have overlooked something or am doing anything wrong, please let me know.
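
In the meantime, the error appears to come from mypy widening the un-annotated intermediate dict to dict[str, object] before it is placed under the "lr_scheduler" key. A minimal workaround sketch, assuming LRSchedulerConfigType (the TypedDict named in the error) is importable from lightning.pytorch.utilities.types:

from lightning.pytorch import LightningModule
from lightning.pytorch.utilities.types import (
    LRSchedulerConfigType,
    OptimizerLRScheduler,
)
from torch import nn
from torch.optim import AdamW
from torch.optim.lr_scheduler import LinearLR


class WorkaroundModel(LightningModule):
    def __init__(self, lr: float = 1e-3, warmup_len: int = 1000) -> None:
        super().__init__()
        self.layer = nn.Linear(4, 4)  # stand-in for the real model
        self.lr = lr
        self.warmup_len = warmup_len

    def configure_optimizers(self) -> OptimizerLRScheduler:
        optimizer = AdamW(self.parameters(), lr=self.lr, weight_decay=0.0)
        warmup_scheduler = LinearLR(
            optimizer,
            start_factor=1e-8,
            end_factor=1.0,
            total_iters=self.warmup_len,
        )
        # The explicit TypedDict annotation is the workaround: without it,
        # mypy infers dict[str, object] for this variable.
        lr_scheduler_config: LRSchedulerConfigType = {
            "scheduler": warmup_scheduler,
            "interval": "step",
            "frequency": 1,
        }
        return {"optimizer": optimizer, "lr_scheduler": lr_scheduler_config}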

Metadata

Labels

bug (Something isn't working) · needs triage (Waiting to be triaged by maintainers) · ver: 2.5.x
