Feature request: Log optimizer and LR scheduler hyperparams when set by LightningCLI #13577

@lodo1995

Description

Discussed in #11258

Originally posted by slinnarsson December 25, 2021
I'm using LightningCLI and setting the optimizer and learning rate with arguments like --optimizer AdamW --optimizer.lr 0.01. When running multiple experiments with varying optimizers and learning rates, I would like these hyperparameters to show up in TensorBoard, but I don't know how to make that happen. Using save_hyperparameters() in the __init__() of my module doesn't work, because the optimizer and learning rate are not parameters of __init__(). As I understand it, the optimizer is patched into the module class by LightningCLI. Can I make LightningCLI log these as hyperparameters?

By the way, the settings are saved correctly in config.yaml; they just don't show up in hparams.yaml.

Update: I found a workaround. I added optimizer and lr as arguments to __init__() even though I don't use them there. They get saved by save_hyperparameters() and show up in the TensorBoard logs.
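
A minimal sketch of this workaround, with illustrative names and defaults (the values would then also have to be supplied at the model level, e.g. via --model.optimizer and --model.lr, for them to reach save_hyperparameters()):

```python
import torch
from pytorch_lightning import LightningModule


class LitModel(LightningModule):
    def __init__(self, hidden_dim: int = 64, optimizer: str = "AdamW", lr: float = 0.01):
        super().__init__()
        # "optimizer" and "lr" are not used inside the module; they are accepted
        # only so that save_hyperparameters() records them in hparams.yaml and
        # in the logger (e.g. TensorBoard). LightningCLI still instantiates the
        # actual optimizer separately.
        self.save_hyperparameters()
        self.layer = torch.nn.Linear(32, hidden_dim)
```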


It would be nice if LightningCLI could automate this, without the need for "hacks" in the model class. In my use case, I specify the optimizer class path and its init args directly in a YAML configuration file that I pass to the CLI.
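
For context, such a config would look roughly like the sketch below (class path and values are illustrative); the request is that these fields also end up in hparams.yaml and the logger automatically:

```yaml
# e.g. config.yaml, passed as: python main.py fit --config config.yaml
optimizer:
  class_path: torch.optim.AdamW
  init_args:
    lr: 0.01
    weight_decay: 0.0001
```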

cc @Borda @carmocca @mauvilsa @akihironitta


Labels: feature (Is an improvement or enhancement), help wanted (Open to be worked on), lightningcli (pl.cli.LightningCLI), pl (Generic label for PyTorch Lightning package)
