Description
Discussed in #11258
Originally posted by slinnarsson December 25, 2021
I'm using `LightningCLI` and setting the optimizer and learning rate with arguments like `--optimizer AdamW --optimizer.lr 0.01`. When running multiple runs with varying optimizers and learning rates, I would like these hyperparameters to show up in TensorBoard, but I don't know how to make that happen. Using `save_hyperparameters()` in the `__init__()` of my module doesn't work, because the optimizer and learning rate are not parameters of `__init__()`. As I understand it, the optimizer is patched into the module class by `LightningCLI`. Can I make `LightningCLI` log these as hyperparameters?

By the way, the settings are saved correctly in `config.yaml`; they just don't show up in `hparams.yaml`.
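For context, a minimal sketch of the kind of module this refers to (class and argument names are purely illustrative): since the optimizer and learning rate come from the CLI rather than from `__init__`, `save_hyperparameters()` never sees them.

```python
import pytorch_lightning as pl
from torch import nn


class MyModel(pl.LightningModule):
    """Illustrative module; the optimizer/lr are configured by LightningCLI."""

    def __init__(self, hidden_dim: int = 64):
        super().__init__()
        # Only `hidden_dim` is captured here. The optimizer and lr passed
        # on the command line (--optimizer AdamW --optimizer.lr 0.01) are
        # not __init__ arguments, so they are absent from hparams.yaml
        # and from the TensorBoard HPARAMS tab.
        self.save_hyperparameters()
        self.layer = nn.Linear(hidden_dim, 1)
```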
Update: I found a workaround; I added `optimizer` and `lr` as arguments to `__init__()`, even if I don't use them there. They get saved by `save_hyperparameters()` and show up in the TensorBoard logs.
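Roughly, the workaround looks like this (a sketch with illustrative names and defaults, not the exact code):

```python
import pytorch_lightning as pl
from torch import nn


class MyModel(pl.LightningModule):
    def __init__(self, hidden_dim: int = 64, optimizer: str = "AdamW", lr: float = 0.01):
        super().__init__()
        # `optimizer` and `lr` are accepted only so that
        # save_hyperparameters() records them; LightningCLI still creates
        # and attaches the actual optimizer.
        self.save_hyperparameters()
        self.layer = nn.Linear(hidden_dim, 1)
```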
It would be nice if `LightningCLI` could automate this, without the need for "hacks" in the model class. In my use case, I specify the optimizer class path and its init args directly in a YAML configuration file that I pass to the CLI.
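For reference, the YAML I pass looks roughly like this (values are illustrative):

```yaml
# config.yaml, passed with e.g. `python train.py fit --config config.yaml`
optimizer:
  class_path: torch.optim.AdamW
  init_args:
    lr: 0.01
    weight_decay: 0.05
```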