How to log optimizer and learning rate when set by LightningCLI #11258
Unanswered
slinnarsson asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 1 comment
- To have some custom logging of hyperparameters, I would recommend extending …
I'm using LightningCLI and setting the optimizer and learning rate with arguments like `--optimizer AdamW --optimizer.lr 0.01`. When running multiple runs with varying optimizers and learning rates, I would like these hyperparameters to show up in TensorBoard, but I don't know how to make that happen. Using `save_hyperparameters()` in the `__init__()` of my module doesn't work, because the optimizer and learning rate are not parameters to `__init__()`. As I understand it, the optimizer is patched into the module class by LightningCLI. Can I make LightningCLI log these as hyperparameters? Btw, the settings are saved correctly in config.yaml; they just don't show up in hparams.yaml.
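For reference, here is a minimal sketch of the kind of setup being described; the module, file name, and data are placeholders, and the exact `LightningCLI` import path depends on the Lightning version:

```python
# minimal_cli.py -- illustrative sketch only, not the poster's actual code
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl
from pytorch_lightning.cli import LightningCLI  # older releases: pytorch_lightning.utilities.cli

class MyModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(32, 1)
        # Captures only __init__ arguments, so the optimizer and lr chosen on
        # the command line do not end up in hparams.yaml.
        self.save_hyperparameters()

    def training_step(self, batch, batch_idx):
        x, y = batch
        return nn.functional.mse_loss(self.layer(x), y)

    def train_dataloader(self):
        # Random data just to make the sketch self-contained.
        x, y = torch.randn(64, 32), torch.randn(64, 1)
        return DataLoader(TensorDataset(x, y), batch_size=8)

if __name__ == "__main__":
    # No configure_optimizers() here: as described above, LightningCLI patches
    # one in based on the --optimizer / --optimizer.lr command-line arguments.
    LightningCLI(MyModel)
```

Run as, for example, `python minimal_cli.py fit --optimizer AdamW --optimizer.lr 0.01`; the chosen settings land in config.yaml but not in hparams.yaml, which is the behaviour described above.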
Update: I found a workaround; I added `optimizer` and `lr` as arguments to `__init__()` even though I don't use them there. They get saved by `save_hyperparameters()` and show up in the TensorBoard logs.
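A sketch of that workaround, with illustrative names; note that `save_hyperparameters()` records whatever reaches `__init__()`, so the stored values come from the defaults or from `--model.optimizer` / `--model.lr` (or from linking those arguments to the `--optimizer` ones), not from the optimizer instance LightningCLI builds:

```python
import pytorch_lightning as pl
from torch import nn

class MyModel(pl.LightningModule):
    # 'optimizer' and 'lr' are accepted purely so that save_hyperparameters()
    # records them; configure_optimizers() is still supplied by LightningCLI.
    def __init__(self, optimizer: str = "AdamW", lr: float = 0.01):
        super().__init__()
        self.layer = nn.Linear(32, 1)
        # Writes optimizer and lr into hparams.yaml and the TensorBoard hparams view.
        self.save_hyperparameters()
```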