Why is trainer.init_optimizers called in so many places?
#10173
Unanswered
daniellepintz asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 1 comment 2 replies
I see we call trainer.init_optimizers in many places in the code. I am wondering: why don't we just call it once, when we initialize the trainer, and then have the existing call sites instead access trainer.optimizers, trainer.lr_schedulers, and trainer.optimizer_frequencies?

It is defined in the TrainerOptimizersMixin:
https://github.com/PyTorchLightning/pytorch-lightning/blob/c33df2639f19d49c5e7520294e3221efe402d684/pytorch_lightning/trainer/optimizers.py#L32
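
For illustration, here is a minimal sketch of what that proposal could look like. This is hypothetical, simplified code rather than Lightning's actual implementation, and it assumes init_optimizers returns the (optimizers, lr_schedulers, optimizer_frequencies) tuple as in the linked definition:

```python
import torch

# Hypothetical, simplified sketch of the proposal (not the actual
# Lightning code). Assumes init_optimizers returns the tuple
# (optimizers, lr_schedulers, optimizer_frequencies).

class Trainer:
    def init_optimizers(self, model):
        # Stand-in for TrainerOptimizersMixin.init_optimizers.
        return [torch.optim.SGD(model.parameters(), lr=0.1)], [], []

    def setup(self, model):
        # Proposed: call init_optimizers exactly once, when the trainer
        # is initialized, and cache the results as attributes ...
        (
            self.optimizers,
            self.lr_schedulers,
            self.optimizer_frequencies,
        ) = self.init_optimizers(model)

# ... so call sites like the ones listed below would just read the
# cached attributes instead of re-running init_optimizers:
def call_site(trainer):
    for optimizer in trainer.optimizers:
        print(optimizer)

trainer = Trainer()
trainer.setup(torch.nn.Linear(2, 2))
call_site(trainer)
```

Whether every call site could actually switch to the cached attributes presumably depends on whether any of them need to re-run the setup after modifying the model.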
It is called in these places:
1. https://github.com/PyTorchLightning/pytorch-lightning/blob/c33df2639f19d49c5e7520294e3221efe402d684/pytorch_lightning/plugins/training_type/deepspeed.py#L458
2. https://github.com/PyTorchLightning/pytorch-lightning/blob/c33df2639f19d49c5e7520294e3221efe402d684/pytorch_lightning/plugins/training_type/training_type_plugin.py#L248
3. https://github.com/PyTorchLightning/pytorch-lightning/blob/c33df2639f19d49c5e7520294e3221efe402d684/pytorch_lightning/tuner/lr_finder.py#L109
4. https://github.com/PyTorchLightning/pytorch-lightning/blob/c33df2639f19d49c5e7520294e3221efe402d684/tests/helpers/pipelines.py#L87
1 and 3 seem OK since they are just sequential calls.
2 replies