Why are optimizers set up inside Modules rather than in Trainers?
#15414
Unanswered
augustebaum asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
I hesitated to post this in the issue board with the design tag. It feels strange to attach an optimizer to a Module and, e.g., pass the learning rate to the Module at initialisation, rather than pass the desired optimizer to the Trainer at training time. Can anyone explain the current approach, or direct me to some other resources?

The reason I ask is that I am currently migrating my research project on auto-encoders from pythae to lightning, and in pythae the separation seems more logical to me.
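For concreteness, here is a minimal sketch of the Lightning pattern being described (a toy autoencoder; the module and hyperparameter names are illustrative, not from the actual project): the learning rate is a constructor argument of the LightningModule, and the optimizer is built inside configure_optimizers() rather than being passed to the Trainer.

```python
import torch
from torch import nn
import pytorch_lightning as pl


class LitAutoEncoder(pl.LightningModule):
    def __init__(self, lr: float = 1e-3):
        super().__init__()
        self.save_hyperparameters()  # records `lr` on self.hparams
        self.encoder = nn.Sequential(nn.Linear(28 * 28, 64), nn.ReLU(), nn.Linear(64, 3))
        self.decoder = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 28 * 28))

    def training_step(self, batch, batch_idx):
        x, _ = batch
        x = x.view(x.size(0), -1)
        x_hat = self.decoder(self.encoder(x))
        return nn.functional.mse_loss(x_hat, x)

    def configure_optimizers(self):
        # The optimizer is constructed by the module, not handed to the Trainer.
        return torch.optim.Adam(self.parameters(), lr=self.hparams.lr)


# The Trainer only carries loop-level settings (epochs, devices, ...):
# pl.Trainer(max_epochs=10).fit(LitAutoEncoder(lr=1e-3), train_dataloaders=...)
```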
Replies: 1 comment 1 reply

- IMHO, the optimizer is related to the model architecture; e.g. transformer and non-transformer architectures may require different optimizers.
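A hypothetical illustration of that point (the two modules and the AdamW/SGD split are made up for the example): because each LightningModule implements configure_optimizers, an architecture-specific optimizer recipe travels with the model, and the same Trainer can fit either module unchanged.

```python
import torch
import pytorch_lightning as pl


class TransformerClassifier(pl.LightningModule):
    """Transformer-style model; commonly trained with AdamW and no weight decay
    on biases/LayerNorm weights. (training_step omitted for brevity.)"""

    def __init__(self, lr: float = 3e-4, weight_decay: float = 0.01):
        super().__init__()
        self.save_hyperparameters()
        self.backbone = torch.nn.TransformerEncoder(
            torch.nn.TransformerEncoderLayer(d_model=128, nhead=4, batch_first=True),
            num_layers=2,
        )

    def configure_optimizers(self):
        # Split parameters into decay / no-decay groups, a common transformer recipe.
        decay, no_decay = [], []
        for param in self.parameters():
            (no_decay if param.ndim <= 1 else decay).append(param)
        return torch.optim.AdamW(
            [
                {"params": decay, "weight_decay": self.hparams.weight_decay},
                {"params": no_decay, "weight_decay": 0.0},
            ],
            lr=self.hparams.lr,
        )


class SmallConvNet(pl.LightningModule):
    """Small CNN; plain SGD with momentum is often sufficient.
    (training_step omitted for brevity.)"""

    def __init__(self, lr: float = 0.1):
        super().__init__()
        self.save_hyperparameters()
        self.backbone = torch.nn.Sequential(torch.nn.Conv2d(3, 16, 3), torch.nn.ReLU())

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=self.hparams.lr, momentum=0.9)


# The same Trainer works for both; the optimizer choice stays with the architecture:
# pl.Trainer(max_epochs=5).fit(TransformerClassifier(), train_dataloaders=...)
```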