How to handle pretrained models without training them #8778
Answered by staniPetrox
staniPetrox asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Hey gang! I have written an Encoder model and a Decoder model, and I want to train them separately. However, when I pass my Decoder a pretrained Encoder as a hyperparameter, how do I make sure the Encoder will not be trained? Thanks in advance!
Answered by staniPetrox on Aug 6, 2021:
Ok, I found out from other forums that one should use `.freeze()`; in this case: `self.encoder_model.freeze()`.
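For context, here is a minimal sketch of how this can look in practice. It relies on PyTorch Lightning's `LightningModule.freeze()`, which sets `requires_grad=False` on every parameter and switches the module to eval mode. The `Decoder` class, the layer sizes, and the `pretrained_encoder` usage are illustrative placeholders, not code from the original thread.

```python
import torch
import pytorch_lightning as pl


class Decoder(pl.LightningModule):
    """Decoder that receives an already-trained encoder and keeps it frozen."""

    def __init__(self, encoder: pl.LightningModule):
        super().__init__()
        self.encoder_model = encoder
        # freeze() disables gradients for all encoder parameters and puts
        # the encoder in eval mode, so it is not updated while the decoder trains.
        self.encoder_model.freeze()
        # Hypothetical head; sizes depend on your encoder's output dimension.
        self.decoder_head = torch.nn.Linear(128, 10)

    def forward(self, x):
        # no_grad is redundant after freeze(), but it makes the intent
        # explicit and avoids building a graph through the encoder.
        with torch.no_grad():
            z = self.encoder_model(x)
        return self.decoder_head(z)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.cross_entropy(self(x), y)

    def configure_optimizers(self):
        # Only parameters with requires_grad=True (the decoder's) are optimized.
        return torch.optim.Adam(
            (p for p in self.parameters() if p.requires_grad), lr=1e-3
        )


# Hypothetical usage: load a previously trained encoder, then train the decoder.
# pretrained_encoder = MyEncoder.load_from_checkpoint("encoder.ckpt")
# pl.Trainer(max_epochs=10).fit(Decoder(pretrained_encoder), train_dataloader)
```

If the encoder is a plain `torch.nn.Module` rather than a `LightningModule`, the equivalent is to loop over `encoder.parameters()` and set `p.requires_grad = False` yourself; `freeze()` is a Lightning convenience that does the same and also calls `eval()`.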