Save a model but without all parameters #14206
Unanswered
aRI0U asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 1 comment
-
hi @aRI0U, you can save your classifier module separately. Please check the example below:

class ClassifierModule(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.encoder = EncoderModule()  # this is a PL module
        self.lr = 1e-3
        self.save_hyperparameters()
        self.criterion = torch.nn.MSELoss()
        self.classifier = LinearModel()

    def forward(self, x):
        return self.classifier(self.encoder(x))

# save the classifier only
model = ClassifierModule()
torch.save(model.classifier, "classifier.pt")
-
Hi,
I'm working on a representation learning project and I evaluate my models with classification downstream tasks. My classifier is a LightningModule which takes as argument a trained model (referred to as the encoder) with frozen parameters and then trains a linear model to classify from the outputs of this encoder. The encoder itself is also a LightningModule whose parameters are already stored in a checkpoint. How can I save the parameters of the linear classifier without keeping all the params of the encoder?
In fact, since the encoder is huge and already stored somewhere, it's a big waste of space to store its parameters again in the checkpoint of the linear classifier. However, I would still like to be able to load the linear classifier from its checkpoint, maybe by storing the path to the encoder checkpoint inside the linear classifier checkpoint, or something like that. Do you have an idea how to achieve this?
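For reference, a rough sketch of the "store the encoder checkpoint path" idea, assuming Lightning's on_save_checkpoint hook and load_from_checkpoint(..., strict=False) can be used this way; EncoderModule, LinearModel and the paths are placeholders:

class ClassifierModule(pl.LightningModule):
    def __init__(self, encoder_ckpt_path: str):
        super().__init__()
        self.save_hyperparameters()  # encoder_ckpt_path goes into the checkpoint hparams
        self.encoder = EncoderModule.load_from_checkpoint(encoder_ckpt_path)
        self.encoder.freeze()
        self.classifier = LinearModel()

    def on_save_checkpoint(self, checkpoint):
        # drop the frozen encoder weights so the checkpoint only holds the classifier
        checkpoint["state_dict"] = {
            k: v for k, v in checkpoint["state_dict"].items()
            if not k.startswith("encoder.")
        }

# the encoder is rebuilt from its own checkpoint via the saved hparam,
# and strict=False tolerates the missing encoder.* keys
model = ClassifierModule.load_from_checkpoint("classifier.ckpt", strict=False)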
Thanks a lot!