Saving a model checkpoint file with updated parameters #13354
Unanswered
sunburntfish
asked this question in
Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 0 comments
What's the correct way to update a saved checkpoint file if I change the hyperparameters? For example, if I have something like this:
I can then reload the checkpoint no problem with:
But if I change my constructor:
the call to `load_from_checkpoint` will complain that `param2` is missing from the parameter list. I can load it if I pass the parameter at load time:
What I want at this point is a method that just saves the updated model with the new parameter to a new checkpoint file, e.g.:
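i.e. a call like this (hypothetical — as far as I can tell no such method exists on `LightningModule`):

```python
model = MyModel.load_from_checkpoint("my_model.ckpt", param2=0.5)
model.save_checkpoint("my_model_updated.ckpt")  # wished-for API, doesn't exist
```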
But it seems the only way to save a checkpoint is through the `Trainer`, after the model has been coupled with it internally. This is really awkward, because I don't want to call `trainer.fit(model)`, and if I use `trainer.test(model)` and then save with `trainer.save_checkpoint(checkpoint_file)`, the file is much smaller than my original checkpoint (I think because it only saves the weights?).

What's the right way to update the model checkpoint file if I change the model parameters?
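For now, the closest workaround I've found is to edit the checkpoint file directly, since a Lightning checkpoint is just a dictionary serialized with `torch.save`. A sketch (the fake checkpoint at the top is only there so the snippet runs on its own; paths and values are illustrative):

```python
import torch

# stand-in for an existing checkpoint file
torch.save({"state_dict": {}, "hyper_parameters": {"param1": 8}}, "my_model.ckpt")

# load the raw checkpoint dict, add the new hyperparameter, save it back;
# this preserves everything else in the checkpoint, not just the weights
checkpoint = torch.load("my_model.ckpt")
checkpoint["hyper_parameters"]["param2"] = 0.5
torch.save(checkpoint, "my_model_updated.ckpt")

updated = torch.load("my_model_updated.ckpt")
print(updated["hyper_parameters"])  # -> {'param1': 8, 'param2': 0.5}
```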