Error with loading model checkpoint #12399

hey @dmandair!

did you call self.save_hyperparameters() inside your LM's __init__? Otherwise the hyperparameters won't be saved in the checkpoint, and you'll need to provide them again when loading: LMModel.load_from_checkpoint(..., encoder=encoder, encoder_out_dim=encoder_out_dim, ...).
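
For reference, a minimal sketch of that pattern (the lr argument and the Linear head here are just placeholders, not from your model):

```python
import pytorch_lightning as pl
import torch.nn as nn

class LMModel(pl.LightningModule):
    def __init__(self, encoder_out_dim: int, lr: float = 1e-3):
        super().__init__()
        # records the __init__ arguments in self.hparams and in the checkpoint,
        # so load_from_checkpoint can rebuild the model without re-passing them
        self.save_hyperparameters()
        self.head = nn.Linear(encoder_out_dim, 2)  # placeholder layer

# with save_hyperparameters(), this works as-is:
model = LMModel.load_from_checkpoint("path/to/checkpoint.ckpt")

# without it, the init args have to be supplied again:
model = LMModel.load_from_checkpoint("path/to/checkpoint.ckpt", encoder_out_dim=128)
```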

also note that if you pass an nn.Module into your LM and call self.save_hyperparameters(), it will be saved inside your hparams as well. That's undesirable, since nn.Modules are already stored in the checkpoint's state_dict, and duplicating them in hparams can create issues for you. Ideally, you should exclude them using self.save_hyperparameters(ignore=['encoder']). Check out this PR: #12068
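
Roughly, that looks like the sketch below (my_encoder is a stand-in for whatever module you actually construct):

```python
import pytorch_lightning as pl
import torch.nn as nn

class LMModel(pl.LightningModule):
    def __init__(self, encoder: nn.Module, encoder_out_dim: int):
        super().__init__()
        # skip the nn.Module: its weights already live in the checkpoint's
        # state_dict, so duplicating it inside hparams can cause problems
        self.save_hyperparameters(ignore=['encoder'])
        self.encoder = encoder

# since 'encoder' was ignored, pass it back in at load time:
my_encoder = nn.Linear(512, 128)  # stand-in for your real encoder
model = LMModel.load_from_checkpoint("path/to/checkpoint.ckpt", encoder=my_encoder)
```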

Answer selected by rohitgr7