What does model checkpoint do under the hood? #12615
Unanswered
FeryET
asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 1 comment
-
Possibly yes, since it relies on optimizer.state_dict() to retrieve the states, and I believe that if a layer is frozen, the optimizer doesn't track any state for it, so that state isn't saved at all.
Optimizer states are also saved: https://pytorch-lightning.readthedocs.io/en/latest/common/checkpointing.html#checkpoint-contents
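A minimal plain-PyTorch sketch of the point above (not Lightning-specific): an optimizer only holds state for the parameters it was constructed with, so parameters that were frozen and excluded from the optimizer never show up in optimizer.state_dict(), and therefore can't end up in the checkpoint's optimizer section.

```python
import torch
from torch import nn

# Two-layer model; we "freeze" the first layer and give the optimizer
# only the trainable parameters, as is common when fine-tuning.
model = nn.Sequential(nn.Linear(4, 4), nn.Linear(4, 2))
for p in model[0].parameters():
    p.requires_grad = False
trainable = [p for p in model.parameters() if p.requires_grad]
opt = torch.optim.Adam(trainable)

# One training step so Adam populates its per-parameter state
# (exp_avg, exp_avg_sq) for every parameter it was given.
model(torch.randn(8, 4)).sum().backward()
opt.step()

# Only the 2 trainable tensors (layer-1 weight and bias) have state
# entries; the frozen layer-0 tensors are absent entirely.
print(len(opt.state_dict()["state"]))  # → 2
```

Whether this matters in practice depends on how the layers were frozen: if the frozen parameters were still passed to the optimizer (only requires_grad set to False), they may still appear in its param groups.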
-
Hi.
I'm trying to implement a GAN training loop, and after restoring the model from a checkpoint I'm seeing a complete discrepancy between the losses before and after the checkpoint. I suspect two things.
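A hypothetical plain-PyTorch sketch of one common cause of such a discrepancy (an assumption about this case, not a diagnosis): restoring the model weights but not the optimizer state resets Adam's moment estimates and step count, so the first post-restore updates diverge from what uninterrupted training would have produced. Restoring both makes the run continue exactly.

```python
import copy
import torch
from torch import nn

torch.manual_seed(0)
model = nn.Linear(2, 1)
opt = torch.optim.Adam(model.parameters(), lr=0.1)
x, y = torch.randn(16, 2), torch.randn(16, 1)

def step():
    # One deterministic training step on fixed data.
    opt.zero_grad()
    loss = ((model(x) - y) ** 2).mean()
    loss.backward()
    opt.step()

for _ in range(5):
    step()
ckpt = {"model": copy.deepcopy(model.state_dict()),
        "opt": copy.deepcopy(opt.state_dict())}

step()  # one more uninterrupted step: the reference trajectory
reference_w = copy.deepcopy(model.state_dict())

# Partial restore: weights only, fresh optimizer -> Adam moments lost.
model.load_state_dict(ckpt["model"])
opt = torch.optim.Adam(model.parameters(), lr=0.1)
step()
partial_w = copy.deepcopy(model.state_dict())

# Full restore: weights AND optimizer state -> trajectory is reproduced.
model.load_state_dict(ckpt["model"])
opt = torch.optim.Adam(model.parameters(), lr=0.1)
opt.load_state_dict(ckpt["opt"])
step()
full_w = copy.deepcopy(model.state_dict())

print(torch.allclose(full_w["weight"], reference_w["weight"]))    # full restore matches
print(torch.allclose(partial_w["weight"], reference_w["weight"])) # partial restore drifts
```

In Lightning terms, resuming via `trainer.fit(model, ckpt_path=...)` restores optimizer and scheduler states along with the weights, whereas `load_from_checkpoint` alone restores only the model, which for a GAN (two optimizers) can noticeably shift both losses.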