How to checkpoint submodules #7768
dtch1997 asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Unanswered · 1 comment · 3 replies
By default, the PyTorch Lightning Trainer saves a checkpoint of the full model. However, I would like to checkpoint individual submodules manually.

E.g. I might have an image classifier that predicts labels and an image caption generator that generates captions from images; both of them use the same CNN encoder. I might want to pre-train on the classification task (saving checkpoints) and then load only the trained encoder (minus the prediction head) for use in the downstream captioning task. Furthermore, I might have several backbones that I want to try out for the encoder.

What's the idiomatic way to do this in PyTorch Lightning?
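One way to handle the "load only the encoder" part of this is to load the full Trainer checkpoint yourself, filter its `state_dict` for the submodule's keys, and load those into the downstream model. A minimal sketch follows; the module classes, the `encoder` attribute name, and the `classifier.ckpt` path are assumptions for illustration, not an official Lightning recipe:

```python
import torch
from torch import nn
import pytorch_lightning as pl


# Hypothetical modules for illustration: both tasks share the same CNN encoder.
class CNNEncoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)

    def forward(self, x):
        return self.conv(x)


class ClassifierModule(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.encoder = CNNEncoder()      # shared backbone
        self.head = nn.LazyLinear(10)    # classification head (dropped downstream)


class CaptionModule(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.encoder = CNNEncoder()      # same backbone, to be warm-started
        self.decoder = nn.LSTM(16, 32)   # caption decoder (placeholder)


# 1) A Trainer checkpoint from the classification run stores the *whole*
#    LightningModule's weights under the "state_dict" key.
#    (On recent PyTorch versions you may need weights_only=False here.)
ckpt = torch.load("classifier.ckpt", map_location="cpu")

# 2) Keep only the encoder's entries and strip the "encoder." prefix so the
#    keys match the bare submodule's own state_dict.
prefix = "encoder."
encoder_state = {
    k[len(prefix):]: v
    for k, v in ckpt["state_dict"].items()
    if k.startswith(prefix)
}

# 3) Warm-start only the encoder of the downstream captioning model.
caption_model = CaptionModule()
caption_model.encoder.load_state_dict(encoder_state)
```

The key detail is that parameter names in the checkpoint's `state_dict` are prefixed by the attribute the submodule lives on (here `encoder.`), which is what makes the filtering possible.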
-
Hi, to do this, you probably have to provide your own ModelCheckpoint callback.
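As a rough sketch of that suggestion, one could use a plain `Callback` that writes out only a submodule's weights alongside (or instead of) the default `ModelCheckpoint`. The class name `SubmoduleCheckpoint`, the hook choice, and the `"encoder"` attribute are illustrative assumptions, not a prescribed Lightning API:

```python
import os

import torch
import pytorch_lightning as pl


class SubmoduleCheckpoint(pl.Callback):
    """Hypothetical callback that saves only one submodule's weights.

    `submodule_name` is the attribute name of the submodule on the
    LightningModule, e.g. "encoder".
    """

    def __init__(self, submodule_name: str, dirpath: str):
        self.submodule_name = submodule_name
        self.dirpath = dirpath

    def on_validation_end(self, trainer, pl_module):
        # Save the submodule's state_dict at the end of every validation run.
        os.makedirs(self.dirpath, exist_ok=True)
        submodule = getattr(pl_module, self.submodule_name)
        path = os.path.join(
            self.dirpath,
            f"{self.submodule_name}-epoch{trainer.current_epoch}.pt",
        )
        torch.save(submodule.state_dict(), path)


# Usage: pass it to the Trainer together with your usual checkpointing.
# trainer = pl.Trainer(callbacks=[SubmoduleCheckpoint("encoder", "checkpoints/")])
```

The saved file can then be loaded directly into the matching submodule of any other model with `load_state_dict`, since it contains only the encoder's keys with no prefix.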