Exclude specific nn.Parameter from checkpoint saving #16409
I have certain parameters registered as `nn.Parameter` (inside an `nn.Module` inside a `LightningModule`) that I would like excluded from the checkpoint. How can I accomplish this in Lightning? I realize I could avoid `nn.Parameter` and simply place the tensors on the correct device myself, but letting Lightning handle device placement is much easier in this context, and registering them also records their size for the model summary. Most of the time these are non-learnable parameters, but in some cases I want to choose whether or not learnable parameters are saved in the checkpoint.
Replies: 1 comment
I figured out how to accomplish this by deleting the Parameters in question from the state dict inside `LightningModule.on_save_checkpoint()`.
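A minimal sketch of that approach, using plain PyTorch to stand in for the Lightning internals: the `on_save_checkpoint(self, checkpoint)` hook receives the checkpoint dict, whose `"state_dict"` key holds every registered parameter and buffer, so entries popped there never reach disk. The module, the parameter name `scale`, and the standalone helper are hypothetical; in a real project the function body would live inside your `LightningModule`.

```python
import torch
from torch import nn

# Hypothetical module: "scale" is a non-learnable parameter we want kept
# on-device (so Lightning moves it with the model and reports it in the
# model summary) but excluded from checkpoints.
class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(4, 2)
        self.scale = nn.Parameter(torch.ones(2), requires_grad=False)

# Body of LightningModule.on_save_checkpoint(self, checkpoint), written as
# a standalone function for illustration. pop(..., None) tolerates keys
# that are absent (e.g. if the model architecture changes).
def on_save_checkpoint(checkpoint, excluded=("scale",)):
    for name in excluded:
        checkpoint["state_dict"].pop(name, None)

net = Net()
# Simplified stand-in for the dict Lightning builds before writing to disk.
ckpt = {"state_dict": net.state_dict()}
on_save_checkpoint(ckpt)
assert "scale" not in ckpt["state_dict"]
assert "layer.weight" in ckpt["state_dict"]
```

Note that loading such a checkpoint later will report `scale` as a missing key under strict loading; loading with `strict=False`, or restoring the value yourself in `on_load_checkpoint()`, sidesteps that.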