How to reset callbacks when calling "load_from_checkpoint" #13013
Unanswered
jannisborn asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 1 comment
Say I'm loading a pretrained model using load_from_checkpoint(ckpt_path). If I used callbacks during the pretraining, the callback values are restored as well (e.g., the minimal loss). This is very useful if one continues training on the same data, but it is problematic if the model is fine-tuned on another dataset.
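For context, this is roughly the setup I mean (a minimal sketch; MyModel, the dataloaders, and the checkpoint path are placeholders):

```python
import pytorch_lightning as pl
from pytorch_lightning.callbacks import EarlyStopping, ModelCheckpoint

# MyModel is a placeholder LightningModule; the loaders are placeholder DataLoaders.

# Pretraining: the callbacks accumulate state, e.g. the best val_loss seen so far.
model = MyModel()
trainer = pl.Trainer(
    max_epochs=10,
    callbacks=[
        EarlyStopping(monitor="val_loss"),
        ModelCheckpoint(monitor="val_loss", save_top_k=1),
    ],
)
trainer.fit(model, train_dataloaders=pretrain_loader, val_dataloaders=pretrain_val_loader)

# Fine-tuning on a different dataset: load the pretrained weights from the checkpoint.
# As far as I can tell, the callback values from pretraining (e.g. the minimal loss)
# come back as well, which is exactly what I would like to reset here.
model = MyModel.load_from_checkpoint("path/to/pretrained.ckpt")
```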
I'm looking for an option to reset all callback values. Is this supported?
I know that I can overwrite hyperparameters when loading (the docs are pretty verbose about it), but that's not what I want.
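To be clear, by overwriting hyperparameters I mean something like the following, where learning_rate is just a hypothetical example of a saved hparam:

```python
# Extra keyword args passed to load_from_checkpoint override the stored
# hyperparameters (assuming they were saved via self.save_hyperparameters()).
model = MyModel.load_from_checkpoint(ckpt_path, learning_rate=1e-4)
```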
Thanks for any advice!