Placing a condition on ModelCheckpoint Callback #13337
Asked by skrish13 in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
For example: how can we save checkpoints ONLY when the validation-set accuracy is greater than some specific threshold? If this is not possible using the parameters of the ModelCheckpoint callback, how can we extend ModelCheckpoint to achieve this?
Answered by rohitgr7 on Jun 21, 2022:
you can do:

```python
from pytorch_lightning import Trainer
from pytorch_lightning.callbacks import ModelCheckpoint

some_threshold = 0.9  # example: whatever accuracy you consider "good enough"


class CustomModelCheckpoint(ModelCheckpoint):
    def on_validation_end(self, trainer, pl_module):
        # Only let the parent class run its usual checkpointing logic
        # once the monitored metric clears the threshold.
        score = trainer.callback_metrics[self.monitor]
        if score > some_threshold:
            super().on_validation_end(trainer, pl_module)


# Pass your usual ModelCheckpoint arguments (monitor=..., mode=..., etc.).
trainer = Trainer(callbacks=[CustomModelCheckpoint(...)])
```
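Note that this only works if the metric named by `monitor` actually shows up in `trainer.callback_metrics`, which happens when the LightningModule logs it with `self.log`. Below is a minimal sketch of that wiring, assuming an accuracy metric logged under the illustrative key `"val_acc"` and a hypothetical `_compute_accuracy` helper; the `mode="max"` and `save_top_k=1` arguments are just example choices:

```python
from pytorch_lightning import LightningModule, Trainer


class LitClassifier(LightningModule):
    # ... __init__, forward, training_step, configure_optimizers, etc. ...

    def validation_step(self, batch, batch_idx):
        acc = self._compute_accuracy(batch)  # hypothetical helper
        # Values logged here are collected into trainer.callback_metrics,
        # which is exactly what the overridden on_validation_end reads above.
        self.log("val_acc", acc, prog_bar=True)


# CustomModelCheckpoint is the subclass defined in the answer above.
checkpoint_cb = CustomModelCheckpoint(monitor="val_acc", mode="max", save_top_k=1)
trainer = Trainer(callbacks=[checkpoint_cb])
trainer.fit(LitClassifier())
```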
Answer selected by skrish13