Access best checkpoint when using early stopping #14931
Answered by celsofranssa
vikigenius asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
-
I use the early stopping callback like this:

```python
trainer = pl.Trainer(
    strategy=strategy,
    accelerator=accelerator,
    devices=devices,
    deterministic=True,
    max_epochs=max_epochs,
    callbacks=[EarlyStopping(monitor="rougeL_fmeasure", mode="max", patience=5)],
)
```

But this just creates a single checkpoint, like this: [screenshot omitted]
Answered by celsofranssa on Oct 8, 2022
Replies: 1 comment
-
Hello @vikigenius,

I usually specify how my PL model checkpoints, as shown in the code snippet below:

```python
trainer = pl.Trainer(
    [...],  # remaining trainer params
    callbacks=[
        self.get_model_checkpoint_callback(),  # checkpoint callback
        self.get_early_stopping_callback(),    # early-stopping callback
        [...],  # other callbacks
    ],
    deterministic=True,
)

def get_model_checkpoint_callback(self):
    return ModelCheckpoint(
        monitor=self.params.val_metric,
        dirpath=self.params.model_checkpoint.dir,
        filename=f"{self.params.model.name}_{self.params.data.name}_{self.params.fold}",
        save_top_k=self.params.save_top_k,  # keep the best k models according to the monitor
        save_weights_only=self.params.save_weights_only,
        mode=self.params.mode,
    )

def get_early_stopping_callback(self):
    return EarlyStopping(
        monitor=self.params.val_metric,
        patience=self.params.trainer.patience,
        min_delta=self.params.trainer.min_delta,
        mode=self.params.mode,
    )
```

I hope this helps.
Answer selected by vikigenius