load_from_checkpoint has different validation results #13482
Unanswered
lanlanlan3 asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 1 comment 6 replies
- Try: trainer.validate(model, ckpt_path=ckpt_path)
- When the checkpoint is saved, it records metrics like val_acc=0.77, but after load_from_checkpoint and trainer.validate(model) I get val_acc=0.02.
code:
from pytorch_lightning import Trainer
from main_pl import LitModel

ckpt_path = '...val_acc=0.77.ckpt'
model = LitModel.load_from_checkpoint(ckpt_path)
# model.eval()  # also tried
trainer = Trainer(gpus=-1)
# also tried: trainer = Trainer(gpus=-1, resume_from_checkpoint=ckpt_path)
trainer.validate(model)  # val_acc=0.02
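A drop as large as 0.77 to 0.02 often means some weights never reached the model. Independent of Lightning, the return value of load_state_dict can be inspected for silently skipped parameters; a minimal torch sketch with a hypothetical single layer (an illustration only, not the asker's model):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
saved = nn.Linear(3, 3)
# Simulate a checkpoint where one key is missing (e.g. renamed or filtered out):
ckpt = {"weight": saved.weight.detach().clone()}

restored = nn.Linear(3, 3)
result = restored.load_state_dict(ckpt, strict=False)
# strict=False loads what it can and reports what it skipped;
# 'bias' stays at its random initialization here.
print(result.missing_keys)    # ['bias']
print(result.unexpected_keys) # []
```

With strict=True (the default in load_from_checkpoint) a mismatch raises an error instead, so this check matters mainly when strict loading has been disabled somewhere.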