How to Properly Run Inference with a Fine-Tuned Model in PaddlePaddle 3.0? #15380
-
Hi everyone! I'm currently working with PaddlePaddle 3.0 and have fine-tuned a model (PP-OCRv3) on my custom dataset. The evaluation accuracy during training is good (~99%), but I'm facing issues when running inference: the model doesn't predict correctly, and the results are inconsistent with those on the validation set. Could anyone point me to a reliable tutorial or example focused specifically on using PaddlePaddle 3.0 for inference (especially with the inference.json / IR model formats)? Thanks in advance!
Replies: 1 comment
-
This issue is unlikely to be related to PaddlePaddle 3.0.0 itself. First make sure the model you exported is correct, i.e. that the trained dynamic-graph weights and the character dictionary were loaded correctly during export. Could you provide a more detailed record of the steps you ran?