[Urgent] Model exported by export_model.py cannot be used in cpp_infer, very confusing #12975
Unanswered
hanliangwei asked this question in Q&A
I built cpp_infer with CMake on Windows and downloaded the inference model shown in the screenshot above; it works correctly.

I then downloaded the trained model shown in the screenshot above and exported it directly with:

python tools/export_model.py -c configs/rec/PP-OCRv3/ch_PP-OCRv3_rec_distillation.yml -o Global.pretrained_model=./ckpt/best_accuracy Global.save_inference_dir=./inference_model/ch_PP-OCRv3_rec/

The exported model runs fine for inference in Python, but C++ inference crashes inside the recognition interface of paddle_inference.dll. Here too the .pdmodel file is noticeably small. How should I deal with this?
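To isolate where it crashes, a minimal load-and-run check against the Paddle Inference C++ API, stripped of all of cpp_infer's pre/post-processing, can show whether the failure happens at predictor creation or at the first Run(). This is only a sketch: the model paths mirror the export command above and may need adjusting (distillation configs can export into Student/ and Teacher/ subdirectories), and the 1x3x48x320 dummy input is the usual PP-OCRv3 recognition shape, not something confirmed in this thread.

#include <functional>
#include <iostream>
#include <numeric>
#include <vector>

#include "paddle_inference_api.h"

int main() {
  // Assumed paths; adjust to wherever export_model.py actually wrote the files.
  paddle_infer::Config config;
  config.SetModel("./inference_model/ch_PP-OCRv3_rec/inference.pdmodel",
                  "./inference_model/ch_PP-OCRv3_rec/inference.pdiparams");
  config.DisableGpu();  // run on CPU first to rule out CUDA/TensorRT issues

  auto predictor = paddle_infer::CreatePredictor(config);
  if (!predictor) {
    std::cerr << "predictor creation failed" << std::endl;
    return 1;
  }

  // Feed one dummy image; 1x3x48x320 is the usual PP-OCRv3 rec input shape.
  auto input = predictor->GetInputHandle(predictor->GetInputNames()[0]);
  input->Reshape({1, 3, 48, 320});
  std::vector<float> data(1 * 3 * 48 * 320, 0.5f);
  input->CopyFromCpu(data.data());

  if (!predictor->Run()) {
    std::cerr << "Run() failed" << std::endl;
    return 1;
  }

  auto output = predictor->GetOutputHandle(predictor->GetOutputNames()[0]);
  auto shape = output->shape();
  std::cout << "OK, output elements: "
            << std::accumulate(shape.begin(), shape.end(), 1LL,
                               std::multiplies<long long>())
            << std::endl;
  return 0;
}

If this standalone check also crashes, the exported files themselves would be suspect (the unusually small .pdmodel points that way); if it passes, the problem is more likely in cpp_infer's pre/post-processing or the recognizer configuration.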
Replies: 2 comments

- paddlepaddle-gpu 2.6.0.post116

- I also tried the conversion on Ubuntu; the exported .pdmodel is still rather small.