I am using onnxruntime to run inference on a float16 ONNX model and the following error occurs. How should I solve it? [Build] #17210
Unanswered
1615070057 asked this question in General
Replies: 2 comments · 9 replies
- It doesn't look like a build issue. Please use the ONNX model checker to validate your model first.
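For reference, a minimal sketch of what that check could look like, assuming the converted file is named model_fp16.onnx as in the build script below; ONNX shape inference applies the same per-op checks that fail when onnxruntime loads the model, so the Resize error should surface here as well:

```python
import onnx

# Load the converted FP16 model and run the structural checker.
model = onnx.load("model_fp16.onnx")
onnx.checker.check_model(model)

# Shape inference should reproduce the Resize error reported at load time.
onnx.shape_inference.infer_shapes(model)
```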
- I have the same issue. Inference with the FP32 model works fine, but inference with the FP16 model reports this error.
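A minimal sketch of that comparison, assuming the two model files from the build script below and a hypothetical single input named "input" with shape (1, 3, 224, 224):

```python
import numpy as np
import onnxruntime as ort

x = np.random.rand(1, 3, 224, 224).astype(np.float32)  # hypothetical input shape

# The FP32 model loads and runs without problems.
sess_fp32 = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
sess_fp32.run(None, {"input": x})

# Creating a session for the FP16 model is where the
# ShapeInferenceError on the Resize node is raised.
sess_fp16 = ort.InferenceSession("model_fp16.onnx", providers=["CPUExecutionProvider"])
sess_fp16.run(None, {"input": x.astype(np.float16)})
```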
Describe the issue
I am using onnxruntime to run inference on a float16 ONNX model and the following error occurs. How should I solve it? [Build]
Error:
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from model_fp16.onnx failed:Node (Resize__139) Op (Resize) [ShapeInferenceError] Either `sizes` or `scales` must be provided, but not both of them
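As the message says, a Resize node must be given either its `scales` input or its `sizes` input, but not both, and the conversion evidently leaves a node violating this, since the FP32 model loads fine. A small sketch for inspecting the Resize nodes in the converted model (assuming the file name from the build script below):

```python
import onnx

model = onnx.load("model_fp16.onnx")
for node in model.graph.node:
    if node.op_type == "Resize":
        # Resize inputs are X, roi, scales, sizes; an empty string means an
        # optional input is not supplied. Exactly one of scales/sizes should
        # be non-empty.
        print(node.name, list(node.input))
```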
Urgency
The project is urgent; the deadline is August 23.
Target platform
linux 64
Build script
```python
import onnx
from onnxconverter_common import float16

# Load the FP32 model, convert its float tensors to float16, and save it.
model = onnx.load("model.onnx")
model_fp16 = float16.convert_float_to_float16(model)
onnx.save(model_fp16, "model_fp16.onnx", size_threshold=2048)
```
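One variation worth trying (a sketch, not a confirmed fix): recent versions of onnxconverter_common accept keep_io_types and op_block_list arguments, which can be used to leave the Resize nodes in FP32 so their `scales`/`sizes` inputs are not rewritten by the conversion:

```python
import onnx
from onnxconverter_common import float16

model = onnx.load("model.onnx")

# Keep graph inputs/outputs as FP32 and skip Resize nodes entirely,
# so their optional scales/sizes inputs are left untouched.
model_fp16 = float16.convert_float_to_float16(
    model,
    keep_io_types=True,
    op_block_list=["Resize"],
)
onnx.save(model_fp16, "model_fp16.onnx")
```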
Error / output
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from model_fp16.onnx failed:Node (Resize__139) Op (Resize) [ShapeInferenceError] Either `sizes` or `scales` must be provided, but not both of them

Visual Studio Version
No response
GCC / Compiler Version
No response