Replies: 1 comment
-
Hello, you can use the OpenVINO inference script to run inference with an ONNX model. The script can be found here: https://github.com/openvinotoolkit/anomalib/blob/main/tools/inference/openvino_inference.py
-
What is the motivation for this task?
I have successfully reviewed the code and exported the model to ONNX. Now I want to run inference with this ONNX model.
Describe the solution you'd like
Is there a command to run inference on a model in ONNX format?
Additional context
No response