Fast Yolov3 CPU Inference Using Onnxruntime #6521
matankley
started this conversation in
Show & Tell
Replies: 2 comments 1 reply
-
Thank you for sharing! 👍 Was this primarily for research purposes, or are you using this solution in some production scenarios?
1 reply
-
Is there any comparison for YOLOv7 ONNX models?
0 replies
-
Hi,
I would like to thank you and share a few insights I observed while testing inference of a YOLOv3 object detection model using ONNX Runtime on CPU.
ONNX Runtime inference is significantly faster than OpenCV DNN and Darknet, the backends commonly used for YOLOv3 models.
Here is a link to the full article: https://medium.com/towards-artificial-intelligence/yolov3-cpu-inference-performance-comparison-onnx-opencv-darknet-6764f2bde33e
Hope this can help someone and show the value we can get from ONNX Runtime.
Thanks!