Does mmdeploy support ONNX model quantization (ONNX model with int8 mode)? #2489
Unanswered · XiaohuJoshua asked this question in Q&A
How can I implement int8 quantization for an ONNX model? Thanks.