Support for Execution Providers - QNN #660
Replies: 6 comments 6 replies
-
Hi @adhurwit, we are working on this! Stay tuned.
-
Moving into Discussion.
-
Hi there. Do you have an ETA for the QNN EP? Thanks!
-
Is there anything I can do to help? I can't even find instructions on how to build genai with QNN support enabled. I already have onnxruntime built locally for QNN.
-
There was a branch named phi3.5_qnn_test / android.
-
Is there any update on this? Are there plans to support context binaries/split models for HTP, to work around the HTP RPC memory size limitations?
-
To make use of Qualcomm's NPU, we need to use ONNX Runtime's QNN execution provider (EP). There does not appear to be a way to specify that right now. Please add support for Qualcomm NPUs.
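For context, plain ONNX Runtime (not onnxruntime-genai) already lets you pin a session to the QNN EP via the `providers` argument; something equivalent is what this request is asking for in the genai API. A minimal sketch, assuming a hypothetical model path, with `QnnHtp.dll` as the Windows HTP backend library (Android/Linux builds ship `libQnnHtp.so` instead):

```python
# Sketch of selecting the QNN EP with the plain onnxruntime Python API.
# PROVIDERS/PROVIDER_OPTIONS values follow ONNX Runtime's QNN EP docs;
# the model path below is a placeholder, not a real file.
PROVIDERS = ["QNNExecutionProvider"]
PROVIDER_OPTIONS = [{"backend_path": "QnnHtp.dll"}]  # route inference to the HTP (NPU)


def make_qnn_session(model_path):
    """Create an InferenceSession pinned to the QNN EP.

    Returns None when onnxruntime (built with QNN support) is not
    installed, so the sketch stays importable anywhere.
    """
    try:
        import onnxruntime as ort
    except ImportError:
        return None  # no onnxruntime in this environment
    return ort.InferenceSession(
        model_path,
        providers=PROVIDERS,
        provider_options=PROVIDER_OPTIONS,
    )
```

On a device without a QNN-enabled onnxruntime build, session creation will fail or silently fall back, so checking `ort.get_available_providers()` first is a reasonable guard.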