Replies: 2 comments
-
GPU support through the ORT QNN-EP was enabled in preview mode at Build. This is a very initial enablement, and we expect more workloads to be enabled through the GPU in the coming months. Once enabled, it should work across Android and Windows.
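For readers wondering what selecting the QNN GPU backend might look like in practice, here is a minimal sketch using ONNX Runtime's QNNExecutionProvider `backend_path` option. The backend library names (`QnnGpu.dll` on Windows, `libQnnGpu.so` elsewhere) come from the Qualcomm QNN SDK and are assumptions here; since GPU support is preview-only, a CPU fallback is included.

```python
import platform

def qnn_gpu_providers():
    # Pick the QNN GPU backend library for the current OS.
    # Library names are assumptions based on the QNN SDK naming convention.
    backend = "QnnGpu.dll" if platform.system() == "Windows" else "libQnnGpu.so"
    return [
        ("QNNExecutionProvider", {"backend_path": backend}),
        "CPUExecutionProvider",  # fallback if the QNN GPU backend is unavailable
    ]

# Usage (requires the onnxruntime-qnn package and a device with the QNN GPU backend):
# import onnxruntime as ort
# session = ort.InferenceSession("model.onnx", providers=qnn_gpu_providers())
```

Passing the provider list in priority order lets ONNX Runtime fall back to the CPU provider for any node the QNN GPU backend cannot handle.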
-
To add some more information to Ashish's response,
-
Hi ONNX Runtime team,
I have devices with Snapdragon 8 Gen 2 and Gen 3 SoCs, which both feature Adreno GPUs.
I would like to ask whether ONNX Runtime supports Adreno GPU acceleration for model inference on Android, Windows, and Linux.
Any insights or an official stance on each of these platforms would be greatly appreciated.
Thank you in advance!
Best regards,