c_cxx/accuracy_tool/README.md
This tool measures the accuracy of a set of models on a given execution provider (EP). Accuracy is computed by comparing against expected results, which are either loaded from a file or obtained by running the model with the CPU execution provider.
## Build instructions on Windows
Run the following commands in a terminal to generate a Visual Studio project and compile the tool. Make sure to specify the location of your ONNX Runtime installation. You can either [download an ONNX Runtime release package](https://github.com/microsoft/onnxruntime/releases/) or you can [build ONNX Runtime from source](https://www.onnxruntime.ai/docs/build/).
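As a sketch of the generate-and-build step (the `ONNXRUNTIME_ROOTDIR` cache variable name and the Visual Studio generator version are assumptions, not taken from this README — check the tool's `CMakeLists.txt` for the option it actually expects):

```shell
# Generate a Visual Studio solution, pointing CMake at the ONNX Runtime
# installation. ONNXRUNTIME_ROOTDIR is a hypothetical variable name here.
cmake -S . -B build -G "Visual Studio 17 2022" -A x64 ^
      -DONNXRUNTIME_ROOTDIR=<ORT_INSTALL_DIR>

# Compile the Release configuration.
cmake --build build --config Release
```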
### Using an ONNX Runtime NuGet package
Download an ONNX Runtime NuGet package with the desired execution provider(s):
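For example, using the NuGet CLI (`Microsoft.ML.OnnxRuntime` is the CPU package; EP-specific packages follow the same naming pattern — pick the variant that bundles the execution provider you want to test):

```shell
# Download the ONNX Runtime NuGet package into a local "packages" folder.
nuget install Microsoft.ML.OnnxRuntime -OutputDirectory packages
```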
Run the following command to open the solution file with Visual Studio.
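A minimal sketch (the solution file name is an assumption — use whichever `.sln` file CMake actually produced in `.\build`):

```shell
# Open the generated solution in Visual Studio.
start build\accuracy_test.sln
```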
Alternatively, you can directly run the executable from the terminal:
```
.\build\Release\accuracy_test.exe --help
```
### Building with QNN execution provider
To test model accuracy with the QNN execution provider, provide the path to your Qualcomm AI Engine Direct SDK (QNN SDK) installation and, optionally, the Hexagon architecture version (e.g., 68 or 73).
The QNN SDK can be downloaded from https://qpm.qualcomm.com/main/tools/details/qualcomm_ai_engine_direct.
Providing the QNN SDK path will ensure that the appropriate QNN SDK dynamic libraries (e.g., QnnHtp.dll) are automatically copied to the build directory.
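A hedged sketch of passing the SDK location through CMake (the `QNN_SDK_ROOT` and `QNN_HEXAGON_ARCH` variable names are assumptions, not from this README — consult the tool's `CMakeLists.txt` for the real option names):

```shell
# Regenerate the project with the QNN SDK path and, optionally, the Hexagon
# architecture version. Variable names here are hypothetical placeholders.
cmake -S . -B build -G "Visual Studio 17 2022" -A x64 ^
      -DQNN_SDK_ROOT=<QNN_SDK_DIR> ^
      -DQNN_HEXAGON_ARCH=73
```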
### Using an ONNX Runtime source build
#### Build ONNX Runtime from source
Refer to the documentation for [building ONNX Runtime from source](https://www.onnxruntime.ai/docs/build/) with the desired execution providers.
The following commands build ONNX Runtime from source with the CPU EP.
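A sketch of a default CPU-only build (flags shown are standard `build.bat` options; adjust the configuration to your needs):

```shell
# Clone ONNX Runtime and build it with the default CPU EP.
git clone --recursive https://github.com/microsoft/onnxruntime.git
cd onnxruntime
.\build.bat --config Release --build_shared_lib --parallel
```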
Download an ONNX Runtime release package from https://github.com/microsoft/onnxruntime/releases/ and extract it to your desired installation directory (`<ORT_INSTALL_DIR>`).
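For example (the version and archive name below are illustrative — pick the release and platform you actually need from the releases page):

```shell
# Download and extract a released ONNX Runtime package.
curl -L -o onnxruntime-win-x64-1.17.0.zip ^
     https://github.com/microsoft/onnxruntime/releases/download/v1.17.0/onnxruntime-win-x64-1.17.0.zip
tar -xf onnxruntime-win-x64-1.17.0.zip -C <ORT_INSTALL_DIR>
```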
Clone this onnxruntime-inference-examples repository:
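For example (the repository URL is the public onnxruntime-inference-examples repo; the tool lives under `c_cxx/accuracy_tool`):

```shell
git clone https://github.com/microsoft/onnxruntime-inference-examples.git
cd onnxruntime-inference-examples\c_cxx\accuracy_tool
```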