The code will display a sample inference result similar to the image below:

Upon closing the displayed image, the script will output the computation time:
```output
PS C:\Users\db\onnx> py -V:3.13 .\main.py
Computation time: 95.854 ms
PS C:\Users\db\onnx> py -V:3.13 .\main.py
Computation time: 111.230 ms
```
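The timing shown above reflects a simple wall-clock measurement around the inference call. A minimal sketch of that pattern using `time.perf_counter` is below; the `run_inference` function is a stand-in placeholder, not the actual ONNX Runtime session call from the script:

```python
import time

def run_inference():
    # Placeholder for the real ONNX Runtime call, e.g. session.run(...)
    # on the MNIST model; here we just burn some CPU cycles.
    return sum(i * i for i in range(100_000))

start = time.perf_counter()
result = run_inference()
elapsed_ms = (time.perf_counter() - start) * 1000.0

# Matches the "Computation time: NNN.NNN ms" format seen in the output above.
print(f"Computation time: {elapsed_ms:.3f} ms")
```

Using `time.perf_counter` rather than `time.time` avoids clock-adjustment artifacts and gives the highest-resolution timer available for measuring short intervals.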
To compare results with Windows Arm 64, repeat the steps below using the Arm-64 Python architecture. Activate the Arm64 virtual environment and install packages:
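A minimal sketch of the setup in PowerShell, assuming an Arm64 Python 3.13 install registered with the launcher under the tag `3.13-arm64` and a virtual environment directory named `venv-arm64` (both names are illustrative assumptions, not taken from this guide):

```shell
# Create a virtual environment with the Arm64 Python interpreter (illustrative tag)
py -V:3.13-arm64 -m venv venv-arm64

# Activate it in PowerShell
.\venv-arm64\Scripts\Activate.ps1

# Install the packages used earlier in this learning path
pip install onnxruntime numpy opencv-python
```

Run `py --list` to see the exact tags registered on your machine, as the Arm64 tag may differ between installations.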
Note: The above Arm64 commands will function properly once ONNX Runtime becomes available for Windows Arm 64.

## Summary

In this learning path, you’ve learned how to use ONNX Runtime to perform inference on the MNIST dataset. You prepared your environment, implemented the necessary Python code, and measured the performance of your inference tasks.