
Commit f3dada6

Update inference.md
1 parent b59b1cf commit f3dada6

File tree

1 file changed: +16 -10 lines changed
  • content/learning-paths/laptops-and-desktops/win_python_onnx


content/learning-paths/laptops-and-desktops/win_python_onnx/inference.md

Lines changed: 16 additions & 10 deletions
@@ -11,41 +11,47 @@ layout: "learningpathall"
 You will now use the implemented code to run inference.
 
 ## Packages
-Start by activating the virtual environment and installing Python packages:
+Start by activating the virtual environment and installing the necessary Python packages. Activate the virtual environment:
+
 ```console
 venv-x64\Scripts\activate.bat
 ```
 
-Then, type:
+Then install required packages:
 ```console
 py -V:3.13 -m pip install onnxruntime numpy matplotlib wget torchvision torch
 ```
 
 ## Running Inference
-To run the inference type the following
+To perform inference, run the following command:
+
 ```console
 py -V:3.13 .\main.py
 ```
 
-The code will display the sample inference result like shown below:
-
-If you close the image, you will see the computation time:
+The code will display a sample inference result similar to the image below:
+![fig1](figures/01.png)
+Upon closing the displayed image, the script will output the computation time:
 ```output
 PS C:\Users\db\onnx> py -V:3.13 .\main.py
 Computation time: 95.854 ms
 PS C:\Users\db\onnx> py -V:3.13 .\main.py
 Computation time: 111.230 ms
 ```
-![fig1](figures/01.png)
 
-To compare these with Windows Arm 64, repeat the following for the Arm-64 Python architecture:
+
+To compare results with Windows Arm 64, repeat the steps below using the Arm-64 Python architecture. Activate the Arm64 virtual environment and install packages:
 ```console
 venv-arm64\Scripts\activate.bat
 py -V:3.13-arm64 -m pip install onnxruntime numpy matplotlib wget torchvision torch
+```
+
+Run inference using Arm64:
+```console
 py -V:3.13-arm64 main.py
 ```
 
-Note the above will work, when the onnx runtime will become available for Windows on Arm 64.
+Note: The above Arm64 commands will function properly once ONNX Runtime becomes available for Windows Arm 64.
 
 ## Summary
-
+In this learning path, you’ve learned how to use ONNX Runtime to perform inference on the MNIST dataset. You prepared your environment, implemented the necessary Python code, and measured the performance of your inference tasks.
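
The commands in this diff invoke `main.py`, which is implemented in an earlier step of the learning path and is not part of this commit. As a rough sketch only, assuming the model was saved as `mnist.onnx` and expects a single (1, 1, 28, 28) float32 input, the script's inference and timing logic might look roughly like this; the file name, input shape, and single-run timing are illustrative assumptions, not taken from the actual source:

```python
# Hypothetical sketch (not part of this commit): classify one MNIST test digit
# with ONNX Runtime, show the sample with matplotlib, and report the run time.
import time

import numpy as np
import onnxruntime as ort
import matplotlib.pyplot as plt
from torchvision import datasets, transforms

MODEL_PATH = "mnist.onnx"  # assumed model file created earlier in the learning path

# Load a single test image from MNIST (downloaded on first use).
test_set = datasets.MNIST(root="data", train=False, download=True,
                          transform=transforms.ToTensor())
image, label = test_set[0]
x = image.unsqueeze(0).numpy().astype(np.float32)  # (1, 1, 28, 28); assumed input layout

# Create an ONNX Runtime session and time a single run() call.
session = ort.InferenceSession(MODEL_PATH)
input_name = session.get_inputs()[0].name

start = time.perf_counter()
outputs = session.run(None, {input_name: x})
elapsed_ms = (time.perf_counter() - start) * 1000.0

predicted = int(np.argmax(outputs[0]))

# Show the digit and the prediction; the timing prints after the window is closed.
plt.imshow(image.squeeze(), cmap="gray")
plt.title(f"Predicted: {predicted} (label: {label})")
plt.show()

print(f"Computation time: {elapsed_ms:.3f} ms")
```

A sketch like this would run unchanged under the Arm64 interpreter (`py -V:3.13-arm64`) once an ONNX Runtime wheel is available for Windows on Arm.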
