## ONNX Installation on Azure Ubuntu Pro 24.04 LTS
To work with ONNX models on Azure, you will need a clean Python environment with the required packages. The following steps install Python, set up a virtual environment, and prepare for ONNX model execution using ONNX Runtime.
### Install Python and Virtual Environment:
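The exact commands for this step are not shown in this excerpt; a typical sequence on Ubuntu Pro 24.04, assuming the virtual-environment name `onnx-env` used later on this page:

```shell
# Install Python 3, pip, and the venv module
sudo apt update
sudo apt install -y python3 python3-pip python3-venv

# Create and activate an isolated environment for ONNX work
python3 -m venv onnx-env
source onnx-env/bin/activate
```

Using a virtual environment keeps the ONNX packages separate from the system Python that Ubuntu's own tooling depends on.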
### Install ONNX and Required Libraries:
Upgrade pip and install ONNX with its runtime and supporting libraries:
With this validation, you have confirmed that ONNX and ONNX Runtime are installed and ready on your Azure Cobalt 100 VM. This is the foundation for running inference workloads and serving ONNX models.
### Download and Validate ONNX Model - SqueezeNet:
SqueezeNet is a lightweight convolutional neural network (CNN) architecture designed to provide accuracy close to AlexNet while using 50x fewer parameters and a much smaller model size. This makes it well-suited for benchmarking ONNX Runtime.
After downloading the SqueezeNet ONNX model, the next step is to confirm that it is structurally valid and compliant with the ONNX specification. ONNX provides a built-in checker utility that verifies the graph, operators, and metadata.
Create a file named `validation.py` with the following code:
```python
import onnx
model = onnx.load("squeezenet-int8.onnx")
onnx.checker.check_model(model)
print("✅ Model is valid!")
```
Run the script:
```bash
python3 validation.py
```
You should see output similar to:
```output
✅ Model is valid!
```
With this validation, you have confirmed that the quantized SqueezeNet model is valid and ONNX-compliant. The next step is to run inference with ONNX Runtime and to benchmark performance.
ONNX installation and model validation are complete. You can now proceed with baseline testing.