## Build ONNX Runtime for Windows on Arm

Now that your environment is set up, you're ready to build the ONNX Runtime inference engine.

ONNX Runtime is an open-source engine that accelerates machine learning model inference, especially for models in the Open Neural Network Exchange (ONNX) format.

ONNX Runtime is optimized for high performance and low latency, and is widely used in production deployments.

{{% notice Learning Tip %}}
You can learn more about ONNX Runtime by reading the [ONNX Runtime Overview](https://onnxruntime.ai/).
{{% /notice %}}

### Clone the ONNX Runtime repository

Open a Developer Command Prompt for Visual Studio to set up the build environment. This includes the paths to the compiler, linker, utilities, and header files.
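
If you prefer to initialize the toolchain from an existing terminal rather than launching the Developer Command Prompt shortcut, the sketch below calls `VsDevCmd.bat` directly and then checks that the compiler and linker are on the PATH. The installation path shown assumes a Visual Studio 2022 Community install and may differ on your machine.

```cmd
:: Initialize the MSVC environment for an Arm64 build
:: (adjust the path to match your Visual Studio edition and install location)
"C:\Program Files\Microsoft Visual Studio\2022\Community\Common7\Tools\VsDevCmd.bat" -arch=arm64

:: Confirm the compiler and linker are available on the PATH
where cl
where link
```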
Then, create your workspace and clone the repository:
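
The exact directory layout is up to you; the following is a minimal sketch (the workspace path `C:\repos` is only an example) that clones the official repository together with its submodules:

```cmd
:: Create a workspace directory (example path) and switch into it
mkdir C:\repos
cd /d C:\repos

:: Clone ONNX Runtime; --recursive also fetches the required submodules
git clone --recursive https://github.com/microsoft/onnxruntime.git
cd onnxruntime
```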