Commit 389641d

Further updates
1 parent e28610c commit 389641d

File tree

1 file changed (+2 −2 lines changed)
  • content/learning-paths/laptops-and-desktops/win_on_arm_build_onnxruntime


content/learning-paths/laptops-and-desktops/win_on_arm_build_onnxruntime/1-dev-env-setup.md

Lines changed: 2 additions & 2 deletions

@@ -6,9 +6,9 @@ weight: 2
 layout: learningpathall
 ---
 
-## Set up your development environment
+## Overview
 
-In this learning path, you'll learn how to build and deploy an LLM on a Windows on Arm (WoA) laptop using ONNX Runtime for inference.
+In this Learning Path, you'll learn how to build and deploy a large language model (LLM) on a Windows on Arm (WoA) laptop using ONNX Runtime for inference.
 
 You'll first learn how to build ONNX Runtime and the ONNX Runtime Generate() API library, and then how to download the Phi-3 model and run inference. You'll run the short-context (4K) mini (3.3B) variant of the Phi-3 model. The short-context version accepts shorter (4K) prompts and produces shorter output text than the long-context (128K) version, and it consumes less memory.

0 commit comments
