Commit 1233b55

Updates
1 parent 19df8b6 commit 1233b55

File tree

1 file changed: 11 additions, 9 deletions

content/learning-paths/laptops-and-desktops/win_on_arm_build_onnxruntime/4-run-benchmark-on-WoA.md

Lines changed: 11 additions & 9 deletions
@@ -34,27 +34,29 @@ git lfs install
 ```
 If you don’t have winget, [download the installer manually](https://docs.github.com/en/repositories/working-with-files/managing-large-files/installing-git-large-file-storage?platform=windows).

-If the extension is already installed for you when you run the above ``git`` command it will say ``Git LFS initialized``.
+If Git LFS is already installed, you'll see ``Git LFS initialized``.

-You then need to install the ``HuggingFace CLI``
+### Install Hugging Face CLI
+
+You then need to install the ``HuggingFace CLI``:
 ``` bash
 pip install huggingface-hub[cli]
 ```

-### Download the Phi-3-Mini-4K model
+### Download the Phi-3-Mini (4K) model

 ``` bash
 cd C:\Users\%USERNAME%
 cd repos\lp
 huggingface-cli download microsoft/Phi-3-mini-4k-instruct-onnx --include cpu_and_mobile/cpu-int4-rtn-block-32-acc-level-4/* --local-dir .
 ```
-This command downloads the model into a folder called `cpu_and_mobile`.
+This command downloads the model into a folder named `cpu_and_mobile`.

-### Build model runner (ONNX Runtime GenAI C Example)
+### Build the Model Runner (ONNX Runtime GenAI C Example)

-In the previous section you built ONNX Runtime Generate() API from source. The headers and dynamic linked libraries that are built need to be copied over to appropriate folders (``lib`` and ``inclue``).
+In the previous step, you built the ONNX Runtime Generate() API from source. Now, copy the resulting headers and dynamically linked libraries into the appropriate folders (``lib`` and ``include``).

-Building from source is a better practice because the examples usually are updated to run with the latest changes.
+Building from source is better practice because the examples are usually updated to work with the latest changes:

 ``` bash
 copy onnxruntime\build\Windows\Release\Release\onnxruntime.* onnxruntime-genai\examples\c\lib
@@ -73,15 +75,15 @@ cd build
 cmake --build . --config Release
 ```

-After a successful build, a binary program called `phi3` will be created in the ''onnxruntime-genai'' folder:
+After a successful build, the binary `phi3` will be created in the `onnxruntime-genai` folder:

 ```output
 dir Release\phi3.exe
 ```

 #### Run the model

-Use the runner you just built to execute the model with the following commands:
+Execute the model using the following command:

 ``` bash
 cd C:\Users\%USERNAME%

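After the `huggingface-cli download` step shown in the diff above, it can help to confirm that model files actually landed under `cpu_and_mobile`. The sketch below is not part of the tutorial; the helper name and the assumption that the model ships as `.onnx` files are mine:

```python
from pathlib import Path

def find_onnx_models(model_root: str) -> list[str]:
    """Return the relative paths of all .onnx files found under model_root."""
    root = Path(model_root)
    return sorted(str(p.relative_to(root)) for p in root.rglob("*.onnx"))

# Hypothetical usage after the download step (path is an assumption):
# print(find_onnx_models(r"C:\Users\you\repos\lp\cpu_and_mobile"))
```

An empty result would suggest the `--include` pattern did not match anything and the download should be re-run.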
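The Windows `copy` step in the diff (staging the `onnxruntime.*` build outputs into the example's `lib` folder) can be sketched cross-platform in Python. This is a minimal sketch, not the tutorial's method; the commented-out paths mirror the tutorial but depend on your checkout layout:

```python
import glob
import shutil
from pathlib import Path

def copy_matching(pattern: str, dest_dir: str) -> list[str]:
    """Copy every file matching `pattern` into `dest_dir`,
    like the Windows `copy src.* destdir` command."""
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    copied = []
    for src in glob.glob(pattern):
        shutil.copy2(src, dest / Path(src).name)  # copy2 preserves timestamps
        copied.append(Path(src).name)
    return copied

# Hypothetical usage mirroring the tutorial's copy step (paths are assumptions):
# copy_matching("onnxruntime/build/Windows/Release/Release/onnxruntime.*",
#               "onnxruntime-genai/examples/c/lib")
```

Collecting the copied names makes it easy to fail fast if the glob matched nothing, which usually means the ONNX Runtime build did not complete.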
0 commit comments