Commit f5e3053

Update onnxwrapper samples README

1 parent 97045b2 commit f5e3053

7 files changed: +84 −68 lines

tools/onnxwrapper/README.md

Lines changed: 5 additions & 10 deletions

````diff
@@ -17,23 +17,18 @@ This tool provides a transparent runtime replacement that allows existing **ONNX
 - Rapid validation of QNN models generated from ONNX or downloaded from **AI Hub**.
 
 ## Usage Overview
-There are two usages for this tool:
-**Usage 1:**
-1. Copy `onnxexec.py` and `onnxwrapper.py` into your ONNX sample code directory.
-2. Run your original ONNX inference script via:
-
-```bash
-python onnxexec.py your_onnx_sample.py
-```
-**Usage 2:**
 Import the wrapper explicitly at the beginning of your script:
 
 ```python
 from qai_appbuilder import onnxwrapper
 ```
-
 No other changes to your ONNX inference code are required.
 
+```bash
+python your_onnx_sample.py
+```
+Note: if your ONNX sample takes command-line options, add them when running the command above.
+
 ## Technical Implementation
 
 ### Overall Architecture
````

tools/onnxwrapper/Samples/facemap_3dmm_onnx/README.md

Lines changed: 13 additions & 11 deletions

````diff
@@ -4,21 +4,23 @@
 This is onnx sample code for using QAI AppBuilder to load the facemap_3dmm QNN model to HTP and run 3D face reconstruction inference, generating parametric 3D face models from 2D facial images. facemap_3dmm_onnx_infer.py is the onnx sample code.
 
 ## Run the sample code
-
-```
 If you want to run the sample code with onnx models:
+```bash
 python prepare_facemap_3dmm_onnx_models.py
 python facemap_3dmm_onnx_infer.py --model models-onnx\facemap_3dmm-onnx-float\facemap_3dmm.onnx --image input.jpg
-
+```
 If you want to run the sample code with qnn models:
-python prepare_facemap_3dmm_qnn_models.py
-python onnxexec.py facemap_3dmm_onnx_infer.py --model models-qnn\facemap_3dmm.bin --image input.jpg
-
-You also can add the following code at beginning of facemap_3dmm_onnx_infer.py.
-from qai_appbuilder import onnxwrapper
-Then run the following command.
-python facemap_3dmm_onnx_infer.py --model models-qnn\facemap_3dmm.bin --image input.jpg
-
+1. Run the following command:
+```bash
+python prepare_facemap_3dmm_qnn_models.py
+```
+2. Add the following code at the beginning of facemap_3dmm_onnx_infer.py:
+```python
+from qai_appbuilder import onnxwrapper
+```
+3. Then run the following command:
+```bash
+python facemap_3dmm_onnx_infer.py --model models-qnn\facemap_3dmm.bin --image input.jpg
 ```
 ## Output
 You can see output.jpg in the out folder.
````

tools/onnxwrapper/Samples/real_esrgan_x4plus_onnx/README.md

Lines changed: 13 additions & 9 deletions

````diff
@@ -4,20 +4,24 @@
 This is onnx sample code for using AppBuilder to load the real_esrgan_x4plus QNN model to HTP and execute inference to generate an image.
 
 ## Run the sample code
-```
 If you want to run the sample code with onnx models:
+```bash
 python prepare_real_esrgan_x4plus_onnx_models.py
 python real_esrgan_x4plus_onnx_inference.py --model models-onnx\real_esrgan_x4plus-onnx-float\real_esrgan_x4plus.onnx
+```
 
 If you want to run the sample code with qnn models:
-python prepare_real_esrgan_x4plus_qnn_models.py
-python onnxexec.py real_esrgan_x4plus_onnx_inference.py --model models-qnn\real_esrgan_x4plus.bin --tile 512
-
-You also can add the following code at beginning of real_esrgan_x4plus_onnx_inference.py.
-from qai_appbuilder import onnxwrapper
-Then run the following command.
-python real_esrgan_x4plus_onnx_inference.py --model models-qnn\real_esrgan_x4plus.bin --tile 512
-
+1. Run the following command:
+```bash
+python prepare_real_esrgan_x4plus_qnn_models.py
+```
+2. Add the following code at the beginning of real_esrgan_x4plus_onnx_inference.py:
+```python
+from qai_appbuilder import onnxwrapper
+```
+3. Then run the following command:
+```bash
+python real_esrgan_x4plus_onnx_inference.py --model models-qnn\real_esrgan_x4plus.bin --tile 512
 ```
 ## Output
 The output image will be saved to output_x4.png
````

tools/onnxwrapper/Samples/stable_diffusion_1_5_onnx/README.md

Lines changed: 13 additions & 10 deletions

````diff
@@ -4,20 +4,23 @@
 This is sample code for using AppBuilder to load the Stable Diffusion 1.5 QNN models to HTP and execute inference to generate an image.
 
 ## Run the sample code
-```
 If you want to run the sample code with onnx models:
+```bash
 python prepeare_stable_diffusion_onnx_models.py
 python stable_diffusion_1_5_onnx_infer.py --model_root models-onnx\modularai_stable-diffusion-1-5-onnx --provider cpu --out sd15_out.png
-
+```
 If you want to run the sample code with qnn models:
-python prepeare_stable_diffusion_qnn_models.py
-python onnxexec.py stable_diffusion_1_5_onnx_infer.py --model_root models-qnn --vae_scale 1.0
-
-You also can add the following code at beginning of stable_diffusion_1_5_onnx_infer.py.
-from qai_appbuilder import onnxwrapper
-Then run the following command.
-python stable_diffusion_1_5_onnx_infer.py --model_root models-qnn --vae_scale 1.0
-
+1. Run the following command:
+```bash
+python prepeare_stable_diffusion_qnn_models.py
+```
+2. Add the following code at the beginning of stable_diffusion_1_5_onnx_infer.py:
+```python
+from qai_appbuilder import onnxwrapper
+```
+3. Then run the following command:
+```bash
+python stable_diffusion_1_5_onnx_infer.py --model_root models-qnn --vae_scale 1.0
 ```
 ## Output
 The output image will be saved to sd15_out.png
````

tools/onnxwrapper/Samples/stable_diffusion_2_1_onnx/README.md

Lines changed: 13 additions & 10 deletions

````diff
@@ -4,20 +4,23 @@
 This is sample code for using AppBuilder to load the Stable Diffusion 2.1 QNN models to HTP and execute inference to generate an image.
 
 ## Run the sample code
-```
 If you want to run the sample code with onnx models:
+```bash
 python prepeare_stable_diffusion_onnx_models.py
 python stable_diffusion_2_1_onnx_infer.py --model_root ./models-onnx/aislamov_stable-diffusion-2-1-base-onnx --provider dml
-
+```
 If you want to run the sample code with qnn models:
-python prepeare_stable_diffusion_qnn_models.py
-python onnxexec.py stable_diffusion_2_1_onnx_infer.py --model_root models-qnn --vae_scale 1.0
-
-You also can add the following code at beginning of stable_diffusion_2_1_onnx_infer.py.
-from qai_appbuilder import onnxwrapper
-Then run the following command.
-python stable_diffusion_2_1_onnx_infer.py --model_root models-qnn --vae_scale 1.0
-
+1. Run the following command:
+```bash
+python prepeare_stable_diffusion_qnn_models.py
+```
+2. Add the following code at the beginning of stable_diffusion_2_1_onnx_infer.py:
+```python
+from qai_appbuilder import onnxwrapper
+```
+3. Then run the following command:
+```bash
+python stable_diffusion_2_1_onnx_infer.py --model_root models-qnn --vae_scale 1.0
 ```
 ## Output
 The output image will be saved to sd21_out.png
````

tools/onnxwrapper/Samples/whisper_base_en_onnx/README.md

Lines changed: 13 additions & 9 deletions

````diff
@@ -4,20 +4,24 @@
 This is onnx sample code for using AppBuilder to load the whisper_base_en QNN model to HTP and execute inference.
 
 ## Run the sample code
-```
 If you want to run the sample code with onnx models:
+```bash
 python prepare_whisper_onnx_models.py
 python whisper_base_en_onnx_infer.py --audio_file jfk.wav --encoder_onnx models-onnx\base.en-encoder.onnx --decoder_onnx models-onnx\base.en-decoder.onnx --mel_filters mel_filters.npz
+```
 
 If you want to run the sample code with qnn models:
-python prepare_whisper_qnn_models.py
-python onnxexec.py whisper_base_en_onnx_infer.py --audio_file jfk.wav --encoder_onnx models-qnn\whisper_base_en-whisperencoder-snapdragon_x_elite.bin --decoder_onnx models-qnn\whisper_base_en-whisperdecoder-snapdragon_x_elite.bin --mel_filters mel_filters.npz
-
-You also can add the following code at beginning of whisper_base_en_onnx_infer.py.
-from qai_appbuilder import onnxwrapper
-Then run the following command.
-python whisper_base_en_onnx_infer.py --audio_file jfk.wav --encoder_onnx models-qnn\whisper_base_en-whisperencoder-snapdragon_x_elite.bin --decoder_onnx models-qnn\whisper_base_en-whisperdecoder-snapdragon_x_elite.bin --mel_filters mel_filters.npz
-
+1. Run the following command:
+```bash
+python prepare_whisper_qnn_models.py
+```
+2. Add the following code at the beginning of whisper_base_en_onnx_infer.py:
+```python
+from qai_appbuilder import onnxwrapper
+```
+3. Then run the following command:
+```bash
+python whisper_base_en_onnx_infer.py --audio_file jfk.wav --encoder_onnx models-qnn\whisper_base_en-whisperencoder-snapdragon_x_elite.bin --decoder_onnx models-qnn\whisper_base_en-whisperdecoder-snapdragon_x_elite.bin --mel_filters mel_filters.npz
 ```
 ## Output
 You can see "Transcription: And so my fellow Americans, ask not what your country can do for you, ask what you can do for your country." in the log.
````

tools/onnxwrapper/Samples/yolov8_det_onnx/README.md

Lines changed: 14 additions & 9 deletions

````diff
@@ -4,19 +4,24 @@
 This is onnx sample code for using AppBuilder to load the yolov8_det QNN model to HTP and execute inference, predicting bounding boxes and classes of objects in an image.
 
 ## Run the following command:
-
 If you want to run the sample code with onnx models:
+```bash
 python prepare_yolov8_det_onnx_models.py
 python yolov8_det_onnx_inference.py --model models-onnx\yolov8n.onnx
+```
 
 If you want to run the sample code with qnn models:
-python prepare_yolov8_det_qnn_models.py
-python onnxexec.py yolov8_det_onnx_inference.py --model models-qnn\yolov8_det.bin
-
-You also can add the following code at beginning of yolov8_det_onnx_inference.py.
-from qai_appbuilder import onnxwrapper
-Then run the following command.
-python yolov8_det_onnx_inference.py --model models-qnn\yolov8_det.bin
-
+1. Run the following command:
+```bash
+python prepare_yolov8_det_qnn_models.py
+```
+2. Add the following code at the beginning of yolov8_det_onnx_inference.py:
+```python
+from qai_appbuilder import onnxwrapper
+```
+3. Then run the following command:
+```bash
+python yolov8_det_onnx_inference.py --model models-qnn\yolov8_det.bin
+```
 ## Output
 The output image will be saved to output.jpg
````

0 commit comments