7 files changed: +84 −68 lines
lines changed Original file line number Diff line number Diff line change @@ -17,23 +17,18 @@ This tool provides a transparent runtime replacement that allows existing **ONNX
1717- Rapid validation of QNN models generated from ONNX or downloaded from ** AI Hub** .
1818
1919## Usage Overview
20- There are two usages for this tool:
21- ** Usage 1:**
22- 1 . Copy ` onnxexec.py ` and ` onnxwrapper.py ` into your ONNX sample code directory.
23- 2 . Run your original ONNX inference script via:
24-
25- ``` bash
26- python onnxexec.py your_onnx_sample.py
27- ```
28- ** Usage 2:**
2920Import the wrapper explicitly at the beginning of your script:
3021
3122``` python
3223from qai_appbuilder import onnxwrapper
3324```
34-
3525No other changes to your ONNX inference code are required.
3626
27+ ``` bash
28+ python your_onnx_sample.py
29+ ```
30+ Note: If your onnx sample needs some options, please add them when running above command.
31+
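The note above means the wrapper leaves your script's arguments untouched: options are forwarded to the sample exactly as typed. A self-contained sketch of that behavior (the stub script and its flags are invented for this demo; substitute your real sample script):

```python
import os
import subprocess
import sys
import tempfile

# Throwaway stand-in for "your_onnx_sample.py" that just echoes its options
# (invented for this demo; substitute your real sample script).
stub = os.path.join(tempfile.mkdtemp(), "your_onnx_sample.py")
with open(stub, "w") as f:
    f.write("import sys; print(' '.join(sys.argv[1:]))\n")

# Options are passed through to the sample script exactly as typed.
out = subprocess.run(
    [sys.executable, stub, "--model", "models-qnn/model.bin", "--image", "input.jpg"],
    capture_output=True, text=True,
).stdout.strip()
print(out)  # --model models-qnn/model.bin --image input.jpg
```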
3732## Technical Implementation
3833
3934### Overall Architecture
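Before the details, the core idea of a transparent runtime replacement can be sketched as an import-time substitution: the wrapper swaps in a session class that routes QNN `.bin` models to a different backend while leaving `.onnx` models on the original path. This is a conceptual illustration only, not the actual qai_appbuilder code; the module and class names below are invented for the demo:

```python
import sys
import types

# Stand-in for the real onnxruntime module (conceptual demo only).
onnxruntime = types.ModuleType("onnxruntime")

class InferenceSession:
    """Original CPU session (stand-in for onnxruntime.InferenceSession)."""
    def __init__(self, model_path, **kwargs):
        self.model_path = model_path
    def run(self, output_names, input_feed):
        return ["cpu-result"]

onnxruntime.InferenceSession = InferenceSession
sys.modules["onnxruntime"] = onnxruntime

# What an import-time wrapper can do: replace InferenceSession with a class
# that dispatches .bin (QNN) models to another backend and leaves .onnx alone.
_OriginalSession = onnxruntime.InferenceSession

class WrappedSession:
    def __init__(self, model_path, **kwargs):
        if str(model_path).endswith(".bin"):
            self._backend = "qnn-htp"        # would dispatch to the QNN runtime
        else:
            self._backend = "onnxruntime"
            self._orig = _OriginalSession(model_path, **kwargs)
    def run(self, output_names, input_feed):
        if self._backend == "qnn-htp":
            return ["qnn-result"]            # placeholder for HTP execution
        return self._orig.run(output_names, input_feed)

onnxruntime.InferenceSession = WrappedSession

# Existing inference code is unchanged: it still just creates an InferenceSession.
import onnxruntime as ort
sess = ort.InferenceSession("models-qnn/facemap_3dmm.bin")
print(sess.run(None, {}))  # ['qnn-result']
```

Because the substitution happens at import time, the sample script's own code never changes, which is why the single `import` line is sufficient.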
This is ONNX sample code for using QAI AppBuilder to load the facemap_3dmm QNN model to HTP and run 3D face reconstruction inference, generating parametric 3D face models from 2D facial images. facemap_3dmm_onnx_infer.py is the sample script.

## Run the sample code
If you want to run the sample code with onnx models:
```bash
python prepare_facemap_3dmm_onnx_models.py
python facemap_3dmm_onnx_infer.py --model models-onnx\facemap_3dmm-onnx-float\facemap_3dmm.onnx --image input.jpg
```
If you want to run the sample code with qnn models:
1. Run the following command:
```bash
python prepare_facemap_3dmm_qnn_models.py
```
2. Add the following code at the beginning of facemap_3dmm_onnx_infer.py:
```python
from qai_appbuilder import onnxwrapper
```
3. Then run the following command:
```bash
python facemap_3dmm_onnx_infer.py --model models-qnn\facemap_3dmm.bin --image input.jpg
```
## Output
You can see output.jpg in the out folder.
This is ONNX sample code for using AppBuilder to load the real_esrgan_x4plus QNN model to HTP and run inference to generate an image.

## Run the sample code
If you want to run the sample code with onnx models:
```bash
python prepare_real_esrgan_x4plus_onnx_models.py
python real_esrgan_x4plus_onnx_inference.py --model models-onnx\real_esrgan_x4plus-onnx-float\real_esrgan_x4plus.onnx
```
If you want to run the sample code with qnn models:
1. Run the following command:
```bash
python prepare_real_esrgan_x4plus_qnn_models.py
```
2. Add the following code at the beginning of real_esrgan_x4plus_onnx_inference.py:
```python
from qai_appbuilder import onnxwrapper
```
3. Then run the following command:
```bash
python real_esrgan_x4plus_onnx_inference.py --model models-qnn\real_esrgan_x4plus.bin --tile 512
```
## Output
The output image will be saved to output_x4.png
This is sample code for using AppBuilder to load Stable Diffusion 1.5 QNN models to HTP and run inference to generate an image.

## Run the sample code
If you want to run the sample code with onnx models:
```bash
python prepeare_stable_diffusion_onnx_models.py
python stable_diffusion_1_5_onnx_infer.py --model_root models-onnx\modularai_stable-diffusion-1-5-onnx --provider cpu --out sd15_out.png
```
If you want to run the sample code with qnn models:
1. Run the following command:
```bash
python prepeare_stable_diffusion_qnn_models.py
```
2. Add the following code at the beginning of stable_diffusion_1_5_onnx_infer.py:
```python
from qai_appbuilder import onnxwrapper
```
3. Then run the following command:
```bash
python stable_diffusion_1_5_onnx_infer.py --model_root models-qnn --vae_scale 1.0
```
## Output
The output image will be saved to sd15_out.png
This is sample code for using AppBuilder to load Stable Diffusion 2.1 QNN models to HTP and run inference to generate an image.

## Run the sample code
If you want to run the sample code with onnx models:
```bash
python prepeare_stable_diffusion_onnx_models.py
python stable_diffusion_2_1_onnx_infer.py --model_root ./models-onnx/aislamov_stable-diffusion-2-1-base-onnx --provider dml
```
If you want to run the sample code with qnn models:
1. Run the following command:
```bash
python prepeare_stable_diffusion_qnn_models.py
```
2. Add the following code at the beginning of stable_diffusion_2_1_onnx_infer.py:
```python
from qai_appbuilder import onnxwrapper
```
3. Then run the following command:
```bash
python stable_diffusion_2_1_onnx_infer.py --model_root models-qnn --vae_scale 1.0
```
## Output
The output image will be saved to sd21_out.png
This is ONNX sample code for using AppBuilder to load the whisper_base_en QNN model to HTP and run inference.

## Run the sample code
If you want to run the sample code with onnx models:
```bash
python prepare_whisper_onnx_models.py
python whisper_base_en_onnx_infer.py --audio_file jfk.wav --encoder_onnx models-onnx\base.en-encoder.onnx --decoder_onnx models-onnx\base.en-decoder.onnx --mel_filters mel_filters.npz
```
If you want to run the sample code with qnn models:
1. Run the following command:
```bash
python prepare_whisper_qnn_models.py
```
2. Add the following code at the beginning of whisper_base_en_onnx_infer.py:
```python
from qai_appbuilder import onnxwrapper
```
3. Then run the following command:
```bash
python whisper_base_en_onnx_infer.py --audio_file jfk.wav --encoder_onnx models-qnn\whisper_base_en-whisperencoder-snapdragon_x_elite.bin --decoder_onnx models-qnn\whisper_base_en-whisperdecoder-snapdragon_x_elite.bin --mel_filters mel_filters.npz
```
## Output
You can see "Transcription: And so my fellow Americans, ask not what your country can do for you, ask what you can do for your country." in the log.
This is ONNX sample code for using AppBuilder to load the yolov8_det QNN model to HTP and run inference that predicts bounding boxes and classes of objects in an image.

## Run the following command:
If you want to run the sample code with onnx models:
```bash
python prepare_yolov8_det_onnx_models.py
python yolov8_det_onnx_inference.py --model models-onnx\yolov8n.onnx
```
If you want to run the sample code with qnn models:
1. Run the following command:
```bash
python prepare_yolov8_det_qnn_models.py
```
2. Add the following code at the beginning of yolov8_det_onnx_inference.py:
```python
from qai_appbuilder import onnxwrapper
```
3. Then run the following command:
```bash
python yolov8_det_onnx_inference.py --model models-qnn\yolov8_det.bin
```
## Output
The output image will be saved to output.jpg