
Commit 818e75a

[yolo12] The big gif file is removed
1 parent e6a0d53 commit 818e75a

File tree

1 file changed: +113 -0 lines changed


examples/models/yolo12/README.md

Lines changed: 113 additions & 0 deletions
# YOLO12 Detection C++ Inference with ExecuTorch

This example demonstrates how to run inference with [Ultralytics YOLO12 family](https://docs.ultralytics.com/models/yolo12/) detection models in C++, leveraging the ExecuTorch backends:
- [OpenVINO](../../../backends/openvino/README.md)
- [XNNPACK](../../../backends/xnnpack/README.md)
# Performance Evaluation

| CPU                            | Model   | Backend  | Device | Precision | Average Latency, ms |
|--------------------------------|---------|----------|--------|-----------|---------------------|
| Intel(R) Core(TM) Ultra 7 155H | yolo12s | openvino | CPU    | FP32      | 88.3549             |
| Intel(R) Core(TM) Ultra 7 155H | yolo12s | openvino | CPU    | INT8      | 53.066              |
| Intel(R) Core(TM) Ultra 7 155H | yolo12l | openvino | CPU    | FP32      | 317.953             |
| Intel(R) Core(TM) Ultra 7 155H | yolo12l | openvino | CPU    | INT8      | 150.846             |
| Intel(R) Core(TM) Ultra 7 155H | yolo12s | openvino | GPU    | FP32      | 32.71               |
| Intel(R) Core(TM) Ultra 7 155H | yolo12l | openvino | GPU    | FP32      | 70.885              |
| Intel(R) Core(TM) Ultra 7 155H | yolo12s | xnnpack  | CPU    | FP32      | 169.36              |
| Intel(R) Core(TM) Ultra 7 155H | yolo12l | xnnpack  | CPU    | FP32      | 436.876             |
# Instructions

### Step 1: Install ExecuTorch

To install ExecuTorch, follow this [guide](https://pytorch.org/executorch/stable/getting-started-setup.html).
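
For orientation only, here is a minimal sketch of one common setup path. It assumes a source checkout of `pytorch/executorch` and an install script with this exact name; both are assumptions, so treat the linked guide as the authoritative reference.

```bash
# Hedged sketch of a source install; defer to the official getting-started guide.
git clone https://github.com/pytorch/executorch.git
cd executorch
git submodule update --init --recursive
./install_executorch.sh  # script name is an assumption and may differ between releases
```
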
### Step 2: Install the backend of your choice

- [OpenVINO backend installation guide](../../../backends/openvino/README.md#build-instructions)
- [XNNPACK backend installation guide](https://pytorch.org/executorch/stable/tutorial-xnnpack-delegate-lowering.html#running-the-xnnpack-model-with-cmake)
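
As a rough illustration of the XNNPACK route, the sketch below builds the ExecuTorch runtime with the XNNPACK backend enabled from a source checkout. `EXECUTORCH_BUILD_XNNPACK` is the key CMake switch; if the linked tutorial differs, it takes precedence.

```bash
# Hedged sketch: enable the XNNPACK backend when building the ExecuTorch runtime.
# Run from the root of an ExecuTorch source checkout; flags may vary between releases.
cmake -B cmake-out -DCMAKE_BUILD_TYPE=Release -DEXECUTORCH_BUILD_XNNPACK=ON .
cmake --build cmake-out -j$(nproc)
```
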
### Step 3: Install the demo requirements

Python demo requirements:
```bash
python -m pip install -r examples/models/yolo12/requirements.txt
```

Demo inference dependency - OpenCV library:
https://opencv.org/get-started/
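
For example, on a Debian or Ubuntu host the distribution's development package is typically sufficient for the demo; this assumes your platform, and the page linked above covers other systems and building OpenCV from source.

```bash
# Assumes a Debian/Ubuntu host; see the OpenCV getting-started page above for other platforms.
sudo apt-get update
sudo apt-get install -y libopencv-dev
```
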
### Step 4: Export the YOLO12 model to ExecuTorch
OpenVINO:
```bash
python export_and_validate.py --model_name yolo12s --input_dims=[1920,1080] --backend openvino --device CPU
```

OpenVINO quantized model:
```bash
python export_and_validate.py --model_name yolo12s --input_dims=[1920,1080] --backend openvino --quantize --video_input /path/to/calibration/video --device CPU
```

XNNPACK:
```bash
python export_and_validate.py --model_name yolo12s --input_dims=[1920,1080] --backend xnnpack
```

> **_NOTE:_** Quantization for the XNNPACK backend is WIP. Please refer to https://github.com/pytorch/executorch/issues/11523 for more details.

The exported model can be validated using the `--validate` flag:

```bash
python export_and_validate.py --model_name yolo12s --backend ... --validate dataset_name.yaml
```

A list of available datasets and instructions on how to use a custom dataset can be found [here](https://docs.ultralytics.com/datasets/detect/).
Validation only supports the default `--input_dims`; please do not specify this parameter when using the `--validate` flag.
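
As a concrete illustration, assuming the COCO128 config (`coco128.yaml`) from the Ultralytics dataset list linked above, validating an XNNPACK export might look like this:

```bash
# coco128.yaml is one of the Ultralytics detection dataset configs; substitute your own.
python export_and_validate.py --model_name yolo12s --backend xnnpack --validate coco128.yaml
```
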

To get a full description of the parameters, use the following command:
```bash
python export_and_validate.py --help
```
### Step 5: Build the demo project

OpenVINO:

```bash
cd examples/models/yolo12
mkdir build && cd build
cmake -DCMAKE_BUILD_TYPE=Release -DUSE_OPENVINO_BACKEND=ON ..
make -j$(nproc)
```

XNNPACK:

```bash
cd examples/models/yolo12
mkdir build && cd build
cmake -DCMAKE_BUILD_TYPE=Release -DUSE_XNNPACK_BACKEND=ON ..
make -j$(nproc)
```
### Step 6: Run the demo

```bash
./build/Yolo12DetectionDemo -model_path /path/to/exported/model -input_path /path/to/video/file -output_path /path/to/output/annotated/video
```

To get a full description of the parameters, use the following command:
```bash
./build/Yolo12DetectionDemo --help
```
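
For instance, with placeholder file names (the `.pte` name depends on how you ran the export in Step 4, and the video paths are up to you), an invocation could look like this:

```bash
# All file names below are placeholders; point them at your exported model and video files.
./build/Yolo12DetectionDemo \
  -model_path yolo12s_openvino.pte \
  -input_path input.mp4 \
  -output_path output_annotated.mp4
```
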
# Credits

Ultralytics examples: https://github.com/ultralytics/ultralytics/tree/main/examples
