Commit f38fb66
author gz-chenxiangrong committed v5Lite-1.4 (initial commit, 0 parents)

File tree: 238 files changed, +25842 additions, −0 deletions

LICENSE

674 additions, 0 deletions (large diff not rendered by default)

README.md

261 additions, 0 deletions
# YOLOv5-Lite: lighter, faster and easier to deploy ![](https://zenodo.org/badge/DOI/10.5281/zenodo.5241425.svg)

![image](https://user-images.githubusercontent.com/82716366/135564164-3ec169c8-93a7-4ea3-b0dc-40f1059601ef.png)

A series of ablation experiments on YOLOv5 makes the model lighter (smaller FLOPs, lower memory use, fewer parameters), faster (shuffle channels and a channel-reducing YOLOv5 head yield at least 10 FPS on a Raspberry Pi 4B with 320×320 input), and easier to deploy (the Focus layer and its four slice operations are removed, keeping the model quantization accuracy loss within an acceptable range).
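The Focus layer removed here can be illustrated with a small sketch. NumPy stands in for PyTorch (the real YOLOv5 layer applies `torch.cat` to the same four strided slices before a convolution): it rearranges a `(C, H, W)` tensor into `(4C, H/2, W/2)`, and these four slice operations are exactly what YOLOv5-Lite drops because they deploy and quantize poorly.

```python
import numpy as np

def focus_slice(x):
    # The four slice operations of YOLOv5's Focus layer: a space-to-depth
    # rearrangement turning (C, H, W) into (4C, H/2, W/2).
    return np.concatenate([
        x[:, ::2, ::2],    # even rows, even cols
        x[:, 1::2, ::2],   # odd rows, even cols
        x[:, ::2, 1::2],   # even rows, odd cols
        x[:, 1::2, 1::2],  # odd rows, odd cols
    ], axis=0)

x = np.arange(3 * 4 * 4, dtype=np.float32).reshape(3, 4, 4)
y = focus_slice(x)
print(y.shape)  # (12, 2, 2)
```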

## Comparison of ablation experiment results

ID|Model|Input_size|FLOPs|Params|Size(M)|mAP@0.5|mAP@0.5:0.95
:-----:|:-----:|:-----:|:----------:|:----:|:----:|:----:|:----:
001|yolo-fastest|320×320|0.25G|0.35M|1.4|24.4|-
002|YOLOv5-Lite<sub>e</sub><sup>ours</sup>|320×320|0.73G|0.78M|1.7|35.1|-
003|NanoDet-m|320×320|0.72G|0.95M|1.8|-|20.6
004|yolo-fastest-xl|320×320|0.72G|0.92M|3.5|34.3|-
005|YOLOX<sub>Nano</sub>|416×416|1.08G|0.91M|7.3 (fp32)|-|25.8
006|yolov3-tiny|416×416|6.96G|6.06M|23.0|33.1|16.6
007|yolov4-tiny|416×416|5.62G|8.86M|33.7|40.2|21.7
008|YOLOv5-Lite<sub>s</sub><sup>ours</sup>|416×416|1.66G|1.64M|3.4|42.0|25.2
009|YOLOv5-Lite<sub>c</sub><sup>ours</sup>|512×512|5.92G|4.57M|9.2|50.9|32.5
010|NanoDet-EfficientLite2|512×512|7.12G|4.71M|18.3|-|32.6
011|YOLOv5s(6.0)|640×640|16.5G|7.23M|14.0|56.0|37.2
012|YOLOv5-Lite<sub>g</sub><sup>ours</sup>|640×640|15.6G|5.39M|10.9|57.6|39.1

See the wiki for details: https://github.com/ppogg/YOLOv5-Lite/wiki/Test-the-map-of-models-about-coco

## Comparison on different platforms

Equipment|Computing backend|System|Input|Framework|v5lite-e|v5lite-s|v5lite-c|v5lite-g|YOLOv5s
:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:
Intel|@i5-10210U|Windows (x86)|640×640|openvino|-|-|46ms|-|131ms
Nvidia|@RTX 2080Ti|Linux (x86)|640×640|torch|-|-|-|15ms|14ms
Redmi K30|@Snapdragon 730G|Android (armv8)|320×320|ncnn|27ms|38ms|-|-|163ms
Xiaomi 10|@Snapdragon 865|Android (armv8)|320×320|ncnn|10ms|14ms|-|-|163ms
Raspberry Pi 4B|@ARM Cortex-A72|Linux (arm64)|320×320|ncnn|-|84ms|-|-|371ms
Raspberry Pi 4B|@ARM Cortex-A72|Linux (arm64)|320×320|mnn|-|76ms|-|-|356ms

* All of the above benchmarks run with 4 threads
* The Raspberry Pi 4B results use bf16s optimization on the [Raspberry Pi 64-bit OS](http://downloads.raspberrypi.org/raspios_arm64/images/raspios_arm64-2020-08-24/)

### QQ group (交流群): 993965802

## ·Model Zoo·

#### @v5lite-e:

Model|Size|Backbone|Head|Framework|Design for
:---:|:---:|:---:|:---:|:---:|:---
v5Lite-e.pt|1.7m|shufflenetv2 (Megvii)|v5Litee-head|Pytorch|Arm-cpu
v5Lite-e.bin<br />v5Lite-e.param|1.7m|shufflenetv2|v5Litee-head|ncnn|Arm-cpu
v5Lite-e-int8.bin<br />v5Lite-e-int8.param|0.9m|shufflenetv2|v5Litee-head|ncnn|Arm-cpu
v5Lite-e-fp32.mnn|3.0m|shufflenetv2|v5Litee-head|mnn|Arm-cpu
v5Lite-e-fp32.tnnmodel<br />v5Lite-e-fp32.tnnproto|2.9m|shufflenetv2|v5Litee-head|tnn|Arm-cpu

#### @v5lite-s:

Model|Size|Backbone|Head|Framework|Design for
:---:|:---:|:---:|:---:|:---:|:---
v5Lite-s.pt|3.4m|shufflenetv2 (Megvii)|v5Lites-head|Pytorch|Arm-cpu
v5Lite-s.bin<br />v5Lite-s.param|3.3m|shufflenetv2|v5Lites-head|ncnn|Arm-cpu
v5Lite-s-int8.bin<br />v5Lite-s-int8.param|1.7m|shufflenetv2|v5Lites-head|ncnn|Arm-cpu
v5Lite-s.mnn|3.3m|shufflenetv2|v5Lites-head|mnn|Arm-cpu
v5Lite-s-int4.mnn|987k|shufflenetv2|v5Lites-head|mnn|Arm-cpu
v5Lite-s-fp16.bin<br />v5Lite-s-fp16.xml|3.4m|shufflenetv2|v5Lites-head|openvino|x86-cpu
v5Lite-s-fp32.bin<br />v5Lite-s-fp32.xml|6.8m|shufflenetv2|v5Lites-head|openvino|x86-cpu
v5Lite-s-fp16.tflite|3.3m|shufflenetv2|v5Lites-head|tflite|Arm-cpu
v5Lite-s-fp32.tflite|6.7m|shufflenetv2|v5Lites-head|tflite|Arm-cpu
v5Lite-s-int8.tflite|1.8m|shufflenetv2|v5Lites-head|tflite|Arm-cpu

#### @v5lite-c:

Model|Size|Backbone|Head|Framework|Design for
:---:|:---:|:---:|:---:|:---:|:---:
v5Lite-c.pt|9m|PPLcnet (Baidu)|v5Litec-head|Pytorch|x86-cpu / x86-vpu
v5Lite-c.bin<br />v5Lite-c.xml|8.7m|PPLcnet|v5Litec-head|openvino|x86-cpu / x86-vpu

#### @v5lite-g:

Model|Size|Backbone|Head|Framework|Design for
:---:|:---:|:---:|:---:|:---:|:---:
v5Lite-g.pt|10.9m|Repvgg (Tsinghua)|v5Liteg-head|Pytorch|x86-gpu / arm-gpu / arm-npu
v5Lite-g-int8.engine|8.5m|Repvgg|v5Liteg-head|Tensorrt|x86-gpu / arm-gpu / arm-npu
v5lite-g-int8.tmfile|8.7m|Repvgg|v5Liteg-head|Tengine|arm-npu

#### Download Link:

> - [ ] `v5lite-e.pt`: | [Baidu Drive](https://pan.baidu.com/s/1bjXo7KIFkOnB3pxixHeMPQ) | [Google Drive](https://drive.google.com/file/d/1_DvT_qjznuE-ev_pDdGKwRV3MjZ3Zos8/view?usp=sharing) |<br>
>> |──────`ncnn-fp16`: | [Baidu Drive]() | [Google Drive](https://drive.google.com/drive/folders/1w4mThJmqjhT1deIXMQAQ5xjWI3JNyzUl?usp=sharing) |<br>
>> |──────`ncnn-int8`: | [Baidu Drive]() | [Google Drive](https://drive.google.com/drive/folders/1YNtNVWlRqN8Dwc_9AtRkN0LFkDeJ92gN?usp=sharing) |<br>
>> |──────`mnn-fp32`: | [Baidu Drive]() | [Google Drive](https://drive.google.com/drive/folders/1Kha3vQF-7qc5i-GFryInStgTFisGL5vq?usp=sharing) |<br>
>> └──────`tnn-fp32`: | [Baidu Drive]() | [Google Drive](https://drive.google.com/drive/folders/1VWmI2BC9MjH7BsrOz4VlSDVnZMXaxGOE?usp=sharing) |<br>
> - [ ] `v5lite-s.pt`: | [Baidu Drive](https://pan.baidu.com/s/1j0n0K1kqfv1Ouwa2QSnzCQ) | [Google Drive](https://drive.google.com/file/d/1ccLTmGB5AkKPjDOyxF3tW7JxGWemph9f/view?usp=sharing) |<br>
>> |──────`ncnn-fp16`: | [Baidu Drive](https://pan.baidu.com/s/1kWtwx1C0OTTxbwqJyIyXWg) | [Google Drive](https://drive.google.com/drive/folders/1w4mThJmqjhT1deIXMQAQ5xjWI3JNyzUl?usp=sharing) |<br>
>> |──────`ncnn-int8`: | [Baidu Drive](https://pan.baidu.com/s/1QX6-oNynrW-f3i0P0Hqe4w) | [Google Drive](https://drive.google.com/drive/folders/1YNtNVWlRqN8Dwc_9AtRkN0LFkDeJ92gN?usp=sharing) |<br>
>> |──────`mnn-fp16`: | [Baidu Drive](https://pan.baidu.com/s/12lOtPTl4xujWm5BbFJh3zA) | [Google Drive](https://drive.google.com/drive/folders/1PpFoZ4b8mVs1GmMxgf0WUtXUWaGK_JZe?usp=sharing) |<br>
>> |──────`mnn-int4`: | [Baidu Drive](https://pan.baidu.com/s/11fbjFi18xkq4ltAKUKDOCA) | [Google Drive](https://drive.google.com/drive/folders/1mSU8g94c77KKsHC-07p5V3tJOZYPQ-g6?usp=sharing) |<br>
>> └──────`tengine-fp32`: | [Baidu Drive](https://pan.baidu.com/s/123r630O8Fco7X59wFU1crA) | [Google Drive](https://drive.google.com/drive/folders/1VWmI2BC9MjH7BsrOz4VlSDVnZMXaxGOE?usp=sharing) |<br>
> - [ ] `v5lite-c.pt`: | [Baidu Drive](https://pan.baidu.com/s/1obs6uRB79m8e3uASVR6P1A) | [Google Drive](https://drive.google.com/file/d/1lHYRQKjqKCRXghUjwWkUB0HQ8ccKH6qa/view?usp=sharing) |<br>
>> └──────`openvino-fp16`: | [Baidu Drive](https://pan.baidu.com/s/18p8HAyGJdmo2hham250b4A) | [Google Drive](https://drive.google.com/drive/folders/1s4KPSC4B0shG0INmQ6kZuPLnlUKAATyv?usp=sharing) |<br>
> - [ ] `v5lite-g.pt`: | [Baidu Drive](https://pan.baidu.com/s/14zdTiTMI_9yTBgKGbv9pQw) | [Google Drive](https://drive.google.com/file/d/1oftzqOREGqDCerf7DtD5BZp9YWELlkMe/view?usp=sharing) |<br>

Baidu Drive password: `pogg`

#### v5lite-s model: TFLite Float32, Float16, INT8, dynamic-range quantization, ONNX, TFJS, TensorRT, OpenVINO IR FP32/FP16, Myriad Inference Engine Blob, CoreML

[https://github.com/PINTO0309/PINTO_model_zoo/tree/main/180_YOLOv5-Lite](https://github.com/PINTO0309/PINTO_model_zoo/tree/main/180_YOLOv5-Lite)

#### Thanks to PINTO0309: [https://github.com/PINTO0309/PINTO_model_zoo/tree/main/180_YOLOv5-Lite](https://github.com/PINTO0309/PINTO_model_zoo/tree/main/180_YOLOv5-Lite)

## Thanks to our contributors

We welcome your comments! We want to make contributing to YOLOv5-Lite as easy and transparent as possible. Thanks to all our contributors!

<a href="https://github.com/ppogg/YOLOv5-Lite/graphs/contributors">
<img src="https://contrib.rocks/image?repo=ppogg/YOLOv5-Lite" />
</a>

Made with [contrib.rocks](https://contrib.rocks).

## <div>How to use</div>

<details open>
<summary>Install</summary>

[**Python>=3.6.0**](https://www.python.org/) is required, with all
[requirements.txt](https://github.com/ppogg/YOLOv5-Lite/blob/master/requirements.txt) dependencies installed, including
[**PyTorch>=1.7**](https://pytorch.org/get-started/locally/):
<!-- $ sudo apt update && apt install -y libgl1-mesa-glx libsm6 libxext6 libxrender-dev -->

```bash
$ git clone https://github.com/ppogg/YOLOv5-Lite
$ cd YOLOv5-Lite
$ pip install -r requirements.txt
```

</details>

<details>
<summary>Inference with detect.py</summary>

`detect.py` runs inference on a variety of sources, downloading models automatically from
the [latest YOLOv5-Lite release](https://github.com/ppogg/YOLOv5-Lite/releases) and saving results to `runs/detect`.

```bash
$ python detect.py --source 0  # webcam
                            file.jpg  # image
                            file.mp4  # video
                            path/  # directory
                            path/*.jpg  # glob
                            'https://youtu.be/NUsoVlDFqZg'  # YouTube
                            'rtsp://example.com/media.mp4'  # RTSP, RTMP, HTTP stream
```

</details>

<details open>
<summary>Training</summary>

```bash
$ python train.py --data coco.yaml --cfg v5lite-e.yaml --weights v5lite-e.pt --batch-size 128
                                         v5lite-s.yaml           v5lite-s.pt              128
                                         v5lite-c.yaml           v5lite-c.pt               96
                                         v5lite-g.yaml           v5lite-g.pt               64
```

Multi-GPU training is several times faster:

```bash
$ python -m torch.distributed.launch --nproc_per_node 2 train.py
```

</details>

<details open>
<summary>DataSet</summary>

Training and validation set paths (the directories containing xx.jpg):

```bash
train: ../coco/images/train2017/
val: ../coco/images/val2017/
```

```bash
├── images          # xx.jpg example
│   ├── train2017
│   │   ├── 000001.jpg
│   │   ├── 000002.jpg
│   │   └── 000003.jpg
│   └── val2017
│       ├── 100001.jpg
│       ├── 100002.jpg
│       └── 100003.jpg
└── labels          # xx.txt example
    ├── train2017
    │   ├── 000001.txt
    │   ├── 000002.txt
    │   └── 000003.txt
    └── val2017
        ├── 100001.txt
        ├── 100002.txt
        └── 100003.txt
```
</details>
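Each `xx.txt` in `labels` pairs with the image of the same name under `images`. Assuming the standard YOLO annotation convention (this README does not spell it out): one object per line, written as `class x_center y_center width height`, with the four box values normalized to [0, 1] by the image size. A minimal sketch of a parser/validator for one such line:

```python
def parse_yolo_label(line):
    """Parse one 'class x_center y_center width height' label line.

    Assumes the standard YOLO convention: an integer class id followed by
    four box coordinates normalized to [0, 1] by the image dimensions.
    """
    fields = line.split()
    if len(fields) != 5:
        raise ValueError(f"expected 5 fields, got {len(fields)}: {line!r}")
    cls_id = int(fields[0])
    box = [float(v) for v in fields[1:]]
    if not all(0.0 <= v <= 1.0 for v in box):
        raise ValueError(f"coordinates must be normalized to [0, 1]: {line!r}")
    return cls_id, box

# Hypothetical label line: class 0, box centered at (0.5, 0.5)
cls_id, box = parse_yolo_label("0 0.50 0.50 0.25 0.40")
print(cls_id, box)  # 0 [0.5, 0.5, 0.25, 0.4]
```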

<details open>
<summary>model hub</summary>

The original components of YOLOv5 and the reproduced components of YOLOv5-Lite are organized and stored in the [model hub](https://github.com/ppogg/YOLOv5-Lite/tree/master/models/hub).

![modelhub](https://user-images.githubusercontent.com/82716366/146787562-e2c1c4c1-726e-4efc-9eae-d92f34333e8d.jpg)

Updating ...

</details>

## How to deploy

[**ncnn**](https://github.com/ppogg/YOLOv5-Lite/blob/master/ncnn/README.md) for arm-cpu

[**mnn**](https://github.com/ppogg/YOLOv5-Lite/blob/master/mnn/README.md) for arm-cpu

[**openvino**](https://github.com/ppogg/YOLOv5-Lite/blob/master/openvino/README.md) for x86-cpu or x86-vpu

[**tensorrt**](https://github.com/ppogg/YOLOv5-Lite/tree/master/tensorrt) for arm-gpu, arm-npu or x86-gpu

[**Android**](https://github.com/ppogg/YOLOv5-Lite/blob/master/Android/ncnn-android-yolov5/README.md) for arm-cpu

## Android_demo

Detection demo on a Redmi phone with a Snapdragon 730G processor, running yolov5-lite. The performance is as follows:

link: https://github.com/ppogg/YOLOv5-Lite/tree/master/ncnn_Android

Android_v5Lite-s: https://drive.google.com/file/d/1CtohY68N2B9XYuqFLiTp-Nd2kuFWgAUR/view?usp=sharing

Android_v5Lite-g: https://drive.google.com/file/d/1FnvkWxxP_aZwhi000xjIuhJ_OhqOUJcj/view?usp=sharing

New Android app: [link] https://pan.baidu.com/s/1PRhW4fI1jq8VboPyishcIQ [password] pogg

<img src="https://user-images.githubusercontent.com/82716366/149959014-5f027b1c-67b6-47e2-976b-59a7c631b0f2.jpg" width="650"/><br/>

## More detailed explanation

Detailed model articles (in Chinese):

[1] https://zhuanlan.zhihu.com/p/400545131

[2] https://zhuanlan.zhihu.com/p/410874403

[3] https://blog.csdn.net/weixin_45829462/article/details/119787840

[4] https://zhuanlan.zhihu.com/p/420737659

## Reference

https://github.com/ultralytics/yolov5

https://github.com/megvii-model/ShuffleNet-Series

https://github.com/Tencent/ncnn

Lines changed: 53 additions, 0 deletions

The yolov5-lite object detection

This is a sample ncnn Android project; it depends on the ncnn library and opencv:

https://github.com/Tencent/ncnn

https://github.com/nihui/opencv-mobile

## model_zoo

https://github.com/ppogg/ncnn-android-v5lite/tree/master/app/src/main/assets

## how to build and run
### step1
https://github.com/Tencent/ncnn/releases

* Download ncnn-YYYYMMDD-android-vulkan.zip or build ncnn for Android yourself
* Extract ncnn-YYYYMMDD-android-vulkan.zip into **app/src/main/jni** and change the **ncnn_DIR** path to yours in **app/src/main/jni/CMakeLists.txt**

### step2
https://github.com/nihui/opencv-mobile

* Download opencv-mobile-XYZ-android.zip
* Extract opencv-mobile-XYZ-android.zip into **app/src/main/jni** and change the **OpenCV_DIR** path to yours in **app/src/main/jni/CMakeLists.txt**

### step3
```
cd ncnn_Android/ncnn-android-yolov5/app/src/main/assets
# download (e.g. with wget) all the *.param and *.bin model files into this directory
```

### step4
* Open this project with Android Studio, build it and enjoy!

## some notes
* The Android NDK camera is used for best efficiency
* Crashes may happen on very old devices that lack the HAL3 camera interface
* All models are manually modified to accept dynamic input shapes
* Most small models run slower on GPU than on CPU; this is common
* FPS may be lower in dark environments because of longer camera exposure times

## screenshot
<img src="https://user-images.githubusercontent.com/82716366/151705519-de3ad1f1-e297-4125-989a-04e49dcf2876.jpg" width="600"/><br/>

<img src="https://pic1.zhimg.com/80/v2-c013df3638fd41d10103ea259b18e588_720w.jpg" width="300"/><br/>

## reference
https://github.com/nihui/ncnn-android-yolov5

https://github.com/FeiGeChuanShu/ncnn-android-yolox

https://github.com/ppogg/YOLOv5-Lite

Lines changed: 24 additions, 0 deletions

```gradle
apply plugin: 'com.android.application'

android {
    compileSdkVersion 24
    buildToolsVersion "29.0.2"

    defaultConfig {
        applicationId "com.ncnn.v5lite.demo"
        archivesBaseName = "$applicationId"

        minSdkVersion 24
    }

    externalNativeBuild {
        cmake {
            version "3.10.2"
            path file('src/main/jni/CMakeLists.txt')
        }
    }

    dependencies {
        implementation 'com.android.support:support-v4:24.2.1'
    }
}
```

Lines changed: 20 additions, 0 deletions

```xml
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
      package="ncnn.v5lite.demo"
      android:versionCode="1"
      android:versionName="1.1">
    <uses-permission android:name="android.permission.CAMERA" />
    <uses-feature android:name="android.hardware.camera2.full" />

    <application android:label="@string/app_name">
        <activity android:name="MainActivity"
                  android:label="@string/app_name"
                  android:icon="@mipmap/ic_launcher"
                  android:screenOrientation="portrait">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />
                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>
    </application>
</manifest>
```