
Commit 1b17b8f

chore: update readme

* chore: update readme (draft wip)
* chore: cleanup
* chore: update docs
* docs: update benchmark
* docs: update comments

1 parent 8e6ad23 commit 1b17b8f


2 files changed: +145, -61 lines


README.md

Lines changed: 75 additions & 25 deletions

<div align="center">

<img width="20%" src="https://files.seeedstudio.com/sscma/docs/images/SSCMA-Hero.png"/>

<h1>
SenseCraft Model Assistant by Seeed Studio
</h1>

[![docs-build](https://github.com/Seeed-Studio/ModelAssistant/actions/workflows/docs-build.yml/badge.svg)](https://github.com/Seeed-Studio/ModelAssistant/actions/workflows/docs-build.yml)
[![functional-test](https://github.com/Seeed-Studio/ModelAssistant/actions/workflows/functional-test.yml/badge.svg?branch=main)](https://github.com/Seeed-Studio/ModelAssistant/actions/workflows/functional-test.yml)
![GitHub Release](https://img.shields.io/github/v/release/Seeed-Studio/ModelAssistant)
[![license](https://img.shields.io/github/license/Seeed-Studio/ModelAssistant.svg)](https://github.com/Seeed-Studio/ModelAssistant/blob/main/LICENSE)
[![Average time to resolve an issue](http://isitmaintained.com/badge/resolution/Seeed-Studio/ModelAssistant.svg)](http://isitmaintained.com/project/Seeed-Studio/ModelAssistant "Average time to resolve an issue")
[![Percentage of issues still open](http://isitmaintained.com/badge/open/Seeed-Studio/ModelAssistant.svg)](http://isitmaintained.com/project/Seeed-Studio/ModelAssistant "Percentage of issues still open")

<h3>
<a href="https://sensecraftma.seeed.cc"> Documentation </a> |
<a href="https://sensecraftma.seeed.cc/introduction/installation"> Installation </a> |
<a href="https://github.com/Seeed-Studio/ModelAssistant/tree/main/notebooks"> Colab </a> |
<a href="https://github.com/Seeed-Studio/sscma-model-zoo"> Model Zoo </a> |
<a href="https://seeed-studio.github.io/SenseCraft-Web-Toolkit"> Deploy </a> -
<a href="README_zh-CN.md"> 简体中文 </a>
</h3>

</div>

## Introduction

**S**eeed **S**ense**C**raft **M**odel **A**ssistant (SSCMA) is an open-source project focused on providing state-of-the-art AI algorithms for embedded devices. It is designed to help developers and makers easily deploy various AI models on low-cost hardware such as microcontrollers and single-board computers (SBCs).

<div align="center">

<img width="98%" src="https://files.seeedstudio.com/sscma/docs/images/SSCMA-Deploy.gif"/>

</div>

**Real-world deployment examples on MCUs with less than 0.3 W of power consumption.*

### 🤝 User-friendly

SSCMA provides a user-friendly platform that lets users easily train models on collected data and better understand algorithm performance through the visualizations generated during training.

### 🔋 Models with low computing power and high performance

SSCMA focuses on edge AI algorithm research, and its models can be deployed on microprocessors such as the [ESP32](https://www.espressif.com.cn/en/products/socs/esp32), some [Arduino](https://arduino.cc) development boards, and even embedded SBCs such as the [Raspberry Pi](https://www.raspberrypi.org).

### 🗂️ Supports multiple formats for model export

[TensorFlow Lite](https://www.tensorflow.org/lite) is mainly used on microcontrollers, while [ONNX](https://onnx.ai) is mainly used on devices running embedded Linux. Special formats such as [TensorRT](https://developer.nvidia.com/tensorrt) and [OpenVINO](https://docs.openvino.ai) are already well supported by OpenMMLab. SSCMA adds TFLite model export for microcontrollers; the exported model can be directly converted to [TensorRT](https://developer.nvidia.com/tensorrt) or [UF2](https://github.com/microsoft/uf2) format and dragged and dropped onto the device for deployment.
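
For illustration, the sketch below shows one common way to produce a full-integer TFLite model with the standard TensorFlow Lite converter. This is generic TensorFlow tooling rather than SSCMA's own export script, and the SavedModel path, input shape, and calibration data are placeholders:

```python
# Full-integer quantization sketch using the standard TensorFlow Lite converter.
# The paths, input shape, and calibration data are placeholders, not SSCMA APIs.
import numpy as np
import tensorflow as tf

def representative_dataset():
    # Yield a handful of calibration samples shaped like the model input
    # (random data here; use real samples in practice).
    for _ in range(100):
        yield [np.random.rand(1, 192, 192, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_saved_model("exported_savedmodel")  # placeholder path
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8   # integer I/O suits MCU runtimes
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
with open("model_int8.tflite", "wb") as f:
    f.write(tflite_model)
# The resulting .tflite file can then be packed into UF2 (for example with
# Microsoft's uf2 utilities) and copied onto a supported device.
```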

## Features

We have optimized excellent algorithms from [OpenMMLab](https://github.com/open-mmlab) for real-world scenarios and made their implementation more user-friendly, achieving faster and more accurate inference. Currently we support the following algorithm directions:

### 🔍 Anomaly Detection

In the real world, anomalous data is often difficult to identify, and even when it can be identified, labeling it is very costly. Anomaly detection algorithms learn from normal data, which can be collected cheaply, and treat anything that falls outside the normal data as anomalous.
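
As a toy illustration of this train-on-normal-data idea (not SSCMA's actual algorithm), a simple statistical model can be fitted to normal feature vectors and anything far from them flagged:

```python
# Toy anomaly detector: model "normal" feature vectors with per-dimension
# mean/std and flag samples whose z-score exceeds a threshold.
# Illustrates the principle only; it is not SSCMA's implementation.
import numpy as np

class ZScoreAnomalyDetector:
    def __init__(self, threshold: float = 4.0):
        self.threshold = threshold
        self.mean = None
        self.std = None

    def fit(self, normal_features: np.ndarray) -> None:
        # normal_features: (num_samples, num_features), normal data only.
        self.mean = normal_features.mean(axis=0)
        self.std = normal_features.std(axis=0) + 1e-8

    def is_anomalous(self, sample: np.ndarray) -> bool:
        z = np.abs((sample - self.mean) / self.std)
        return bool(z.max() > self.threshold)

rng = np.random.default_rng(0)
detector = ZScoreAnomalyDetector(threshold=4.0)
detector.fit(rng.normal(0.0, 1.0, size=(1000, 8)))          # fit on normal data only
print(detector.is_anomalous(rng.normal(0.0, 1.0, size=8)))  # typically False
print(detector.is_anomalous(np.full(8, 10.0)))              # True
```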

### 👁️ Computer Vision

Here we provide a number of computer vision algorithms such as **object detection, image classification, image segmentation and pose estimation**. However, these algorithms usually cannot run on low-cost hardware as-is; SSCMA optimizes them to achieve good running speed and accuracy on low-end devices.
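
For example, a model exported through the TFLite route above can be sanity-checked on a host with the standard TFLite interpreter before it is flashed to a device. The model path and dummy input below are placeholders:

```python
# Run a quantized .tflite model on the host as a quick sanity check.
# The model path and the all-zeros dummy input are placeholders.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model_int8.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()[0]
output_details = interpreter.get_output_details()[0]

# Build a dummy input matching the model's expected shape and dtype.
dummy = np.zeros(input_details["shape"], dtype=input_details["dtype"])
interpreter.set_tensor(input_details["index"], dummy)
interpreter.invoke()

raw_output = interpreter.get_tensor(output_details["index"])
# De-quantize integer outputs back to float scores.
scale, zero_point = output_details["quantization"]
scores = (raw_output.astype(np.float32) - zero_point) * scale
print(scores.shape, float(scores.min()), float(scores.max()))
```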

### ⏱️ Scenario Specific

SSCMA provides customized algorithms for specific production scenarios, such as the identification of analog instruments and traditional digital meters, and audio classification. We will continue to add more algorithms for specific scenarios in the future.

## What's New

SSCMA is committed to providing cutting-edge AI algorithms with the best possible performance and accuracy. Guided by community feedback, we keep updating and optimizing the algorithms to meet users' actual needs. Here are some of the latest updates:

### 🔥 YOLO-World, MobileNetV4 and a lighter SSCMA (Coming Soon)

We are bringing the latest [YOLO-World](https://github.com/AILab-CVC/YOLO-World) and [MobileNetV4](https://arxiv.org/abs/2404.10518) algorithms to embedded devices. We are also refactoring SSCMA with fewer dependencies to make it more lightweight and easier to use. Please stay tuned for updates.

### YOLOv8, YOLOv8 Pose, NVIDIA TAO Models and ByteTrack

With [SSCMA-Micro](https://github.com/Seeed-Studio/SSCMA-Micro), you can now deploy the latest [YOLOv8](https://github.com/ultralytics/ultralytics), YOLOv8 Pose, and [NVIDIA TAO models](https://docs.nvidia.com/tao/tao-toolkit/text/model_zoo/cv_models/index.html) on microcontrollers. We have also added the [ByteTrack](https://github.com/ifzhang/ByteTrack) algorithm to enable real-time object tracking on low-cost hardware.

<div align="center"><img width="98%" src="https://files.seeedstudio.com/sscma/docs/images/SSCMA-WebCam-Tracking.gif"/></div>
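
To give a flavor of what tracking-by-detection involves, the sketch below greedily matches the current frame's detections to existing tracks by IoU. This is only an illustration of the general idea; ByteTrack itself additionally uses Kalman-filter motion prediction and a second association pass for low-score boxes, and this is not SSCMA's implementation:

```python
# Simplified tracking-by-detection: greedily match detections to tracks by IoU.
# Illustrative only; ByteTrack adds motion prediction and two-stage association.
from typing import Dict, List, Tuple

Box = Tuple[float, float, float, float]  # (x1, y1, x2, y2)

def iou(a: Box, b: Box) -> float:
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def update_tracks(tracks: Dict[int, Box], detections: List[Box],
                  iou_thresh: float = 0.3, next_id: int = 0) -> Tuple[Dict[int, Box], int]:
    """Match detections to existing tracks; unmatched detections start new tracks."""
    updated: Dict[int, Box] = {}
    unmatched = list(detections)
    for track_id, last_box in tracks.items():
        if not unmatched:
            break
        best = max(unmatched, key=lambda det: iou(last_box, det))
        if iou(last_box, best) >= iou_thresh:
            updated[track_id] = best
            unmatched.remove(best)
    for det in unmatched:          # unmatched detections become new tracks
        updated[next_id] = det
        next_id += 1
    return updated, next_id

tracks, next_id = update_tracks({}, [(10, 10, 50, 50)])
tracks, next_id = update_tracks(tracks, [(12, 11, 52, 49), (200, 200, 240, 260)], next_id=next_id)
print(tracks)  # the first box keeps its track ID; the new box gets a fresh ID
```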

### Swift YOLO

We implemented a lightweight object detection algorithm called Swift YOLO, designed to run on low-cost hardware with limited computing power. The visualization tool and the model training and export command-line interfaces have now been refactored.

<div align="center"><img width="98%" src="https://files.seeedstudio.com/sscma/docs/static/esp32/images/person_detection.png"/></div>

### Meter Recognition

Meters are common instruments in daily life and industrial production, such as analog meters and digital meters. SSCMA provides meter recognition algorithms that can be used to identify the readings of various meters.

<div align="center"><img width="98%" src="https://files.seeedstudio.com/sscma/docs/static/grove/images/pfld_meter.gif"/></div>

## Benchmarks

SSCMA aims to provide the best performance and accuracy on embedded devices. Here are some benchmarks for the latest algorithms:

<div align="center"><img width="98%" src="https://files.seeedstudio.com/sscma/docs/images/SSCMA-Swift-YOLO.png"/></div>

**Note: The benchmark covers 2 architectures, each with 3 model sizes (inputs `[192, 224, 320]`; parameter counts may vary), represented by the size of the points in the graph. Quantized models are also included, and all latencies are measured on an NVIDIA A100.*

## The SSCMA Toolchains

SSCMA provides a complete toolchain for users to easily deploy AI models on low-cost hardware, including:

- [SSCMA-Model-Zoo](https://github.com/Seeed-Studio/sscma-model-zoo) provides a series of pre-trained models for different application scenarios, ready for you to use.
- [SSCMA-Micro](https://github.com/Seeed-Studio/SSCMA-Micro) is a cross-platform framework that deploys and runs SSCMA models on microcontroller devices.
- [Seeed-Arduino-SSCMA](https://github.com/Seeed-Studio/Seeed_Arduino_SSCMA) is an Arduino library for devices running the SSCMA-Micro firmware.
- [SSCMA-Web-Toolkit](https://seeed-studio.github.io/SenseCraft-Web-Toolkit) is a web-based tool that updates a device's firmware, SSCMA model, and parameters.
- [Python-SSCMA](https://github.com/Seeed-Studio/python-sscma) is a Python library for interacting with microcontrollers running SSCMA-Micro and for higher-level deep learning applications (an illustrative host-side sketch follows below).
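
As a rough, purely hypothetical illustration of host-side interaction: the port name, baud rate, and line-oriented JSON framing below are assumptions made for this sketch, not the documented SSCMA-Micro protocol or the python-sscma API; see those repositories for the real interfaces:

```python
# Hypothetical host-side reader: poll a serial port and parse JSON lines the
# device is assumed to emit. The port, baud rate, and message framing are
# placeholder assumptions, not the actual SSCMA-Micro protocol.
import json
import serial  # pip install pyserial

PORT = "/dev/ttyACM0"   # placeholder serial port
BAUD = 115200           # placeholder baud rate

with serial.Serial(PORT, BAUD, timeout=1.0) as dev:
    for _ in range(100):                # read up to 100 lines, then stop
        line = dev.readline().decode("utf-8", errors="ignore").strip()
        if not line:
            continue
        try:
            message = json.loads(line)  # assumed: one JSON object per line
        except json.JSONDecodeError:
            continue
        # Print whatever structured result the device reported.
        print(json.dumps(message, indent=2))
```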

## Acknowledgement

SSCMA is a joint effort of many developers and contributors. We would like to thank the following projects and organizations, whose work SSCMA referenced in its implementation:

- [OpenMMLab](https://openmmlab.com/)
- [ONNX](https://github.com/onnx/onnx)
