Commit 20354fb

Merge pull request #1522 from annietllnd/tiny-ml
WIP TinyML LP
2 parents 6977877 + 1b52dec commit 20354fb

7 files changed: +96 additions, -179 deletions


content/learning-paths/microcontrollers/introduction-to-tinyml-on-arm/Overview-1.md

Lines changed: 38 additions & 16 deletions
```diff
@@ -6,42 +6,64 @@ weight: 2
 layout: learningpathall
 ---
 
-TinyML represents a significant shift in machine learning deployment.
+This Learning Path is about TinyML. It serves as a starting point for learning how cutting-edge AI technologies may be put on even the smallest of devices, making Edge AI more accessible and efficient. You will learn how to set up your host machine and target device to facilitate compilation and ensure smooth integration across all devices.
 
-Unlike traditional machine learning, which typically depends on cloud-based servers or high-powered hardware, TinyML is tailored to function on devices with limited resources, constrained memory, low power, and less processing capabilities.
+In this section, you get an overview of the domain with real-life use-cases and available devices.
 
-TinyML has gained popularity because it enables AI applications to operate in real-time, directly on the device, with minimal latency, enhanced privacy, and the ability to work offline.
+## Overview
+TinyML represents a significant shift in machine learning deployment. Unlike traditional machine learning, which typically depends on cloud-based servers or high-powered hardware, TinyML is tailored to function on devices with limited resources, constrained memory, low power, and less processing capability. TinyML has gained popularity because it enables AI applications to operate in real-time, directly on the device, with minimal latency, enhanced privacy, and the ability to work offline. This shift opens up new possibilities for creating smarter and more efficient embedded systems.
 
-This shift opens up new possibilities for creating smarter and more efficient embedded systems.
+### Benefits and applications
 
-## Module Overview
+The advantages of TinyML match up well with the Arm architecture, which is widely used in IoT, mobile devices, and edge AI deployments.
 
-This Learning Path is about TinyML, applying machine learning to devices with limited resources like microcontrollers. It serves as a starting point for learning how cutting-edge AI technologies may be put on even the smallest of devices, making Edge AI more accessible and efficient.
+Here are some key benefits of TinyML on Arm:
 
-You will learn how to setup on your host machine and target device to facilitate compilation and ensure smooth integration across all devices.
 
-## Examples of Arm-based devices and applications
+- **Power Efficiency**: TinyML models are designed to be extremely power-efficient, making them ideal for battery-operated devices like sensors, wearables, and drones.
 
-There are many devices you can use for TinyML projects. Some of them are listed below.
+- **Low Latency**: Because the AI processing happens on-device, there's no need to send data to the cloud, reducing latency and enabling real-time decision-making.
 
-### Raspberry Pi 4 and 5
+- **Data Privacy**: With on-device computation, sensitive data remains local, providing enhanced privacy and security. This is particularly crucial in healthcare and personal devices.
+
+- **Cost-Effective**: Arm devices, which are cost-effective and scalable, can now handle sophisticated machine learning tasks, reducing the need for expensive hardware or cloud services.
+
+- **Scalability**: With billions of Arm devices in the market, TinyML is well-suited for scaling across industries, enabling widespread adoption of AI at the edge.
+
+TinyML is being deployed across multiple industries, enhancing everyday experiences and enabling groundbreaking solutions. The table below contains a few examples of TinyML applications.
+
+| Area | Device, Arm IP | Description |
+| ------ | ------- | ------------ |
+| Healthcare | Fitbit Charge 5, Cortex-M | Monitor vital signs such as heart rate, detect arrhythmias, and provide real-time feedback. |
+| Agriculture | OpenAg, Cortex-M | Monitor soil moisture and optimize water usage. |
+| Home automation | Arlo, Cortex-A | Detect objects and people, trigger alerts or actions while saving bandwidth and improving privacy. |
+| Industrial IoT | Siemens, Cortex-A | Analyze vibration patterns in machinery to predict when maintenance is needed and prevent breakdowns. |
+| Wildlife conservation | Conservation X, Cortex-M | Identify animal movements or detect poachers in remote areas without relying on external power sources. |
+
+### Examples of Arm-based devices
+
+There are many Arm-based off-the-shelf devices you can use for TinyML projects. Some of them are listed below, but the list is not exhaustive.
+
+#### Raspberry Pi 4 and 5
 
 Raspberry Pi single-board computers are excellent for prototyping TinyML projects. They are commonly used for prototyping machine learning projects at the edge, such as in object detection and voice recognition for home automation.
 
-### NXP i.MX RT microcontrollers
+#### NXP i.MX RT microcontrollers
 
 NXP i.MX RT microcontrollers are low-power microcontrollers that can handle complex TinyML tasks while maintaining energy efficiency, making them ideal for applications like wearable healthcare devices and environmental sensors.
 
-### STM32 microcontrollers
+#### STM32 microcontrollers
 
 STM32 microcontrollers are used in industrial IoT applications for predictive maintenance. These microcontrollers are energy-efficient and capable of running TinyML models for real-time anomaly detection in factory machinery.
 
-### Arduino Nano 33 BLE Sense
+#### Arduino Nano 33 BLE Sense
 
 The Arduino Nano, equipped with a suite of sensors, supports TinyML and is ideal for small-scale IoT applications, such as detecting environmental changes and movement patterns.
 
-### Edge Impulse
+#### Edge Impulse
+
+In addition to hardware, there are software platforms that can help you build TinyML applications.
 
-In addition to hardware, there are software platforms that can help you build TinyML applications.
+The Edge Impulse platform offers a suite of tools for developers to build and deploy TinyML applications on Arm-based devices. It supports devices like Raspberry Pi, Arduino, and STMicroelectronics boards.
 
-Edge Impulse platform offers a suite of tools for developers to build and deploy TinyML applications on Arm-based devices. It supports devices like Raspberry Pi, Arduino, and STMicroelectronics boards.
+Now that you have an overview of the subject, move on to the next section where you will set up an environment on your host machine.
```

content/learning-paths/microcontrollers/introduction-to-tinyml-on-arm/_index.md

Lines changed: 8 additions & 9 deletions
```diff
@@ -7,23 +7,22 @@ cascade:
 
 minutes_to_complete: 40
 
-who_is_this_for: This is an introductory topic for developers, engineers, and data scientists who are new to TinyML and interested in exploring its potential for edge AI. You will learn how to get started using PyTorch and ExecuTorch for TinyML.
+who_is_this_for: This is an introductory topic for developers, engineers, and data scientists who are new to TinyML and interested in exploring its potential for edge AI. You will learn how to get started using PyTorch and ExecuTorch for TinyML.
 
-learning_objectives:
-- Identify TinyML and how it's different from the AI you might be used to.
+learning_objectives:
+- Identify how TinyML is different from other AI domains.
 - Understand the benefits of deploying AI models on Arm-based edge devices.
 - Select Arm-based devices for TinyML.
-- Identify real-world use cases demonstrating the impact of TinyML.
 - Install and configure a TinyML development environment.
-- Set up a cross-compilation environment on your host machine.
 - Perform best practices for ensuring optimal performance on constrained edge devices.
 
 
 prerequisites:
 - Basic knowledge of machine learning concepts.
-- Understanding of IoT and embedded systems.
 - A Linux host machine or VM running Ubuntu 22.04 or higher.
-
+- A [Grove Vision AI Module](https://wiki.seeedstudio.com/Grove-Vision-AI-Module/) **or** an Arm license to run the Corstone-300 Fixed Virtual Platform (FVP).
+
+
 author_primary: Dominica Abena O. Amanfo
 
 ### Tags
@@ -32,7 +31,7 @@ subjects: ML
 armips:
 - Cortex-A
 - Cortex-M
-
+
 operatingsystems:
 - Linux
 
@@ -42,7 +41,7 @@ tools_software_languages:
 - Python
 - PyTorch
 - ExecuTorch
-- Arm Compute Library
+- Arm Compute Library
 - GCC
 - Edge Impulse
 - Node.js
```

content/learning-paths/microcontrollers/introduction-to-tinyml-on-arm/benefits-3.md

Lines changed: 0 additions & 58 deletions
This file was deleted.

content/learning-paths/microcontrollers/introduction-to-tinyml-on-arm/build-model-8.md

Lines changed: 6 additions & 5 deletions
```diff
@@ -8,9 +8,10 @@ weight: 7 # 1 is first, 2 is second, etc.
 layout: "learningpathall"
 ---
 
-With our Environment ready, you can create a simple program to test the setup.
+TODO connect this part with the FVP/board?
+With our environment ready, you can create a simple program to test the setup.
 
-This example defines a small feedforward neural network for a classification task. The model consists of 2 linear layers with ReLU activation in between.
+This example defines a small feedforward neural network for a classification task. The model consists of 2 linear layers with ReLU activation in between.
 
 Use a text editor to create a file named `simple_nn.py` with the following code:
 
@@ -26,7 +27,7 @@ class SimpleNN(torch.nn.Module):
         self.fc1 = torch.nn.Linear(input_size, hidden_size)
         self.relu = torch.nn.ReLU()
         self.fc2 = torch.nn.Linear(hidden_size, output_size)
-
+
     def forward(self, x):
         out = self.fc1(x)
         out = self.relu(out)
@@ -73,7 +74,7 @@ Model successfully exported to simple_nn.pte
 
 The model is saved as a .pte file, which is the format used by ExecuTorch for deploying models to the edge.
 
-Run the ExecuTorch version, first build the executable:
+To run the ExecuTorch version, first build the executable:
 
 ```console
 # Clean and configure the build system
@@ -105,7 +106,7 @@ I 00:00:00.006635 executorch:executor_runner.cpp:129] Setting up planned buffer
 I 00:00:00.007225 executorch:executor_runner.cpp:152] Method loaded.
 I 00:00:00.007237 executorch:executor_runner.cpp:162] Inputs prepared.
 I 00:00:00.012885 executorch:executor_runner.cpp:171] Model executed successfully.
-I 00:00:00.012896 executorch:executor_runner.cpp:175] 1 outputs:
+I 00:00:00.012896 executorch:executor_runner.cpp:175] 1 outputs:
 Output 0: tensor(sizes=[1, 2], [-0.105369, -0.178723])
 ```
```
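The hunks in this file show only fragments of `simple_nn.py`. A minimal self-contained sketch consistent with those fragments follows — note that the layer sizes (4, 8, 2) and the example input are illustrative assumptions, not values taken from the commit:

```python
import torch

class SimpleNN(torch.nn.Module):
    """Small feedforward network: two linear layers with ReLU in between."""
    def __init__(self, input_size, hidden_size, output_size):
        super().__init__()
        self.fc1 = torch.nn.Linear(input_size, hidden_size)
        self.relu = torch.nn.ReLU()
        self.fc2 = torch.nn.Linear(hidden_size, output_size)

    def forward(self, x):
        out = self.fc1(x)
        out = self.relu(out)
        out = self.fc2(out)
        return out

# Hypothetical sizes for a 2-class classifier; the Learning Path's real
# values may differ.
model = SimpleNN(input_size=4, hidden_size=8, output_size=2)
model.eval()
example_input = torch.randn(1, 4)
with torch.no_grad():
    output = model(example_input)
print(output.shape)  # torch.Size([1, 2])
```

The `[1, 2]` output shape matches the `tensor(sizes=[1, 2], ...)` line in the executor-runner log above, which is why a two-unit final layer is assumed here.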

content/learning-paths/microcontrollers/introduction-to-tinyml-on-arm/env-setup-5.md

Lines changed: 23 additions & 87 deletions
```diff
@@ -2,127 +2,63 @@
 # User change
 title: "Environment Setup on Host Machine"
 
-weight: 4 # 1 is first, 2 is second, etc.
+weight: 3
 
 # Do not modify these elements
 layout: "learningpathall"
 ---
-## Before you begin
 
-You will use a Linux computer to run PyTorch and ExecuTorch to prepare a TinyML model to run on edge devices.
+In this section, you will prepare a development environment to compile the model. These instructions have been tested on Ubuntu 22.04, 24.04, and on Windows Subsystem for Linux (WSL).
 
-The instructions are for Ubuntu 22.04 or newer.
+## Install dependencies
 
-You also need the [Grove Vision AI Module](https://wiki.seeedstudio.com/Grove-Vision-AI-Module/). If you don't have the board you can use the Corstone-300 Fixed Virtual Platform (FVP) instead.
+Python 3 is required and comes installed with Ubuntu, but some additional packages are needed.
 
-{{% notice Note %}}
-Note that the Corstone-300 FVP is not available for the Arm architecture, so your host machine needs to be x86_64.
-{{% /notice %}}
-
-The instructions have been tested on:
-- Arm-based cloud instances running Ubuntu 22.04.
-- Desktop computer with Ubuntu 24.04.
-- Windows Subsystem for Linux (WSL).
-
-The host machine is where you will perform most of your development work, especially compiling code for the target Arm devices.
-
-## Install Python
-
-Python 3 is included in Ubuntu, but some additional packages are needed.
-
-```console
+```bash
 sudo apt update
-sudo apt install python-is-python3 gcc g++ make -y
+sudo apt install python-is-python3 python3-dev python3-venv gcc g++ make -y
 ```
 
-## Install PyTorch
-
-Create a Python virtual environment using Miniconda.
-
-For Arm Linux:
+## Create a virtual environment
 
-```console
-curl -O https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-aarch64.sh
-sh ./Miniconda3-latest-Linux-aarch64.sh -b
-eval "$($HOME/miniconda3/bin/conda shell.bash hook)"
-conda --version
-```
-
-For x86_64 Linux:
+Create a Python virtual environment using `python3 -m venv`.
 
 ```console
-curl -O https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh
-sh ./Miniconda3-latest-Linux-x86_64.sh -b
-eval "$($HOME/miniconda3/bin/conda shell.bash hook)"
-conda --version
+python3 -m venv $HOME/executorch-venv
+source $HOME/executorch-venv/bin/activate
 ```
-
-Activate the Python virtual environment:
-
-```bash
-conda create -yn executorch python=3.10.0
-conda activate executorch
-```
-
 The prompt of your terminal now has (executorch) as a prefix to indicate the virtual environment is active.
 
 
 ## Install ExecuTorch
 
-From within the Python virtual environment, run the commands below to download the ExecuTorch repository and install the required packages:
+From within the Python virtual environment, run the commands below to download the ExecuTorch repository and install the required packages:
 
 ```bash
-# Clone the ExecuTorch repo from GitHub
-git clone --branch v0.3.0 https://github.com/pytorch/executorch.git
+cd $HOME
+git clone https://github.com/pytorch/executorch.git
 cd executorch
+```
 
-# Update and pull submodules
+Run a few commands to set up the ExecuTorch internal dependencies.
+```bash
 git submodule sync
 git submodule update --init
 
-# Install ExecuTorch pip package and its dependencies, as well as
-# development tools like CMake.
 ./install_requirements.sh
 ```
 
-## Install Edge Impulse CLI
-
-1. Create an [Edge Impulse Account](https://studio.edgeimpulse.com/signup) and sign in.
-
-2. Install the Edge Impulse CLI tools in your terminal
-
-The Edge Impulse CLI tools require Node.js.
-
-```console
-sudo apt install nodejs npm -y
-```
-
-Confirm `node` is available by running:
-
-```console
-node -v
-```
-
-Your version is printed, for example:
-
-```output
-v18.19.1
-```
-
-Install the Edge Impulse CLI using NPM:
-
-```console
-npm install -g edge-impulse-cli
-```
-
-3. Install Screen to use with edge devices
+{{% notice Note %}}
+If you run into an issue with `buck` running in a stale environment, reset it by running the following commands.
 
-```console
-sudo apt install screen -y
+```bash
+ps aux | grep buck
+pkill -f buck
 ```
+{{% /notice %}}
 
 ## Next Steps
 
-If you don't have the Grove AI vision board and want to use the Corstone-300 FVP proceed to [Environment Setup Corstone-300 FVP](/learning-paths/microcontrollers/introduction-to-tinyml-on-arm/env-setup-6-fvp/)
+If you don't have the Grove AI vision board and want to use the Corstone-300 FVP, proceed to [Environment Setup Corstone-300 FVP](/learning-paths/microcontrollers/introduction-to-tinyml-on-arm/env-setup-6-fvp/)
 
 If you have the Grove board, proceed to [Setup on Grove - Vision AI Module V2](/learning-paths/microcontrollers/introduction-to-tinyml-on-arm/setup-7-grove/)
```
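The terminal-prompt check described in this file can also be done programmatically. A small stdlib-only sketch (illustrative, not part of the commit) that confirms Python is running inside a virtual environment such as `executorch-venv`:

```python
import sys

def in_virtualenv() -> bool:
    # Inside a venv, sys.prefix points at the environment directory,
    # while sys.base_prefix still points at the system installation.
    return sys.prefix != sys.base_prefix

print(in_virtualenv())
```

Running this after `source $HOME/executorch-venv/bin/activate` should print `True`; outside any virtual environment it prints `False`.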
