title: "Edge AI on Arm with PyTorch and ExecuTorch: Tiny Rock-Paper-Scissors"

minutes_to_complete: 60

who_is_this_for: This is an introductory topic for machine learning developers who want to deploy TinyML models on Arm-based edge devices using PyTorch and ExecuTorch.

learning_objectives:
    - Train a small Convolutional Neural Network (CNN) for image classification using PyTorch
    - Use synthetic data generation for training a model when real data is limited
    - Convert and optimize a PyTorch model to an ExecuTorch program (`.pte`) for Arm-based devices
    - Run the trained model locally as an interactive mini-game to demonstrate inference

prerequisites:
    - Basic understanding of machine learning concepts
    - Familiarity with Python and the PyTorch library
    - Completion of the Learning Path [Introduction to TinyML on Arm using PyTorch and ExecuTorch](/learning-paths/embedded-and-microcontrollers/introduction-to-tinyml-on-arm)
    - An x86 Linux host machine or VM running Ubuntu 22.04 or later

author: Dominica Abena O. Amanfo
### Tags
skilllevels: Introductory
subjects: ML
armips:
    - Cortex-M
    - Ethos-U
tools_software_languages:
- tinyML
- Computer Vision
- ExecuTorch
operatingsystems:
    - Linux
further_reading:
    - resource:
        title: Run Llama 3 on a Raspberry Pi 5 using ExecuTorch

content/learning-paths/embedded-and-microcontrollers/training-inference-pytorch/env-setup-1.md

weight: 2
layout: learningpathall
---
## Environment setup for tiny rock-paper-scissors on Arm (PyTorch + ExecuTorch)

This Learning Path is a direct follow-up to [Introduction to TinyML on Arm using PyTorch and ExecuTorch](/learning-paths/embedded-and-microcontrollers/introduction-to-tinyml-on-arm). While the previous Learning Path introduced the core concepts and toolchain, this one puts that knowledge into practice with a small, real-world example. You move from a simple [Feedforward Neural Network](/learning-paths/embedded-and-microcontrollers/introduction-to-tinyml-on-arm/4-build-model) to a practical computer vision task: a tiny Rock-Paper-Scissors game that runs efficiently on Arm-based edge devices.
You will train a lightweight CNN to classify images of the letters R, P, and S as "rock," "paper," or "scissors." The script uses a synthetic data renderer to create a large dataset of these images with various transformations and noise, eliminating the need for a massive real-world dataset.
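The renderer used by the training script is not shown on this page. As an illustration of the general technique only (a hypothetical sketch, not the script's actual code), drawing a letter with random jitter, rotation, and pixel noise might look like this:

```python
import random
import numpy as np
from PIL import Image, ImageDraw

def render_letter(letter: str, size: int = 64) -> np.ndarray:
    """Draw one letter on a grayscale canvas, then randomize it."""
    img = Image.new("L", (size, size), color=0)  # black 64x64 canvas
    draw = ImageDraw.Draw(img)
    # Jitter the position so the model cannot rely on fixed placement
    x = size // 4 + random.randint(-6, 6)
    y = size // 4 + random.randint(-6, 6)
    draw.text((x, y), letter, fill=255)          # default bitmap font
    img = img.rotate(random.uniform(-25, 25), fillcolor=0)
    arr = np.asarray(img, dtype=np.float32) / 255.0
    arr += np.random.normal(0.0, 0.05, arr.shape)  # additive Gaussian noise
    return np.clip(arr, 0.0, 1.0)

# Build a small labelled sample set: R -> 0, P -> 1, S -> 2
classes = ["R", "P", "S"]
samples = [(render_letter(c), i) for i, c in enumerate(classes) for _ in range(4)]
```

Because every call produces a slightly different image, repeating this across many random seeds yields an effectively unlimited training set without collecting real photographs.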
### What is a Convolutional Neural Network (CNN)?

A convolutional neural network (CNN) is a deep neural network designed to analyze visual data using the **convolution** operation. Unlike traditional neural networks, CNNs are designed to process pixel data, which lets them automatically and adaptively learn spatial hierarchies of features - from low-level edges and textures to high-level shapes and objects - directly from input images.
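To make this concrete, here is a minimal sketch of such a network in PyTorch. This is an illustration of the architecture style, not the exact model trained later in this Learning Path; the layer sizes and the 64x64 grayscale input are assumptions for the example.

```python
import torch
import torch.nn as nn

class TinyCNN(nn.Module):
    """A minimal CNN for 3-class (rock/paper/scissors) image classification."""
    def __init__(self, num_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),   # low-level edges/textures
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 64x64 -> 32x32
            nn.Conv2d(8, 16, kernel_size=3, padding=1),  # higher-level shapes
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 32x32 -> 16x16
        )
        self.classifier = nn.Linear(16 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = TinyCNN()
logits = model(torch.randn(1, 1, 64, 64))  # one 64x64 grayscale image
print(logits.shape)  # torch.Size([1, 3]) - one score per class
```

The stacked convolution and pooling layers implement exactly the feature hierarchy described above: each convolution learns local patterns, and each pooling step widens the receptive field while shrinking the image.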

Common CNN applications include:

- Image classification: identify the main object in an image, such as classifying a photo as a cat or dog
- Object detection: locate specific objects in an image and draw bounding boxes
- Facial recognition: identify or verify individuals based on facial features

For the rock-paper-scissors game, you use a tiny CNN to classify the letters R, P, and S as the corresponding hand gestures.

## Environment setup

To get started, complete the first three sections of [Introduction to TinyML on Arm using PyTorch and ExecuTorch](/learning-paths/embedded-and-microcontrollers/introduction-to-tinyml-on-arm). This setup prepares your development environment and installs the required tools. Return here after running the `./examples/arm/run.sh` script in the ExecuTorch repository.

If you just completed the earlier Learning Path, your virtual environment should still be active. If not, activate it:

```console
source $HOME/executorch-venv/bin/activate
```
Your terminal prompt is now prefixed with `(executorch-venv)`, indicating that the virtual environment is active.
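If you want an extra check beyond the prompt prefix, you can confirm from Python itself that a virtual environment is active (an optional sanity check, not a required step in this Learning Path):

```python
import sys

# In an active virtual environment, sys.prefix points at the venv,
# while sys.base_prefix points at the base interpreter it was created from.
in_venv = sys.prefix != sys.base_prefix
print("virtual environment active:", in_venv)
```

If this prints `False`, activate the environment with the `source` command above before continuing.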
Run the commands below to install the dependencies.