Commit 9d7a53f

Merge pull request #1701 from dominica-of/main
Edge AI with PyTorch & ExecuTorch - Tiny Sentiment Analysis on Arm
2 parents c53d74a + a0ef1c8 commit 9d7a53f

File tree

5 files changed: +503 −0 lines changed

Lines changed: 55 additions & 0 deletions
@@ -0,0 +1,55 @@
---
title: Edge AI with PyTorch & ExecuTorch - Tiny Sentiment Analysis on Arm

minutes_to_complete: 90

who_is_this_for: This topic is for machine learning engineers, embedded AI developers, and researchers interested in deploying TinyML models for NLP on Arm-based edge devices using PyTorch and ExecuTorch.

learning_objectives:
    - Train a custom CNN-based sentiment classification model implemented in PyTorch.
    - Optimize and convert the model using ExecuTorch for Arm-based edge devices.
    - Deploy and run inference on the Corstone-320 FVP.

prerequisites:
    - Basic knowledge of machine learning concepts.
    - Completion of the [Introduction to TinyML on Arm using PyTorch and ExecuTorch](/learning-paths/embedded-and-microcontrollers/introduction-to-tinyml-on-arm) Learning Path is recommended before starting this one.
    - Familiarity with Python and PyTorch.
    - A Linux host machine or VM running Ubuntu 22.04 or higher.
    - An Arm license to run the examples on the Corstone-320 Fixed Virtual Platform (FVP), for hands-on deployment.

author: Dominica Abena O. Amanfo

### Tags
skilllevels: Intermediate
subjects: ML
armips:
    - Cortex-M
tools_software_languages:
    - tinyML
    - CNN
    - PyTorch
    - ExecuTorch

operatingsystems:
    - Linux

further_reading:
    - resource:
        title: Run Llama 3 on a Raspberry Pi 5 using ExecuTorch
        link: /learning-paths/embedded-and-microcontrollers/rpi-llama3
        type: website
    - resource:
        title: ExecuTorch Examples
        link: https://github.com/pytorch/executorch/blob/main/examples/README.md
        type: website

### FIXED, DO NOT MODIFY
# ================================================================================
weight: 1                       # _index.md always has weight of 1 to order correctly
layout: "learningpathall"       # All files under learning paths have this same wrapper
learning_path_main_page: "yes"  # This should be surfaced when looking for related content. Only set for _index.md of learning path content.
---
Lines changed: 8 additions & 0 deletions
@@ -0,0 +1,8 @@
---
# ================================================================================
# FIXED, DO NOT MODIFY THIS FILE
# ================================================================================
weight: 21                  # Set to always be larger than the content in this path to be at the end of the navigation.
title: "Next Steps"         # Always the same, html page title.
layout: "learningpathall"   # All files under learning paths have this same wrapper for Hugo processing.
---
Lines changed: 33 additions & 0 deletions
@@ -0,0 +1,33 @@
---
title: Environment Setup
weight: 2

### FIXED, DO NOT MODIFY
layout: learningpathall
---

## Overview
In this Learning Path, you will learn how to train and run inference using a tiny sentiment classifier. You'll deploy the model on the Arm Corstone-320 FVP for sentiment analysis.

You will train a lightweight convolutional neural network (CNN)-based sentiment classifier on synthetic text data. The model is optimized for small devices, using embedding layers and 1D convolutions for efficient text classification.
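As a rough illustration of the kind of architecture described above, the sketch below combines an embedding layer, a 1D convolution, and pooling into a tiny classifier. The layer sizes and class names here are assumptions for demonstration, not the exact model trained later in this Learning Path:

```python
import torch
import torch.nn as nn

class TinySentimentCNN(nn.Module):
    """Illustrative embedding + 1D-conv sentiment classifier (sizes are assumptions)."""
    def __init__(self, vocab_size=1000, embed_dim=32, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.conv = nn.Conv1d(embed_dim, 16, kernel_size=3, padding=1)
        self.pool = nn.AdaptiveMaxPool1d(1)   # max over the sequence dimension
        self.fc = nn.Linear(16, num_classes)

    def forward(self, x):                        # x: (batch, seq_len) token ids
        e = self.embedding(x).permute(0, 2, 1)   # -> (batch, embed_dim, seq_len)
        h = torch.relu(self.conv(e))             # -> (batch, 16, seq_len)
        return self.fc(self.pool(h).squeeze(-1)) # -> (batch, num_classes)

# A batch of 4 random token sequences of length 16 produces 4 logit pairs.
logits = TinySentimentCNN()(torch.randint(0, 1000, (4, 16)))
```

Keeping the convolution one-dimensional and pooling over the full sequence keeps the parameter count small, which is what makes this style of model practical on microcontroller-class targets.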

## Environment Setup
Set up your development environment for TinyML by following the first three chapters of the [Introduction to TinyML on Arm using PyTorch and ExecuTorch](/learning-paths/embedded-and-microcontrollers/introduction-to-tinyml-on-arm) Learning Path (LP).

If you have just completed the LP above, your virtual environment should still be active. If not, activate it with:

```console
source $HOME/executorch-venv/bin/activate
```

Your terminal prompt now shows the `(executorch-venv)` prefix, indicating that the virtual environment is active.
Beyond the packages installed in the LP above, the scripts in this Learning Path use only `argparse` and `json`. Both are part of the Python standard library, so no additional `pip install` is required.

You are now ready to build the model.
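For orientation, `argparse` and `json` are typically combined in a training script roughly like this. The flag names and label map below are hypothetical examples, not the actual interface of the scripts used later:

```python
import argparse
import json

# Hypothetical CLI for a tiny training script (flags are illustrative only).
parser = argparse.ArgumentParser(description="Tiny sentiment model training")
parser.add_argument("--epochs", type=int, default=10)
parser.add_argument("--labels", type=str,
                    default='{"0": "negative", "1": "positive"}')

# Pass an empty list so this sketch runs without real command-line arguments.
args = parser.parse_args([])
labels = json.loads(args.labels)   # decode the label map from a JSON string
print(labels["1"])                 # -> positive
```

In a real invocation you would omit the `[]` so `parse_args()` reads `sys.argv`, e.g. `python train.py --epochs 20`.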
