
Commit b1da748

committed
Updated structure
1 parent 262c08a commit b1da748

4 files changed: +10 additions, −20 deletions


content/learning-paths/microcontrollers/introduction-to-tinyml-on-arm/env-setup-5.md

Lines changed: 3 additions & 9 deletions
@@ -18,13 +18,7 @@ The host machine is where you will perform most of your development work, especi
 
 - The Ubuntu version should be `20.04 or higher`.
 - If you do not have the board, the `x86_64` architecture must be used because the Corstone-300 FVP is not currently available for the Arm architecture.
-- Also, though Executorch supports Windows via WSL, it is limited in resource.
-
-
-### Corstone-300 FVP Setup for ExecuTorch
-
-To install and set up the Corstone-300 FVP and ExecuTorch on your machine, refer to [Building and Running ExecuTorch with ARM Ethos-U Backend](https://pytorch.org/executorch/stable/executorch-arm-delegate-tutorial.html). Follow this tutorial till the **"Install the TOSA reference model"** Section. It should be the last thing you do from this tutorial.
-
+- Though Executorch supports Windows via WSL, it is limited in resource.
 
 
 ## Install Executorch
@@ -65,5 +59,5 @@ sudo apt install screen
 ```
 
 ## Next Steps
-1. If you don't have access to the physical board: Skip to [Environment Setup Corstone-300 FVP](/learning-paths/microcontrollers/introduction-to-tinyml-on-arm/setup-6-FVP.md)
-2. If you have access to the board: Skip to [Setup on Grove - Vision AI Module V2](/learning-paths/microcontrollers/introduction-to-tinyml-on-arm/setup-6-Grove.md)
+1. If you don't have access to the physical board: Go to [Environment Setup Corstone-300 FVP](/learning-paths/microcontrollers/introduction-to-tinyml-on-arm/env-setup-6-fvp/)
+2. If you have access to the board: Go to [Setup on Grove - Vision AI Module V2](/learning-paths/microcontrollers/introduction-to-tinyml-on-arm/setup-7-grove/)
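The host requirements this file keeps (Ubuntu 20.04 or higher, an `x86_64` host for the Corstone-300 FVP) can be verified before starting. A minimal sketch, assuming the standard `/etc/os-release` file and `uname` conventions (neither is specified by the learning path itself):

```shell
#!/bin/sh
# Hedged sketch: check the two host requirements called out above.
# Reading VERSION_ID from /etc/os-release is an assumption based on
# common Linux distributions, not something the learning path states.

# Returns 0 when the major part of "$1" (e.g. "22.04") meets the 20.04 minimum.
version_ok() {
    major="${1%%.*}"
    [ "$major" -ge 20 ] 2>/dev/null
}

arch="$(uname -m)"
if [ "$arch" = "x86_64" ]; then
    echo "Architecture OK: $arch (Corstone-300 FVP supported)"
else
    echo "Note: the Corstone-300 FVP needs an x86_64 host (found: $arch)"
fi

if [ -r /etc/os-release ]; then
    . /etc/os-release
    if version_ok "${VERSION_ID:-0}"; then
        echo "OS version ${VERSION_ID} meets the 20.04 minimum"
    else
        echo "OS version ${VERSION_ID:-unknown} is below the 20.04 minimum"
    fi
fi
```

The major-version comparison is deliberately coarse; it treats any 20.x as acceptable, which matches the "20.04 or higher" wording closely enough for a preflight check.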

content/learning-paths/microcontrollers/introduction-to-tinyml-on-arm/env-setup-6-FVP.md

Lines changed: 1 addition & 1 deletion
@@ -14,4 +14,4 @@ To install and set up the Corstone-300 FVP on your machine, refer to [Building a
 
 
 ## Next Steps
-1. Go to [Build a Simple PyTorch Model](/learning-paths/microcontrollers/introduction-to-tinyml-on-arm/build-model-8.md) to test your environment setup.
+1. Go to [Build a Simple PyTorch Model](/learning-paths/microcontrollers/introduction-to-tinyml-on-arm/build-model-8/) to test your environment setup.
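After following the linked tutorial, it is worth confirming the FVP actually landed on `PATH` before moving to the model-building step. A hedged sketch; the binary name `FVP_Corstone_SSE-300_Ethos-U55` is an assumption based on Arm's usual FVP naming and may differ in your install:

```shell
#!/bin/sh
# Hedged sketch: confirm the Corstone-300 FVP is callable after setup.
# The binary name below is an assumption (Arm's typical FVP naming);
# substitute whatever the tutorial's setup actually installed.

have_fvp() {
    command -v "$1" >/dev/null 2>&1
}

if have_fvp FVP_Corstone_SSE-300_Ethos-U55; then
    echo "Corstone-300 FVP found on PATH"
else
    echo "Corstone-300 FVP not found; revisit the tutorial's setup steps"
fi
```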

content/learning-paths/microcontrollers/introduction-to-tinyml-on-arm/setup-7-Grove.md

Lines changed: 3 additions & 7 deletions
@@ -8,9 +8,7 @@ weight: 8 # 1 is first, 2 is second, etc.
 layout: "learningpathall"
 ---
 ## Before you begin
-Only follow this part of the tutorial if you have the board.
-
-Due to its constrained environment, we'll focus on lightweight, optimized tools and models (which will be introduced in the next learning path).
+Only follow this part of the tutorial if you have the board. Due to its constrained environment, we'll focus on lightweight, optimized tools and models (which will be introduced in the next learning path).
 
 
 ### Compilers
@@ -52,10 +50,8 @@ edge-impulse-daemon
 ```
 Follow the prompts to log in.
 
-5. Verify your board is connected
-
-If successful, you should see your Grove - Vision AI Module V2 under 'Devices' in Edge Impulse.
+5. If successful, you should see your Grove - Vision AI Module V2 under 'Devices' in Edge Impulse.
 
 
 ## Next Steps
-1. Go to [Build a Simple PyTorch Model](/learning-paths/microcontrollers/introduction-to-tinyml-on-arm/build-model-8.md) to test your environment setup.
+1. Go to [Build a Simple PyTorch Model](/learning-paths/microcontrollers/introduction-to-tinyml-on-arm/build-model-8/) to test your environment setup.
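Before `edge-impulse-daemon` can list the board under 'Devices', the module has to enumerate as a serial device on the host. A minimal sketch for that sanity check; the `/dev/ttyACM*` and `/dev/ttyUSB*` globs are common Linux conventions, not something this file specifies:

```shell
#!/bin/sh
# Hedged sketch: list candidate serial devices so you can tell whether
# the Grove - Vision AI Module V2 enumerated before the daemon looks
# for it. Device-path globs are assumptions based on typical Linux
# USB-serial naming.

list_serial_candidates() {
    for dev in /dev/ttyACM* /dev/ttyUSB*; do
        [ -e "$dev" ] && echo "$dev"
    done
    return 0
}

found="$(list_serial_candidates)"
if [ -n "$found" ]; then
    echo "Candidate device(s):"
    echo "$found"
else
    echo "No serial device found; check the USB cable before running edge-impulse-daemon"
fi
```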
Lines changed: 3 additions & 3 deletions
@@ -1,18 +1,18 @@
 ---
 title: Troubleshooting and Best Practices
-weight: 7
+weight: 10
 
 ### FIXED, DO NOT MODIFY
 layout: learningpathall
 ---
 ## Troubleshooting
 - If you encounter permission issues, try running the commands with sudo.
 - Ensure your Grove - Vision AI Module V2 is properly connected and recognized by your computer.
-- If Edge Impulse CLI fails to detect your device, try unplugging and replugging the USB cable.
+- If Edge Impulse CLI fails to detect your device, try unplugging, hold the **Boot button** and replug the USB cable. Release the button once you replug.
 
 ## Best Practices
 - Always cross-compile your code on the host machine to ensure compatibility with the target Arm device.
 - Utilize model quantization techniques to optimize performance on constrained devices like the Grove - Vision AI Module V2.
 - Regularly update your development environment and tools to benefit from the latest improvements in TinyML and edge AI technologies
 
-You've now set up your environment for TinyML development on the Grove - Vision AI Module V2. In the next modules, we'll explore data collection, model training, and deployment using PyTorch v2.0 and Executorch.
+You've now set up your environment for TinyML development, and tested a PyTorch and ExecuTorch Neural Netrowk.
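The "run with sudo" advice in the troubleshooting list above has a less blunt alternative worth noting: on Debian/Ubuntu, serial devices are typically owned by the `dialout` group (a distro convention, not something the learning path guarantees), so adding your user to that group lets the Edge Impulse CLI talk to the board without root. A hedged sketch:

```shell
#!/bin/sh
# Hedged sketch: check dialout membership instead of reaching for sudo.
# The dialout group ownership of /dev/tty* is a Debian/Ubuntu
# convention and may differ on other distributions.

# Returns 0 if user "$1" is in the dialout group.
in_dialout() {
    id -nG "$1" 2>/dev/null | tr ' ' '\n' | grep -qx dialout
}

user="${USER:-$(id -un)}"
if in_dialout "$user"; then
    echo "$user is already in dialout"
else
    echo "Run: sudo usermod -aG dialout $user  (then log out and back in)"
fi
```

The log-out/log-in step matters because group membership is read at session start; the new group is not visible to already-running shells.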
