File: content/install-guides/fm_fvp/fvp.md
multi_install: false             # Set to true if first page of multi-page article
multitool_install_part: true     # Set to true if a sub-page of a multi-page article, else false
layout: installtoolsall          # DO NOT MODIFY. Always true for tool install articles
---

{{% notice Note %}}
Arm Fixed Virtual Platforms (FVPs) were available as a library of ready-to-use platforms (and as a component of Arm Development Studio) up until version 11.28.

From 11.29 onwards, the FVPs are provided solely as part of Arm Development Studio.

This install guide is only applicable to the legacy FVP library.

See the [Arm Development Studio Install Guide](/install-guides/armds) and the [Introduction to FVPs](https://developer.arm.com/documentation/110379/1129/Introduction-to-FVPs) documentation.
{{% /notice %}}
## Download installer packages

You can download the FVP library installer from the [Product Download Hub](https://developer.arm.com/downloads/view/FMFVP).
Linux (AArch64 and x86) and Windows (x86 only) hosts are supported.

For a full list of available options, use:
FVPs are license managed. License setup instructions are available in the [Arm Licensing install guide](/install-guides/license).
## Verify installation
To verify everything is working OK, navigate to the install directory, and launch any of the supplied FVP executables. No additional command options are needed.

---

title: Run LLMs locally on Raspberry Pi 5 for Edge AI
weight: 2

### FIXED, DO NOT MODIFY
layout: learningpathall
## Overview

This Learning Path walks you through deploying an efficient large language model (LLM) locally on the Raspberry Pi 5, powered by an Arm Cortex-A76 CPU. This setup enables you to control your smart home using natural language without relying on cloud services. With rapid advances in generative AI and the power of Arm Cortex-A processors, you can now run advanced language models directly in your home on the Raspberry Pi 5.

You will create a fully local, privacy-first smart home system that leverages the strengths of the Arm Cortex-A architecture. The system can achieve inference speeds of 15+ tokens per second using optimized models like TinyLlama and Qwen, while maintaining the energy efficiency that makes Arm processors well suited for always-on applications.

## Why Arm Cortex-A76 makes Raspberry Pi 5 ideal for Edge AI

The Raspberry Pi 5's Arm Cortex-A76 processor can manage high-performance computing tasks like AI inference. Key architectural features include:

- **Superscalar architecture**: Executes multiple instructions in parallel, improving throughput for compute-heavy tasks
- **128-bit NEON SIMD support**: Accelerates matrix and vector operations, which are common in the inner loops of language model inference
- **Multi-level cache hierarchy**: Reduces memory latency and improves data access efficiency during runtime
- **Thermal efficiency**: Enables sustained performance without active cooling, making it ideal for compact or always-on smart home setups

These characteristics make the Raspberry Pi 5 well suited for workloads like smart home assistants, where responsiveness, efficiency, and local processing are important. Running LLMs locally on Arm-based devices brings several practical benefits. Privacy is preserved, since conversations and routines never leave the device. With optimized inference, the system can offer responsiveness under 100 ms, even on resource-constrained hardware. It remains fully functional in offline scenarios, continuing to operate when internet access is unavailable. Developers also gain flexibility to customize models and automations. Additionally, software updates and an active ecosystem continue to improve performance over time.

## Leverage the Arm ecosystem for Raspberry Pi Edge AI

For the stack in this setup, the Raspberry Pi 5 benefits from the extensive Arm developer ecosystem:

- Optimized compilers including GCC and Clang with Arm-specific enhancements
- Native libraries such as gpiozero and lgpio optimized for Raspberry Pi
- Community support from open-source projects where developers contribute Arm-optimized code
- Backward compatibility in the Arm architecture, which reduces friction when updating kernels or deploying across platforms
- The same architecture powers smartphones, embedded controllers, edge devices, and cloud infrastructure, enabling consistent development practices across domains
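In a smart home assistant built on this stack, the bridge between the LLM and libraries like gpiozero or lgpio is typically a small parsing layer that turns the model's free-text reply into a device action. The following stdlib-only sketch is illustrative: the device names and pin numbers are hypothetical, and the actual GPIO calls are shown only as comments.

```python
import re

# Hypothetical device registry mapping phrases to GPIO pin numbers.
# Pin assignments are illustrative; on a real Pi you would drive them
# with gpiozero (e.g. LED(17).on()) or lgpio.
DEVICES = {
    "living room light": 17,
    "bedroom light": 27,
    "fan": 22,
}

def parse_command(text):
    """Map an LLM's free-text reply to a (device, action) pair, or None."""
    text = text.lower()
    if re.search(r"\b(on|enable|start)\b", text):
        action = "on"
    elif re.search(r"\b(off|disable|stop)\b", text):
        action = "off"
    else:
        return None
    for name in DEVICES:
        if name in text:
            return name, action  # here you would toggle DEVICES[name]
    return None

print(parse_command("Sure - turning on the living room light now."))
# -> ('living room light', 'on')
```

Keyword matching like this is deliberately simple; a production system would constrain the LLM's output format (for example, ask it to reply with JSON) rather than scrape prose.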

## Performance benchmarks on Raspberry Pi 5

The table below shows inference performance for several quantized models running on a Raspberry Pi 5. Measurements reflect single-threaded CPU inference with typical prompt lengths and temperature settings suitable for command-based interaction.
What does this table tell us? Here are some performance insights:

- Qwen 0.5B and TinyLlama 1.1B deliver fast token generation and low average latency, making them suitable for real-time interactions such as voice-controlled smart home commands
- DeepSeek-Coder 1.3B and Gemma 2B trade some speed for improved language understanding, which can be useful for complex tasks or context-aware prompts
- DeepSeek-R1 7B offers advanced reasoning capabilities with acceptable latency, which may be viable for offline summarization, planning, or low-frequency tasks

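Token-generation rates like these can be reproduced from the statistics Ollama attaches to each non-streaming `/api/generate` response: `eval_count` (tokens generated) and `eval_duration` (time spent generating them, in nanoseconds). A minimal sketch, using made-up sample numbers rather than a real measurement:

```python
def tokens_per_second(resp):
    """Compute generation speed from Ollama's response statistics.

    Ollama's /api/generate JSON includes eval_count (tokens generated)
    and eval_duration (nanoseconds spent generating them).
    """
    return resp["eval_count"] / resp["eval_duration"] * 1e9

# Illustrative numbers only, not a real benchmark result:
sample = {"eval_count": 120, "eval_duration": 8_000_000_000}  # 8 seconds
print(f"{tokens_per_second(sample):.1f} tokens/s")  # -> 15.0 tokens/s
```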
## Supported Arm-powered devices

This Learning Path focuses on the Raspberry Pi 5, but you can adapt the concepts and code to other Arm-powered devices.

### Recommended platforms

| Platform | CPU | RAM | GPIO support | Model size suitability |
|----------|-----|-----|--------------|------------------------|
| **Raspberry Pi 5** | Arm Cortex-A76 quad-core @ 2.4GHz | Up to 16GB | Native `lgpio` (high performance) | Large models (8–16GB) |
| **Raspberry Pi 4** | Arm Cortex-A72 quad-core @ 1.8GHz | Up to 8GB | Compatible with `gpiozero` | Small to mid-size models |
| **Other Arm devices** | Arm Cortex-A | 4GB min (8GB+ recommended) | Requires physical GPIO pins | Varies by RAM |
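To judge which row of the table fits a given model, a rough rule of thumb is that a 4-bit quantized model needs about half a byte per parameter for weights, plus runtime overhead. The sketch below uses an assumed 20% overhead factor for the KV cache, activations, and runtime buffers; treat the results as ballpark figures only.

```python
def estimated_ram_gb(params_billions, bits_per_weight=4, overhead=1.2):
    """Rough RAM estimate for a quantized model.

    Weights take params * bits/8 bytes; the 20% overhead factor is an
    assumption covering the KV cache, activations, and runtime buffers.
    """
    bytes_total = params_billions * 1e9 * bits_per_weight / 8 * overhead
    return bytes_total / 1e9

for name, params in [("TinyLlama 1.1B", 1.1), ("Gemma 2B", 2.0), ("DeepSeek-R1 7B", 7.0)]:
    print(f"{name}: ~{estimated_ram_gb(params):.1f} GB")
```

By this estimate, the 4-bit 7B model fits comfortably in the 8GB-and-up class of devices, while the sub-2B models run even on 4GB boards.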

Additionally, the platform must meet the following requirements:

- GPIO pins available for hardware control
- Python 3.8 or newer
- Ability to run [Ollama](https://ollama.com/)
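In practice, "ability to run Ollama" means the Ollama server is listening on its default local port (11434). A minimal stdlib-only sketch of calling its `/api/generate` endpoint follows; the `tinyllama` model name assumes you have already pulled that model.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default port

def build_payload(model, prompt):
    """Build a non-streaming request body for Ollama's /api/generate."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def generate(model, prompt):
    """Send a prompt to the local Ollama server and return the reply text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# With the server running and the model pulled (ollama pull tinyllama):
#   print(generate("tinyllama", "Turn on the living room light."))
```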

In the next section, you’ll set up the software dependencies needed to start building your privacy-first smart home system on Raspberry Pi 5.