`Projects/Projects/Ethos-U85-NPU-Applications.md` (16 additions, 23 deletions)
This project challenges you to explore the boundaries of what’s possible on Ethos-U85.
**Project Summary**

Using hardware such as the Alif Ensemble E4/E6/E8 DevKits (all include Ethos-U85), your task is to design and benchmark an advanced edge inference application that exploits the Ethos-U85’s compute and transformer capabilities.

You can utilise the Arm Corstone-320 Fixed Virtual Platform to prototype and test your application without access to Alif hardware: prove functional correctness in simulation first, then measure performance on actual silicon. We are interested in seeing projects both in simulation and on final hardware.

Your project should include:

**1. Model Deployment and Optimization**

Select a computationally intensive model — ideally transformer-based or multi-branch convolutional — and deploy it on the Ethos-U85 using:
- Model Explorer, to inspect models and identify problem layers that reduce optimal delegation to the Ethos-U backend.
- The Vela compiler, for optimization.

These tools can be used to:
- Convert and visualize model graphs in TOSA format.
- Identify unsupported operators.
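
To build intuition for why unsupported operators matter, here is a toy sketch of graph partitioning for delegation. The supported-op set and graph below are invented for illustration; the real supported list comes from Vela and the TOSA specification.

```python
# Toy partitioner: split a linear op sequence into NPU-delegated and
# CPU-fallback segments, showing how one unsupported operator fragments
# an otherwise fully delegated graph. The op names are illustrative only.

NPU_SUPPORTED = {"conv2d", "depthwise_conv2d", "fully_connected", "add", "softmax"}

def partition(ops):
    """Group a sequence of op names into (target, [ops]) segments."""
    segments = []
    for op in ops:
        target = "NPU" if op in NPU_SUPPORTED else "CPU"
        if segments and segments[-1][0] == target:
            segments[-1][1].append(op)      # extend the current segment
        else:
            segments.append((target, [op]))  # start a new segment
    return segments

graph = ["conv2d", "add", "my_custom_norm", "conv2d", "softmax"]
print(partition(graph))
# [('NPU', ['conv2d', 'add']), ('CPU', ['my_custom_norm']), ('NPU', ['conv2d', 'softmax'])]
```

A single unsupported layer splits the graph into three segments and adds two NPU↔CPU handovers, which is exactly the pattern Model Explorer helps you spot and eliminate.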

**2. Application Demonstration**

Implement a working example that highlights the Ethos-U85’s strengths in real-world inference. Possible categories include:
- Transformers on Edge: lightweight BERT, ViT, or audio transformers (e.g. speech or sound event classification).
- High-resolution Vision: semantic segmentation, object detection on large input sizes, or multi-head perception networks.
- Multi-modal Fusion: combining audio, image, or sensor streams for contextual understanding.
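
As a minimal sketch of the multi-modal fusion idea, here is late fusion in plain Python: each modality produces class logits, and the fused prediction is a confidence-weighted average of per-modality probabilities. The logits and weights are made-up numbers for illustration, not outputs of any real model.

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

def late_fusion(per_modality_logits, weights):
    """Confidence-weighted average of per-modality class probabilities."""
    probs = [softmax(logits) for logits in per_modality_logits]
    total_w = sum(weights)
    n_classes = len(probs[0])
    return [sum(w * p[c] for w, p in zip(weights, probs)) / total_w
            for c in range(n_classes)]

audio_logits = [2.0, 0.1, 0.1]   # invented numbers for illustration
image_logits = [0.2, 1.5, 0.3]
fused = late_fusion([audio_logits, image_logits], weights=[1.0, 1.0])
```

Late fusion keeps each modality's backbone independent, which maps well onto running several small NPU graphs and combining their outputs on the CPU.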

**3. Analysis and Benchmarking**

Report quantitative results on:
- Inference latency, throughput (FPS or tokens/s), and memory footprint.
- Power efficiency under load (optional).
- Comparative performance versus Ethos-U55/U65 (use available benchmarks for reference, or utilise the other Ethos-U NPUs provided on the Alif DevKits).
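
A minimal host-side harness for the latency and throughput numbers might look like the sketch below. The `run_inference` callable is a placeholder for however you actually invoke the model (on the FVP or on target); on bare-metal silicon you would read a cycle counter instead.

```python
import statistics
import time

def benchmark(run_inference, warmup=10, iters=100):
    """Time a zero-argument inference callable and derive summary stats."""
    for _ in range(warmup):                    # discard cold-start effects
        run_inference()
    latencies = []
    for _ in range(iters):
        t0 = time.perf_counter()
        run_inference()
        latencies.append(time.perf_counter() - t0)
    mean_s = statistics.mean(latencies)
    return {
        "mean_latency_ms": mean_s * 1e3,
        "worst_latency_ms": max(latencies) * 1e3,
        "throughput_fps": 1.0 / mean_s,
    }
```

Always report the warm-up policy and iteration count alongside the numbers, so U55/U65/U85 comparisons are like-for-like.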
---
## What kind of projects should you target?
To clearly demonstrate the leap from Ethos-U55/U65 to U85, choose projects that meet at least one of the following criteria:
- Transformer-heavy architectures: e.g. attention blocks, transformer encoders, ViTs, or hybrid CNN+transformer models.
- High-resolution or multi-branch networks: models with high input dimensionality or multiple processing paths that saturate NPU throughput.
- Dense post-processing or large fully connected layers: cases where U55/U65 memory limits or MAC bandwidth previously restricted performance.
- Multi-modal pipelines: combining multiple sensor inputs (e.g. image + IMU + audio) where the NPU must maintain concurrency or shared intermediate representations.
The Ethos-U85 is ideal for projects where model performance is constrained by attention layers, large activations, or operator types that previously required fallback to the CPU. Use the Ethos-U85 to eliminate those fallbacks and achieve full-NPU execution of advanced topologies.
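
For intuition about the attention-layer workload mentioned above, here is the core of single-head scaled dot-product attention as a pure-Python sketch. It is a readable reference, nothing like an optimized NPU kernel, but it shows the all-pairs score matrix whose size grows quadratically with sequence length.

```python
import math

def attention(Q, K, V):
    """Single-head scaled dot-product attention over plain Python lists.
    Q, K, V are lists of equal-width vectors (rows)."""
    d = len(Q[0])
    out = []
    for q in Q:
        # Scores of this query against every key: the large activation
        # that previously forced CPU fallback on Ethos-U55/U65.
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        # Numerically stable softmax over the scores.
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        weights = [e / z for e in exps]
        # Output row: attention-weighted combination of the value vectors.
        out.append([sum(w * v[c] for w, v in zip(weights, V))
                    for c in range(len(V[0]))])
    return out
```

Counting the multiply-accumulates in this loop for your chosen sequence length is a quick way to estimate whether a transformer candidate is heavy enough to showcase the U85.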
---
## What will you use?
You should be familiar with, or willing to learn about:
- Programming: Python, C/C++
- ExecuTorch or LiteRT
- Techniques for optimising AI models for the edge (quantization, pruning, etc.)
- Optimization Tools:
  - Model Explorer with TOSA adapter (and PTE adapter for ExecuTorch)
  - Vela compiler for Ethos-U
- Bare-metal or RTOS (e.g., Zephyr)
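
As a refresher on the quantization technique listed above, here is a minimal asymmetric int8 post-training quantization round-trip in plain Python. This is only the core affine transform; real toolchains (the Vela/ExecuTorch/LiteRT quantizers) add calibration, per-channel scales, and operator semantics on top of it.

```python
def quantize_int8(values):
    """Map a list of floats onto int8 with an affine scale/zero-point."""
    lo, hi = min(values), max(values)
    scale = (hi - lo) / 255.0 or 1.0          # guard against constant tensors
    zero_point = round(-128 - lo / scale)     # so that `lo` maps near -128
    quantized = [max(-128, min(127, round(v / scale) + zero_point))
                 for v in values]
    return quantized, scale, zero_point

def dequantize_int8(quantized, scale, zero_point):
    """Recover approximate floats from the int8 representation."""
    return [(q - zero_point) * scale for q in quantized]

values = [-1.0, -0.25, 0.0, 0.5, 1.0]
q, scale, zp = quantize_int8(values)
restored = dequantize_int8(q, scale, zp)
# Round-trip error is bounded by roughly one quantization step (`scale`).
```

Measuring this round-trip error on real activations is a useful sanity check before attributing accuracy loss to the compiler or the hardware.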
---
## Resources from Arm and our partners
- Arm Developer: [Edge AI](https://developer.arm.com/edge-ai)
- Arm Alif Code-along: [Advanced AI on Arm Embedded Systems](https://developer.arm.com/code-along/advanced-ai-on-arm-embedded-systems)
- Learning Path: [Navigating Machine Learning with Ethos-U processors](https://learn.arm.com/learning-paths/microcontrollers/nav-mlek/)
- Learning Path: [Visualize Ethos-U NPU performance with ExecuTorch on Arm FVPs](https://learn.arm.com/learning-paths/embedded-and-microcontrollers/visualizing-ethos-u-performance)
- Repository: [AI on Arm course](https://github.com/arm-university/AI-on-Arm)
- Example Board: [Alif Ensemble DevKit E8](https://www.keil.arm.com/boards/alif-semiconductor-devkit-e8-gen-1-2558a7b/features/)
- Documentation: [TOSA Specification](https://www.mlplatform.org/tosa/), [TOSA Model Explorer](https://github.com/arm/tosa-adapter-model-explorer), and [TOSA Reference Model](https://gitlab.arm.com/tosa/tosa-reference-model)

---

`Projects/Projects/Game-Dev-Using-Neural-Graphics-&-Unreal-Engine.md` (8 additions, 6 deletions)
---
title: "Game development using Arm Neural Graphics with Unreal Engine"
description: "Build a playable Unreal Engine 5 game demo that utilises Arm’s Neural Graphics SDK UE plugin for features such as Neural Super Sampling (NSS). Showcase improved graphical fidelity at lower resolution by driving neural rendering directly in the graphics pipeline."
---

Document your progress and findings, and consider alternative applications of the neural technology within games development.
53
53
@@ -60,7 +60,9 @@ Attempt different environments and objects. For example:
60
60
61
61
Make your scenes dynamic with particle effects, shadows, physics and motion.

**Beyond the plugin**

Want to go further and start experimenting more with Neural Graphics? After building your game with the NSS Unreal plugin, try out the [Vulkan ML Extensions learning path](https://learn.arm.com/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/) to explore how neural inference runs directly through the Vulkan API. This provides lower-level control over ML workloads in the graphics pipeline, and allows for prototyping custom neural effects or optimising performance beyond what’s exposed through the engine plugin. You may also want to explore [fine-tuning your own neural models with the Arm Neural Graphics Model Gym](https://learn.arm.com/learning-paths/mobile-graphics-and-gaming/model-training-gym/) and how to [apply different quantization strategies](https://learn.arm.com/learning-paths/mobile-graphics-and-gaming/quantize-neural-upscaling-models/) to optimise memory and latency.
## Pre-requisites
- Laptop/PC/Mobile for Android Unreal Engine game development
- Get Started Blog: [Start experimenting with NSS today](https://developer.arm.com/community/arm-community-blogs/b/mobile-graphics-and-gaming-blog/posts/how-to-access-arm-neural-super-sampling)
- Deep Dive Blog: [How NSS works](https://developer.arm.com/community/arm-community-blogs/b/mobile-graphics-and-gaming-blog/posts/how-arm-neural-super-sampling-works)
- Arm Developer: [Neural Graphics Development Kit](https://developer.arm.com/mobile-graphics-and-gaming/neural-graphics)
- Learning Path: [Neural Super Sampling in Unreal Engine](https://learn.arm.com/learning-paths/mobile-graphics-and-gaming/nss-unreal/)
77
78
- Learning Path: [Getting started with Arm Accuracy Super Resolution (Arm ASR)](https://learn.arm.com/learning-paths/mobile-graphics-and-gaming/get-started-with-arm-asr/)
78
79
- Unreal Engine Intro by Epic Games: [Understanding the basics](https://dev.epicgames.com/documentation/en-us/unreal-engine/understanding-the-basics-of-unreal-engine)
- Repo: [Arm Neural Graphics for Unreal](https://github.com/arm/neural-graphics-for-unreal)
- Repo: [Arm Neural Graphics Model Gym](https://github.com/arm/neural-graphics-model-gym)
- Documentation: [Arm Neural Graphics SDK for Game Engines Developer guide](https://developer.arm.com/documentation/111167/latest/)

This project is designed to be self-serve but comes with opportunity of some com…

## Benefits
90
92
91
-
Standout project contributions to the community will earn digital badges. These badges can support CV or resumé building and demonstrate earned recognition.
93
+
Standout project contributions to the community will earn digital badges. These badges can support CV or resumé building and demonstrate earned recognition. Contributions may also be highlighted in case studies or newsletters.
To receive the benefits, you must show us your project through our [online form](https://forms.office.com/e/VZnJQLeRhD). Please do not include any confidential information in your contribution. Additionally if you are affiliated with an academic institution, please ensure you have the right to share your material.

---

Utilise the resources and learning paths below and create an exciting and challenging project.

## Resources from Arm and our partners

- Arm Developer: [Launchpad - Mobile AI](https://developer.arm.com/mobile-graphics-and-gaming/ai-mobile)
- Learning Path: [Profile ExecuTorch models with SME2 on Arm](https://learn.arm.com/learning-paths/cross-platform/sme-executorch-profiling/)
- Learning Path: [Build an Android chat app with Llama, KleidiAI, ExecuTorch, and XNNPACK](https://learn.arm.com/learning-paths/mobile-graphics-and-gaming/build-llama3-chat-android-app-using-executorch-and-xnnpack/)
- Learning Path: [Vision LLM Inference on Android with KleidiAI](https://learn.arm.com/learning-paths/mobile-graphics-and-gaming/vision-llm-inference-on-android-with-kleidiai-and-mnn/)