---
title: Install Model Gym and explore neural graphics examples
weight: 2

### FIXED, DO NOT MODIFY
layout: learningpathall
---

## What is neural graphics?

Neural graphics is an intersection of graphics and machine learning. Rather than relying purely on traditional GPU pipelines, neural graphics integrates learned models directly into the rendering stack. These techniques are particularly powerful on mobile devices, where battery life and performance constraints limit traditional compute-heavy rendering approaches. Your goal is to deliver high visual fidelity without increasing GPU cost. You achieve this by training and deploying compact neural networks optimized for your device's hardware.

## How does Arm support neural graphics?

Arm enables neural graphics through the [**Neural Graphics Development Kit**](https://developer.arm.com/mobile-graphics-and-gaming/neural-graphics): a set of open-source tools that let you train, evaluate, and deploy ML models for graphics workloads.

At its core are the ML Extensions for Vulkan, which bring native ML inference into the GPU pipeline using structured compute graphs. These extensions (`VK_ARM_tensors` and `VK_ARM_data_graph`) allow real-time upscaling and similar effects to run efficiently alongside rendering tasks.

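To build intuition for what "upscaling" means here, the sketch below shows a fixed-function 2x nearest-neighbor upscale in plain Python. This is purely conceptual: real pipelines run such filters on the GPU, and a learned upscaler like NSS replaces the fixed duplication rule with pixels predicted by a trained network.

```python
# Conceptual illustration only: a fixed-function 2x nearest-neighbor upscale,
# the kind of simple filter that learned upscalers such as NSS aim to improve on.
# (Pure-Python sketch; real implementations run on the GPU.)

def upscale_2x_nearest(image):
    """Upscale a 2D grid of pixel values by 2x using nearest-neighbor."""
    out = []
    for row in image:
        wide = [p for p in row for _ in range(2)]  # duplicate each column
        out.append(wide)
        out.append(list(wide))                     # duplicate each row
    return out

low_res = [[1, 2],
           [3, 4]]
high_res = upscale_2x_nearest(low_res)
# high_res is a 4x4 grid: [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

A neural upscaler performs the same resolution change, but infers the new pixels from learned weights instead of copying neighbors, which is why it can recover detail a fixed filter cannot.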
You can develop neural graphics models using well-known ML frameworks like PyTorch, then export them for deployment with Arm's hardware-aware pipeline. The workflow converts your model to `.vgf` using the TOSA intermediate representation, making it possible to tailor model development for your game use case. In this Learning Path, you will focus on **Neural Super Sampling (NSS)** as the primary example for training, evaluating, and deploying neural models using the [**Neural Graphics Model Gym**](https://github.com/arm/neural-graphics-model-gym). To learn more about NSS, see the [resources on Hugging Face](https://huggingface.co/Arm/neural-super-sampling). Arm has also developed a set of Vulkan Samples to help you get started. The `.vgf` format is introduced in the `postprocessing_with_vgf` sample. For a broader overview of neural graphics developer resources, including the Vulkan Samples, see the introductory Learning Path [Get started with neural graphics using ML Extensions for Vulkan](/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/).

Starting in 2026, Arm GPUs will feature dedicated neural accelerators, optimized for low-latency inference in graphics workloads. To help you get started early, Arm provides the ML Emulation Layers for Vulkan that simulate future hardware behavior, so you can build and test models now.

## What is the Neural Graphics Model Gym?

The Neural Graphics Model Gym is an open-source toolkit for fine-tuning and exporting neural graphics models. It is designed to streamline the entire model lifecycle for graphics-focused use cases, like NSS.

With Model Gym, you can:

- Train and evaluate models using a PyTorch-based API
- Export models to `.vgf` using ExecuTorch for real-time use in game development
- Take advantage of quantization-aware training (QAT) and post-training quantization (PTQ) with ExecuTorch
- Use an optional Docker setup for reproducibility

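To see what PTQ does at its core, the sketch below implements affine int8 quantization in plain Python: floats are mapped to integers through a scale and zero-point, then dequantized with some rounding error. This is a conceptual illustration only, not the ExecuTorch or Model Gym API.

```python
# Conceptual sketch of affine int8 quantization, the mapping that PTQ
# applies to trained weights. Illustrative plain Python, not a real
# quantization library.

def quantize_int8(values):
    """Quantize floats to int8 via scale/zero-point, then dequantize back."""
    lo, hi = min(values), max(values)
    scale = (hi - lo) / 255.0 or 1.0          # guard against all-equal inputs
    zero_point = round(-128 - lo / scale)
    q = [max(-128, min(127, round(v / scale) + zero_point)) for v in values]
    dq = [(qi - zero_point) * scale for qi in q]
    return q, dq

weights = [-1.0, -0.5, 0.0, 0.5, 1.0]
q, dq = quantize_int8(weights)
# q holds int8 codes; dq approximates the original weights with small error.
```

QAT pushes this further by simulating the same rounding during training, so the model learns weights that survive quantization with less accuracy loss.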
You can choose to work with Python notebooks for rapid experimentation or use the command-line interface for automation. This Learning Path will walk you through the demonstrative notebooks and prepare you to start using the CLI for your own model development.

You're now ready to set up your environment and start working with neural graphics models. Keep going!