diff --git a/content/install-guides/ams.md b/content/install-guides/ams.md index 2c98b5fc73..f4cff56adc 100644 --- a/content/install-guides/ams.md +++ b/content/install-guides/ams.md @@ -37,96 +37,77 @@ test_maintenance: true test_images: - ubuntu:latest --- -[Arm Performance Studio](https://developer.arm.com/Tools%20and%20Software/Arm%20Performance%20Studio) is a performance analysis tool suite for Android and Linux application developers. +[Arm Performance Studio](https://developer.arm.com/Tools%20and%20Software/Arm%20Performance%20Studio) is a performance analysis tool suite for Android and Linux application developers. -It helps analyze how your game or app performs on production devices, so you can identify issues that affect performance, cause overheating, or drain battery life. - -The following table lists the tools and describes their functions: +It comprises a suite of easy-to-use tools that show you how well your game or app performs on production devices, so that you can identify problems that might cause slow performance, overheat the device, or drain the battery. | Component | Functionality | |----------|-------------| -| [Streamline](https://developer.arm.com/Tools%20and%20Software/Streamline%20Performance%20Analyzer) | Captures a performance profile with hardware counter activity from the device. | -| [Performance Advisor](https://developer.arm.com/Tools%20and%20Software/Performance%20Advisor) | Generates an easy-to-read performance summary from an annotated Streamline capture, and provides actionable optimization suggestions. | -| [Frame Advisor](https://developer.arm.com/Tools%20and%20Software/Frame%20Advisor) | Captures API calls and rendering details from a specific frame and provides detailed geometry metrics to help identify rendering bottlenecks. | -| [Mali Offline Compiler](https://developer.arm.com/Tools%20and%20Software/Mali%20Offline%20Compiler) | Analyzes how efficiently your shader programs perform on a range of Mali GPUs.
| -| [RenderDoc for Arm GPUs](https://developer.arm.com/Tools%20and%20Software/RenderDoc%20for%20Arm%20GPUs) | Debugs Vulkan graphics applications with support for Arm GPU extensions and Android features. | +| [Streamline](https://developer.arm.com/Tools%20and%20Software/Streamline%20Performance%20Analyzer) with [Performance Advisor](https://developer.arm.com/Tools%20and%20Software/Performance%20Advisor) | Capture a performance profile that shows all the performance counter activity from the device. Generate an easy-to-read performance summary from an annotated Streamline capture, and get actionable advice about where you should optimize. | +| [Frame Advisor](https://developer.arm.com/Tools%20and%20Software/Frame%20Advisor) | Capture the API calls and rendering from a problem frame and get comprehensive geometry metrics to discover what might be slowing down your application. | +| [Mali Offline Compiler](https://developer.arm.com/Tools%20and%20Software/Mali%20Offline%20Compiler) | Analyze how efficiently your shader programs perform on a range of Mali GPUs. | +| [RenderDoc for Arm GPUs](https://developer.arm.com/Tools%20and%20Software/RenderDoc%20for%20Arm%20GPUs) | The industry-standard tool for debugging Vulkan graphics applications, including early support for Arm GPU extensions and Android features. | -All features of Arm Performance Studio are available free of charge without a license. +All features of Arm Performance Studio are available free of charge without any additional license. -## How do I install Arm Performance Studio? +## Download Arm Performance Studio Arm Performance Studio is supported on Windows, Linux, and macOS hosts. Download the appropriate installer from [Arm Performance Studio Downloads](https://developer.arm.com/Tools%20and%20Software/Arm%20Performance%20Studio#Downloads). Full details about the supported OS and Android versions are given in the Arm Performance Studio [Release Notes](https://developer.arm.com/documentation/107649).
-### How do I install Arm Performance Studio on Windows? - -Run the downloaded `Arm_Performance_Studio__windows_x86-64.exe` installer, and follow the on-screen instructions. - -To open Streamline, Frame Advisor, or RenderDoc for Arm GPUs, go to the Windows **Start** menu and search for the name of the tool you want to open. - -Performance Advisor is a feature of the Streamline command-line application. To generate a performance report, you must first run the provided Python script to enable Streamline to collect frame data from the device. [Get started with Performance Advisor tutorial](https://developer.arm.com/documentation/102478/latest) describes this process in detail. After you have captured a profile with Streamline, run `Streamline-cli` on the Streamline capture file. This command is added to your `PATH` environment variable during installation, so it can be used from anywhere. +### Install Arm Performance Studio on Windows -```console -Streamline-cli.exe -pa my_capture.apc -``` - -To run Mali Offline Compiler, open a command terminal, navigate to your work directory, and run the `malioc` command on a shader program. The malioc command is added to your `PATH` environment variable during installation, so it can be used from anywhere. +Arm Performance Studio is provided as an installer executable. Double-click the `.exe` file and follow the instructions in the setup wizard. -```console -malioc.exe my_shader.frag -``` +Open the Performance Studio Hub from the Windows **Start** menu, or double-click the shortcut in the installation directory. You can read a description of the tools and launch them from the Hub. -### How do I install Arm Performance Studio on macOS? +### Install Arm Performance Studio on macOS Arm Performance Studio is provided as a `.dmg` package. To mount it, double-click the `.dmg` package and follow the instructions.
The Arm Performance Studio directory tree is copied to the Applications directory on your local file system for easy access. -You can remove write permission from the installation directory to prevent other users from writing to it. This is done with the `chmod` command. For example: +Arm recommends that you set the permissions for the installation directory to prevent other users from writing to it. This is typically achieved with the `chmod` command. For example: ``` chmod go-w ``` -Open Streamline, Frame Advisor or RenderDoc for Arm GPUs directly from the Arm Performance Studio directory in your Applications directory. For example, to open Streamline, go to the `/streamline` directory and open the `Streamline.app` file. +To get started, navigate to the Arm Performance Studio installation directory in your `Applications` directory. Open the `Performance Studio.app` file to launch the **Arm Performance Studio Hub**. You can read a description of the tools and launch them from the Hub. -To run Performance Advisor, go to the `/streamline` directory, and double-click the `Streamline-cli-launcher` file. Your computer will ask you to allow Streamline to control the Terminal application. Allow this. The Performance Advisor launcher opens the Terminal application and updates your `PATH` environment variable so you can run Performance Advisor from any directory. +### Install Arm Performance Studio on Linux -Performance Advisor is a feature of the Streamline command-line application. To generate a performance report, you must first run the provided Python script to enable Streamline to collect frame data from the device. This process is described in detail in the [Get started with Performance Advisor tutorial](https://developer.arm.com/documentation/102478/latest). After you have captured a profile with Streamline, run the `Streamline-cli` command on the Streamline capture file to generate a performance report: +Arm Performance Studio is provided as a gzipped tar archive.
Extract this tar archive to your preferred location, using a recent version (1.13 or later) of GNU tar: ``` -Streamline-cli -pa my_capture.apc +tar xvzf Arm_Performance_Studio_2025.1_linux.tgz ``` -To run Mali Offline Compiler, go to the `/mali_offline_compiler` directory, and double-click the `mali_offline_compiler_launcher` file. The Mali Offline Compiler launcher opens the Terminal application and updates your `PATH` environment variable so you can run the `malioc` command from any directory. To generate a shader analysis report, run the `malioc` command on a shader program: +Arm recommends that you set the permissions for the installation directory to prevent other users from writing to it. This is typically achieved with the `chmod` command. For example: ``` -malioc my_shader.frag +chmod go-w ``` -On some versions of macOS, you might see a message that Mali Offline Compiler is not recognized as an application from an identified developer. To enable Mali Offline Compiler, cancel this message, then open **System Preferences > Security & Privacy** and select **Allow Anyway** for the `malioc` application. - -### How do I install Arm Performance Studio on Linux? - -Arm Performance Studio is provided as a gzipped tar archive. Extract this tar archive to your preferred location, using version 1.13 or later of GNU tar: +{{% notice Tip %}} +You might find it useful to edit your `PATH` environment variable to add the paths to the `Streamline-cli` and `malioc` executables so that you can run them from any directory. Add the following commands to the .bashrc file in your home directory, so that they are set whenever you initialize a shell session: ``` -tar xvzf Arm_Performance_Studio__linux.tgz +PATH=$PATH://streamline +PATH=$PATH://mali_offline_compiler ``` -You can remove write permission from the installation directory to prevent other users from writing to it. This is done with the `chmod` command. 
For example: +{{% /notice %}} -``` -chmod go-w -``` +Inside the installation directory is a shortcut called **Performance Studio**. Double-click on the shortcut to launch the Performance Studio Hub. You can read a description of the tools and launch them from the Hub. -You might find it useful to edit your `PATH` environment variable to add the paths to the `Streamline-cli` and `malioc` executables so that you can run them from any directory. Add the following commands to the .bashrc file in your home directory, so that they are set whenever you initialize a shell session: +Alternatively, to open Streamline, Frame Advisor or RenderDoc for Arm GPUs directly, go to the installation directory, open the folder for the tool you want to open and run the application file. For example: ``` -PATH=$PATH:/streamline -PATH=$PATH:/mali_offline_compiler +cd /streamline +./Streamline ``` ## How do I get started with Arm Performance Studio? -Refer to [Get started with Arm Performance Studio](/learning-paths/mobile-graphics-and-gaming/ams/) for an overview of how to run each tool in Arm Performance Studio. +See the [Get started with Arm Performance Studio](/learning-paths/mobile-graphics-and-gaming/ams/) learning path for an overview of how to run each tool in Arm Performance Studio. 
diff --git a/content/learning-paths/laptops-and-desktops/chrome-os-lxc/_images/chromeos-shell.png b/content/learning-paths/laptops-and-desktops/chrome-os-lxc/_images/chromeos-shell.png new file mode 100644 index 0000000000..0611fafa46 Binary files /dev/null and b/content/learning-paths/laptops-and-desktops/chrome-os-lxc/_images/chromeos-shell.png differ diff --git a/content/learning-paths/laptops-and-desktops/chrome-os-lxc/_images/shared-folders.png b/content/learning-paths/laptops-and-desktops/chrome-os-lxc/_images/shared-folders.png new file mode 100644 index 0000000000..85bf90ce40 Binary files /dev/null and b/content/learning-paths/laptops-and-desktops/chrome-os-lxc/_images/shared-folders.png differ diff --git a/content/learning-paths/laptops-and-desktops/chrome-os-lxc/_index.md b/content/learning-paths/laptops-and-desktops/chrome-os-lxc/_index.md new file mode 100644 index 0000000000..1ce1b789e7 --- /dev/null +++ b/content/learning-paths/laptops-and-desktops/chrome-os-lxc/_index.md @@ -0,0 +1,48 @@ +--- +title: Install Ubuntu on ChromeOS Crostini as an LXC Container + +draft: true +cascade: + draft: true + +minutes_to_complete: 60 + +who_is_this_for: This Learning Path is for software developers who want to install Ubuntu and other Linux distributions on their Arm-based Chromebook with ChromeOS file sharing and GUI support. + +learning_objectives: + - Create an Ubuntu 24.04 container on ChromeOS Crostini using the Termina shell and LXC. + - Set up ChromeOS integration for file sharing and GUI applications. + - Manage LXC containers on ChromeOS. + +prerequisites: + - A ChromeOS device with the Linux development environment enabled. The Lenovo Chromebook Plus 14 is recommended. + - Basic knowledge of the Linux command line. 
+ +author: Jason Andrews + +### Tags +skilllevels: Introductory +subjects: Containers and Virtualization +armips: + - Cortex-A +operatingsystems: + - ChromeOS +tools_software_languages: + - Ubuntu + +further_reading: + - resource: + title: Official ChromeOS Linux Support + link: https://chromeos.dev/en/linux + type: documentation + - resource: + title: Linux Containers + link: https://linuxcontainers.org/ + type: website + +### FIXED, DO NOT MODIFY +# ================================================================================ +weight: 1 # _index.md always has weight of 1 to order correctly +layout: "learningpathall" # All files under learning paths have this same wrapper +learning_path_main_page: "yes" # This should be surfaced when looking for related content. Only set for _index.md of learning path content. +--- diff --git a/content/learning-paths/laptops-and-desktops/chrome-os-lxc/_next-steps.md b/content/learning-paths/laptops-and-desktops/chrome-os-lxc/_next-steps.md new file mode 100644 index 0000000000..825303482f --- /dev/null +++ b/content/learning-paths/laptops-and-desktops/chrome-os-lxc/_next-steps.md @@ -0,0 +1,6 @@ +--- +weight: 21 +title: "Next Steps" +layout: "learningpathall" +--- + diff --git a/content/learning-paths/laptops-and-desktops/chrome-os-lxc/chromeos-shell.png b/content/learning-paths/laptops-and-desktops/chrome-os-lxc/chromeos-shell.png new file mode 100644 index 0000000000..0611fafa46 Binary files /dev/null and b/content/learning-paths/laptops-and-desktops/chrome-os-lxc/chromeos-shell.png differ diff --git a/content/learning-paths/laptops-and-desktops/chrome-os-lxc/page-1.md b/content/learning-paths/laptops-and-desktops/chrome-os-lxc/page-1.md new file mode 100644 index 0000000000..432390867a --- /dev/null +++ b/content/learning-paths/laptops-and-desktops/chrome-os-lxc/page-1.md @@ -0,0 +1,132 @@ +--- +title: "Create an Ubuntu container" +weight: 2 +layout: "learningpathall" +--- + +The [Lenovo Chromebook Plus 
14](https://www.bestbuy.com/site/lenovo-chromebook-plus-14-oled-2k-touchscreen-laptop-mediatek-kompanio-ultra-16gb-memory-256gb-ufs-seashell/6630493.p?skuId=6630493&intl=nosplash) is powered by the Arm-based MediaTek Kompanio Ultra processor, offering software developers a powerful and energy-efficient platform for Linux development. Its compatibility with containerized environments and support for ChromeOS Linux (Crostini) make it an excellent choice for coding, testing, and running modern development workflows on the go. + +This Learning Path will walk you through setting up an Ubuntu 24.04 container on your Arm-based Chromebook using ChromeOS's built-in Linux development environment. You'll learn how to create and manage containers, install essential development tools, and integrate your Ubuntu environment with ChromeOS features like file sharing and GUI application support. By the end, you'll have a flexible and powerful Arm Linux development environment. + +## Access the ChromeOS terminal + +The first step to creating an Ubuntu container on ChromeOS is to open the ChromeOS shell. + +Open the Chrome browser and press **Ctrl + Alt + T** to open crosh, the ChromeOS shell. + +![ChromeOS shell #center](_images/chromeos-shell.png) + +Run the command below to start the Termina shell. + +```console +vsh termina +``` + +You are now in the Termina environment where you can manage containers. + +The `lxc` command is used to manage containers on ChromeOS. + +You can list the running containers.
+ +```console +lxc list +``` + +If you have the default Debian container running, you see output similar to: + +```output ++---------+---------+-----------------------+------+-----------+-----------+ +| NAME | STATE | IPV4 | IPV6 | TYPE | SNAPSHOTS | ++---------+---------+-----------------------+------+-----------+-----------+ +| penguin | RUNNING | 100.115.92.204 (eth0) | | CONTAINER | 0 | ++---------+---------+-----------------------+------+-----------+-----------+ +``` + +The name of the Debian container is `penguin`. When you enable the Linux subsystem on ChromeOS, the Debian container named `penguin` is created, but you can create additional containers with different Linux distributions and different names. + +## Create an Ubuntu 24.04 container + +This command creates and starts a new Ubuntu 24.04 container named `u1`. + +```bash +lxc launch ubuntu:24.04 u1 +``` + +The output is: + +```output +Creating u1 +Starting u1 +``` + +Check the status of the new container and confirm that the status is RUNNING. + +```bash +lxc list +``` + +Now there are two containers running. + +```output ++---------+---------+-----------------------+------+-----------+-----------+ +| NAME | STATE | IPV4 | IPV6 | TYPE | SNAPSHOTS | ++---------+---------+-----------------------+------+-----------+-----------+ +| penguin | RUNNING | 100.115.92.204 (eth0) | | CONTAINER | 0 | ++---------+---------+-----------------------+------+-----------+-----------+ +| u1 | RUNNING | 100.115.92.206 (eth0) | | CONTAINER | 0 | ++---------+---------+-----------------------+------+-----------+-----------+ +``` + +Create a new shell in the Ubuntu container. + +```bash +lxc exec u1 -- bash +``` + +## Complete the Ubuntu setup + +Once inside the Ubuntu container, you need to perform some initial setup tasks. + +Update the package lists and upgrade installed packages to the latest versions. + +```bash +apt update && apt upgrade -y +``` + +Install essential packages for development and system management.
You can select your favorite software packages; these are examples. + +```bash +apt install -y net-tools gcc +``` + +Creating a non-root user is a crucial security best practice and ensures that applications don't have unnecessary administrative privileges. The username `ubuntu` is already available in the container. You are free to use `ubuntu` as your non-root user or create a new user. + +{{% notice Note %}} +The following commands use `user1` as a new username. You can replace it with your actual desired username in all subsequent steps. +{{% /notice %}} + +Create a new user account. Skip this step if you want to use the `ubuntu` user. + +```bash +adduser user1 +``` + +Add the new user to the sudo group to grant administrative privileges. Skip this step if you want to use the `ubuntu` user. + +```bash +usermod -aG sudo user1 +``` + +Switch to your new user account to continue the setup. + +```bash +su - user1 +``` + +If you didn't create a new user, switch to `ubuntu` as the non-root user. + +```bash +su - ubuntu +``` + +Continue to learn how to integrate the new Ubuntu container with ChromeOS features like file sharing. \ No newline at end of file diff --git a/content/learning-paths/laptops-and-desktops/chrome-os-lxc/page-2.md b/content/learning-paths/laptops-and-desktops/chrome-os-lxc/page-2.md new file mode 100644 index 0000000000..5487c9a61f --- /dev/null +++ b/content/learning-paths/laptops-and-desktops/chrome-os-lxc/page-2.md @@ -0,0 +1,97 @@ +--- +title: ChromeOS integration +weight: 3 +layout: "learningpathall" +--- + +## File sharing between ChromeOS and Linux + +Chromebooks with Linux offer convenient file sharing capabilities between the main ChromeOS environment and the Linux subsystem. + +Key Features: + +- Selective Folder Sharing: ChromeOS allows you to share specific folders (not individual files) from the native Files app with the Linux container. This is done by right-clicking a folder and selecting "Share with Linux."
Once shared, these directories become accessible to Linux apps and the command line within the Linux environment. + +- Two-Way Access: Files and folders created within the Linux container appear in the "Linux files" section of the ChromeOS Files app, enabling seamless movement of data between environments. + +- Sandboxed Security: The Linux environment is sandboxed for security, meaning it doesn't have access to the full ChromeOS file system by default. Only the folders explicitly shared by you are visible in Linux, ensuring protected data separation. + +- Easy Integration: Shared folders can be navigated from Linux at paths such as `/mnt/chromeos/MyFiles/`. Applications and command-line tools within Linux can read and write to these shared folders. + +- Management Tools: You can manage and revoke shared folder access through the ChromeOS Files app, allowing for flexible control over what is accessible to Linux. + +These features make it simple to move files between ChromeOS and Linux applications while maintaining security and user control. + +## Configure File System Integration + +### Share ChromeOS directories + +To access your ChromeOS files from within the Ubuntu container, you need to configure shared directories. + +You can share directories using the ChromeOS Files application. Right-click on any directory and select **Share with Linux**. + +If you share a ChromeOS directory, it appears in `/mnt/chromeos/MyFiles/` in your Ubuntu container. For example, share your Downloads directory in ChromeOS and it is visible in Ubuntu. + +```bash +ls /mnt/chromeos/MyFiles/Downloads/ +``` + +### Share Google Drive directories + +You can also share Google Drive directories using the ChromeOS Files application. Use the same right-click menu and select **Share with Linux**. + +If you share a Google Drive folder, it appears in `/mnt/chromeos/GoogleDrive/MyDrive/` in your Ubuntu container. For example, share your `AndroidAssets` directory in Google Drive and it is visible in Ubuntu.
+ +```bash +ls /mnt/chromeos/GoogleDrive/MyDrive/AndroidAssets +``` + +Your shared folders appear in the **Linux Settings** under **Manage shared folders** as shown below: + +![Shared folders #center](_images/shared-folders.png) + +### Share a folder using the command line + +You can use the commands below from the Termina shell. + +Mount the entire ChromeOS file system to `/mnt/chromeos` in the container. + +```bash +lxc config device add u1 shared-chromeos disk source=/mnt/chromeos path=/mnt/chromeos +``` + +Share your ChromeOS Downloads folder with the container. + +```bash +lxc config device add u1 downloads disk source=/mnt/chromeos/MyFiles/Downloads path=/home/username/Downloads +``` + +Share your ChromeOS Documents folder with the container. + +```bash +lxc config device add u1 documents disk source=/mnt/chromeos/MyFiles/Documents path=/home/username/Documents +``` + +## File Operations + +You can use the `lxc file` command to copy files to and from a container from the Termina shell. + +As an example, create a file named `file1`. + +```bash +echo "test file 1" >> /mnt/shared/MyFiles/Downloads/file1 +``` + +Copy the file from your ChromeOS Downloads folder to the `/tmp` directory in the container. + +```bash +lxc file push /mnt/shared/MyFiles/Downloads/file1 u1/tmp/file1 +``` + +Copy the same file back to the Downloads directory with a new name. + +```bash +lxc file pull u1/tmp/file1 /mnt/shared/MyFiles/Downloads/file2 +``` + +You now have the file in your Downloads directory with a new name.
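Each successful `lxc config device add` records a disk device in the container's configuration, which you can inspect with `lxc config show`. The excerpt below is hypothetical; the device names and paths are taken from the commands above, and the exact output depends on your container name and LXD version:

```output
devices:
  documents:
    path: /home/username/Documents
    source: /mnt/chromeos/MyFiles/Documents
    type: disk
  downloads:
    path: /home/username/Downloads
    source: /mnt/chromeos/MyFiles/Downloads
    type: disk
```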
diff --git a/content/learning-paths/laptops-and-desktops/chrome-os-lxc/page-3.md b/content/learning-paths/laptops-and-desktops/chrome-os-lxc/page-3.md new file mode 100644 index 0000000000..7b1bce81a9 --- /dev/null +++ b/content/learning-paths/laptops-and-desktops/chrome-os-lxc/page-3.md @@ -0,0 +1,96 @@ +--- +title: Enable desktop applications +weight: 4 +layout: "learningpathall" +--- + +To use desktop applications like browsers in the Ubuntu container, you need to enable the connection to the ChromeOS desktop using Sommelier. + +Sommelier acts as a bridge, enabling seamless integration and smooth operation of Linux apps within the ChromeOS environment. + +## Enable GUI Application Support + +Install a minimal desktop environment to provide the necessary libraries for graphical applications. + +```bash +sudo apt install -y xubuntu-desktop-minimal +``` + +Install a test application. + +```bash +sudo apt install -y terminator +``` + +Configure the display environment variables to ensure GUI applications know where to render their windows. + +```console +echo 'export DISPLAY=:0' >> ~/.bashrc +``` + +Install the necessary tools to build Sommelier. + +```bash +sudo apt install -y clang meson libwayland-dev cmake pkg-config libgbm-dev libdrm-dev libxpm-dev libpixman-1-dev libx11-xcb-dev libxcb-composite0-dev libxkbcommon-dev libgtest-dev python3-jinja2 +``` + +You need to build Sommelier from source code because it is not available in the Ubuntu repositories. + +```bash +git clone https://chromium.googlesource.com/chromiumos/platform2 +cd platform2/vm_tools/sommelier +meson build +cd build +ninja +sudo ninja install +``` + +Sommelier is now installed in `/usr/local/bin/`. + +Create a systemd user unit file for X11 support. + +```bash +mkdir -p ~/.config/systemd/user +``` + +Use a text editor to create the file `~/.config/systemd/user/sommelier@.service` with the contents below.
+ +```ini +[Unit] +Description=Sommelier X11 bridge instance %i + +[Service] +Environment=DISPLAY=:0 +ExecStart=/usr/local/bin/sommelier -X --scale=1 --no-exit-with-child -- /bin/true +Restart=on-failure + +[Install] +WantedBy=default.target +``` + +Reload the systemd user manager and start the Sommelier service. + +```bash +systemctl --user daemon-reload +systemctl --user enable --now sommelier@0.service +``` + +Confirm the Sommelier service is running. + +```bash +systemctl --user status sommelier@0.service +``` + +Test that a graphical application works. You can pick other applications to try. + +```bash +terminator & +``` + +You should see a new terminal open on your ChromeOS desktop. + +If needed, you can restart Sommelier. Because it runs as a user service, use `systemctl --user` rather than `sudo`. + +```bash +systemctl --user restart sommelier@0.service +``` \ No newline at end of file diff --git a/content/learning-paths/laptops-and-desktops/chrome-os-lxc/page-4.md b/content/learning-paths/laptops-and-desktops/chrome-os-lxc/page-4.md new file mode 100644 index 0000000000..e3b6e07f2b --- /dev/null +++ b/content/learning-paths/laptops-and-desktops/chrome-os-lxc/page-4.md @@ -0,0 +1,183 @@ +--- +title: Learn more Linux Container commands +weight: 5 +layout: "learningpathall" +--- + +## Container Management + +Now that you have the basics, here are some useful commands for managing your container from the Termina shell. + +Start a stopped container. + +```bash +lxc start u1 +``` + +Stop a running container. + +```bash +lxc stop u1 +``` + +Enter the container's shell. + +```bash +lxc exec u1 -- bash +``` + +List all available containers and their status. + +```bash +lxc list +``` + +Delete a container. This is a permanent action. + +```bash +lxc delete u1 +``` + +Print additional container information.
+ +```bash +lxc info u1 +``` + +The output is similar to: + +```output +Name: u1 +Status: RUNNING +Type: container +Architecture: aarch64 +PID: 24141 +Created: 2025/08/07 04:46 EDT +Last Used: 2025/08/07 04:46 EDT + +Resources: + Processes: 120 + CPU usage: + CPU usage (in seconds): 384 + Memory usage: + Memory (current): 1.58GiB + Memory (peak): 4.86GiB + Network usage: + eth0: + Type: broadcast + State: UP + Host interface: veth7df9a2e6 + MAC address: 00:16:3e:18:59:08 + MTU: 1500 + Bytes received: 1.28GB + Bytes sent: 6.11MB + Packets received: 308930 + Packets sent: 83115 + IP addresses: + inet: 100.115.92.202/28 (global) + inet6: fe80::216:3eff:fe18:5908/64 (link) +``` + +Add Google's image server as a remote so that you can install the same Debian container images that ChromeOS uses. + +```bash +lxc remote add google https://storage.googleapis.com/cros-containers --protocol=simplestreams +``` + +List the configured remotes. + +```bash +lxc remote list +``` + +The output is similar to: + +```output ++-----------------+------------------------------------------------+---------------+-------------+--------+--------+--------+ +| NAME | URL | PROTOCOL | AUTH TYPE | PUBLIC | STATIC | GLOBAL | ++-----------------+------------------------------------------------+---------------+-------------+--------+--------+--------+ +| google | https://storage.googleapis.com/cros-containers | simplestreams | none | YES | NO | NO | ++-----------------+------------------------------------------------+---------------+-------------+--------+--------+--------+ +| images | https://images.linuxcontainers.org | simplestreams | none | YES | NO | NO | ++-----------------+------------------------------------------------+---------------+-------------+--------+--------+--------+ +| local (current) | unix:// | lxd | file access | NO | YES | NO | ++-----------------+------------------------------------------------+---------------+-------------+--------+--------+--------+ +| ubuntu | https://cloud-images.ubuntu.com/releases |
simplestreams | none | YES | YES | NO | ++-----------------+------------------------------------------------+---------------+-------------+--------+--------+--------+ +| ubuntu-daily | https://cloud-images.ubuntu.com/daily | simplestreams | none | YES | YES | NO | ++-----------------+------------------------------------------------+---------------+-------------+--------+--------+--------+ +``` + +Using the `images` remote, you can create a container with images from [Linux Containers](https://images.linuxcontainers.org/). + +For example, to start Alpine Linux 3.22, run: + +```bash +lxc launch images:alpine/3.22 a1 +``` + +## Configure container auto-start + +From the Termina shell, you can configure the container to start automatically when you start the Linux development environment. + +```bash +# Set the container to start automatically +lxc config set u1 boot.autostart true + +# Set the startup priority. A lower number means a higher priority. +lxc config set u1 boot.autostart.priority 1 +``` + +## Save and restore + +Once you have a container configured with your preferences, you can save it and use the backup to create new containers. + +Follow the steps below to save and restore a container from the Termina shell. + +### Create a Backup + +First, stop the running container to ensure a consistent state. + +```bash +lxc stop u1 +``` + +Save the container to a compressed tar file using the `export` command. + +```bash +lxc export u1 my-ubuntu.tar.gz +``` + +Save the backup file to your Google Drive or another easy-to-access location. + +### Create a new container from the backup + +Import the backup file to create a new container. + +```bash +lxc import my-ubuntu.tar.gz u2 +``` + +Now you have a fresh container named `u2` in the same state as when you saved the backup. + +## Performance Tips + +For a smoother experience, especially on devices with limited resources, you can monitor and manage your container's performance.
+ +### Limit Container Resources + +You can configure resource limits for your container from the Termina shell. This can prevent the container from consuming too many system resources. + +Limit the container to 4 CPU cores. + +```bash +lxc config set u1 limits.cpu 4 +``` + +You can confirm this with the Linux `lscpu` command; on an 8-core system, you will see that 4 cores have been moved to offline. + +Limit the container to 2GB of RAM. + +```bash +lxc config set u1 limits.memory 2GB +``` diff --git a/content/learning-paths/mobile-graphics-and-gaming/ams/ams.md index 2bfc4ed032..4936d5ca4e 100644 --- a/content/learning-paths/mobile-graphics-and-gaming/ams/ams.md +++ b/content/learning-paths/mobile-graphics-and-gaming/ams/ams.md @@ -7,16 +7,26 @@ weight: 2 # 1 is first, 2 is second, etc. # Do not modify these elements layout: "learningpathall" --- -[Arm Performance Studio](https://developer.arm.com/Tools%20and%20Software/Arm%20Mobile%20Studio) (formerly known as Arm Mobile Studio) is a performance analysis tool suite for developers to performance test their applications on devices with Mali-based GPUs. It consists of 5 easy-to-use tools that show you how well your application performs either on off-the-shelf Android devices, or Linux targets. The tools help you to identify problems that might slow down performance, overheat the device, or drain the battery. +[Arm Performance Studio](https://developer.arm.com/Tools%20and%20Software/Arm%20Performance%20Studio) is a performance analysis tool suite for developers to performance test their applications on devices with Mali-based GPUs. It consists of 4 easy-to-use tools that show you how well your application performs either on off-the-shelf Android devices, or Linux targets. The tools help you to identify problems that might slow down performance, overheat the device, or drain the battery.
 | Component | Functionality |
 |----------|-------------|
-| [Streamline](https://developer.arm.com/Tools%20and%20Software/Streamline%20Performance%20Analyzer) | Capture a performance profile that shows all the performance counter activity from the device. |
-| [Performance Advisor](https://developer.arm.com/Tools%20and%20Software/Performance%20Advisor) | Generate an easy-to-read performance summary from an annotated Streamline capture, and get actionable advice about where you should optimize. |
+| [Streamline](https://developer.arm.com/Tools%20and%20Software/Streamline%20Performance%20Analyzer) with [Performance Advisor](https://developer.arm.com/Tools%20and%20Software/Performance%20Advisor) | Capture a performance profile that shows all the performance counter activity from the device. Generate an easy-to-read performance summary from an annotated Streamline capture, and get actionable advice about where you should optimize. |
 | [Frame Advisor](https://developer.arm.com/Tools%20and%20Software/Frame%20Advisor) | Capture the API calls and rendering from a problem frame and get comprehensive geometry metrics to discover what might be slowing down your application. |
 | [Mali Offline Compiler](https://developer.arm.com/Tools%20and%20Software/Mali%20Offline%20Compiler) | Analyze how efficiently your shader programs perform on a range of Mali GPUs. |
 | [RenderDoc for Arm GPUs](https://developer.arm.com/Tools%20and%20Software/RenderDoc%20for%20Arm%20GPUs) | The industry-standard tool for debugging Vulkan graphics applications, including early support for Arm GPU extensions and Android features. |
-Arm Performance Studio is supported on Windows, Linux, and macOS hosts.
+## Download and install Arm Performance Studio
+
+Arm Performance Studio is supported on Windows, Linux, and macOS hosts. Get the installation package from [Arm Performance Studio Downloads](https://developer.arm.com/Tools%20and%20Software/Arm%20Performance%20Studio#Downloads).
Refer to the [Arm Performance Studio install guide](/install-guides/ams/) for installation instructions. + +## Launch the tools + +To open the tools, launch the Performance Studio Hub: + +- On Windows, search for Performance Studio. +- On macOS and Linux, open the Performance Studio application file from the install directory. + + ![Performance Studio Hub](images/ps_hub.png) diff --git a/content/learning-paths/mobile-graphics-and-gaming/ams/fa.md b/content/learning-paths/mobile-graphics-and-gaming/ams/fa.md index 417db5f7f5..77d77a8f6e 100644 --- a/content/learning-paths/mobile-graphics-and-gaming/ams/fa.md +++ b/content/learning-paths/mobile-graphics-and-gaming/ams/fa.md @@ -15,7 +15,13 @@ Build your application, and setup the Android device as described in [Setup task ## Connect to the device -1. Open Frame Advisor and select `New trace` to start a new trace. +1. Launch the Performance Studio Hub and open Frame Advisor. + - On Windows, search for Performance Studio. + - On macOS and Linux, open the Performance Studio application file from the install directory. + + ![Performance Studio Hub](images/ps_hub.png) + +1. Select `New trace` to start a new trace. 
![Frame Advisor launch screen](images/fa_launch_screen.png) diff --git a/content/learning-paths/mobile-graphics-and-gaming/ams/images/ps_hub.png b/content/learning-paths/mobile-graphics-and-gaming/ams/images/ps_hub.png new file mode 100644 index 0000000000..c7d2574801 Binary files /dev/null and b/content/learning-paths/mobile-graphics-and-gaming/ams/images/ps_hub.png differ diff --git a/content/learning-paths/mobile-graphics-and-gaming/ams/streamline.md b/content/learning-paths/mobile-graphics-and-gaming/ams/streamline.md index d11ebab333..790d97a0fe 100644 --- a/content/learning-paths/mobile-graphics-and-gaming/ams/streamline.md +++ b/content/learning-paths/mobile-graphics-and-gaming/ams/streamline.md @@ -11,6 +11,13 @@ Now that you have seen an [Arm Streamline example capture](/learning-paths/mobil ## Select the device and application in Streamline +1. Launch the Performance Studio Hub and open Streamline. + + - On Windows, search for Performance Studio. + - On macOS and Linux, open the Performance Studio application file from the install directory. + + ![Performance Studio Hub](images/ps_hub.png) + 1. In the Streamline `Start` view, select `Android (adb)` as your device type, then select your device from the list of detected devices. This installs the `gatord` daemon and connects to the device. 1. Wait for the list of available packages to populate, then select the one you wish to profile. diff --git a/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/1-install-plugin.md b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/1-install-plugin.md new file mode 100644 index 0000000000..e2b10102fb --- /dev/null +++ b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/1-install-plugin.md @@ -0,0 +1,45 @@ +--- +title: Introduction to neural graphics and Neural Super Sampling (NSS) +weight: 2 + +### FIXED, DO NOT MODIFY +layout: learningpathall +--- + +## What is the Neural Graphics Development Kit? 
+ +The Neural Graphics Development Kit empowers game developers to build immersive mobile gaming experiences using a neural accelerator for post-processing effects like upscaling. By combining Unreal Engine and the ML extensions for Vulkan, these tools allow you to integrate and evaluate AI-based upscaling technologies like Neural Super Sampling (NSS). This Learning Path walks you through the setup and execution of NSS for Unreal Engine. + +## What is Neural Super Sampling? + +NSS is an upscaling technology from Arm, purpose-built for real-time performance and power efficiency on mobile and embedded platforms. + +It uses a compact neural network to: +- Upscale low-resolution frames into high-resolution visuals +- Incorporate temporal data such as motion vectors, depth, and feedback +- Reduce bandwidth usage and GPU load + +Powered by the ML extensions for Vulkan, this new technology delivers smooth, crisp image quality, optimized for **mobile-class hardware** with a **Neural Accelerator** (NX). You’ll be able to render frames at a lower resolution and then upscale them using the technology, which helps you achieve higher frame rates without compromising the visual experience. This is especially useful on mobile, handheld, or thermally limited platforms, where battery life and thermal headroom are critical. It can also deliver improved image quality compared to other upsampling techniques, like spatio-temporal implementations. + +Under the hood, Neural Super Sampling for Unreal Engine (NSS for UE) runs its neural inference through Vulkan using **ML extensions for Vulkan**, which bring machine learning workloads into the graphics pipeline. The Development Kit includes **emulation layers** that simulate the behavior of the extensions on Vulkan compute capable GPUs. These layers allow you to test and iterate without requiring access to NX hardware. 
+
+## Neural Upscaling in Unreal Engine
+
+With these resources, you can seamlessly integrate NSS into any Unreal Engine project. The setup is designed to work with Vulkan as your rendering backend, and you don’t need to overhaul your workflow - just plug it in and start leveraging ML-powered upscaling right away. The technology is available as a source-code implementation that you will build with Visual Studio.
+
+## Download required artifacts
+
+Before you begin, download the required plugins and dependencies. These two repositories contain everything you need to set up NSS for Unreal Engine, including the VGF model file and the ML Emulation Layers for Vulkan.
+
+### 1. Download the NSS plugin
+
+[**Neural Super Sampling Unreal Engine Plugin** → GitHub Repository](https://github.com/arm/neural-graphics-for-unreal)
+
+Download the latest release package and extract it on your Windows machine.
+
+### 2. Download the runtime for ML Extensions for Vulkan
+
+[**Unreal NNE Runtime RDG for ML Extensions for Vulkan** → GitHub Repository](https://github.com/arm/ml-extensions-for-vulkan-unreal-plugin)
+
+Download and extract the release package on your Windows machine.
+
+Once you’ve extracted both repositories, proceed to the next section to set up your development environment and enable the NSS plugin.
\ No newline at end of file
diff --git a/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/2-emulation-layer.md b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/2-emulation-layer.md
new file mode 100644
index 0000000000..ff6ec801ab
--- /dev/null
+++ b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/2-emulation-layer.md
@@ -0,0 +1,81 @@
+---
+title: Setting up the emulation layers
+weight: 3
+
+### FIXED, DO NOT MODIFY
+layout: learningpathall
+---
+
+## Install dependencies
+
+To run NSS in your Unreal Engine project, install and configure the following:
+
+- **Vulkan SDK**: Required for development of applications that use Vulkan, and to enable the Vulkan Configurator. The latter sets up the emulation layers used for running ML extensions for Vulkan workloads.
+- **ML Emulation Layer for Vulkan**: These layers allow neural inference to run in emulation through Vulkan’s compute backend. They are included in the `NNERuntimeRDGMLExtensionsForVulkan` zip you downloaded in a previous step, and are activated through a Vulkan Configurator layer configuration so that they work with the Unreal Engine plugin.
+- **NSS for Unreal Engine plugins**: These include `NSS` (the inference and model interface) and `NNERuntimeRDGMLExtensionsForVulkan` (which connects Unreal’s Render Dependency Graph to the ML extensions for Vulkan).
+
+These components allow you to run NSS in Unreal Engine, using the ML Emulation Layers for Vulkan for development and testing.
+
+## Install Vulkan Software Development Kit
+
+Go to the [Vulkan SDK landing page](https://vulkan.lunarg.com/sdk/home) and download the SDK Installer for Windows. After you have run the installer, you can move on to the next step.
+
+## Configure Vulkan Layers
+
+Vulkan Configurator runs the emulation layers in the background whenever you want to use them with Unreal Engine.
+ +To emulate the ML extensions for Vulkan: +1. Launch the **Vulkan Configurator** (bundled with the Vulkan SDK) from the Windows **Start** menu. +2. In the **Apply a Vulkan Loader Configuration** list, right-click and choose **Create a new Configuration**. You can give the new configuration any name, for example `NSS`. +3. Navigate to the **Vulkan Layers Location** tab. +4. Append a user-defined path pointing to the emulation layers you downloaded in the previous section: + ``` + /NNERuntimeRDGMLExtensionsForVulkan/MLEmulationLayerForVulkan + ``` +![Add user-defined Vulkan layers path in Vulkan Configurator#center](./images/load_layers.png "Figure 1: Add Vulkan layer path.") + +5. Ensure the Graph layer is listed *above* the Tensor layer, and that you've set up the configuration scope as shown in the image. + +![Layer configuration showing Graph above Tensor#center](./images/verify_layers.png "Figure 2: Verify layer ordering and scope.") + + +{{% notice %}} +Keep the Vulkan Configurator running to enable the emulation layers during engine execution. +{{% /notice %}} + +## Enable NSS for Unreal Engine + +1. Open Unreal Engine and create a new **Third Person** template project using the **C++** option. + +![Unreal Engine project selection screen showing C++ Third Person template#center](./images/unreal_startup.png "Figure 3: Create a new C++ project in Unreal Engine.") + +2. Open the project in **Visual Studio**. Build it from source through **Build** > **Build Solution** or with `Ctrl+Shift+B`. + +After the build is finished, open your project in Unreal Engine. + +## Change Unreal’s Rendering Interface to Vulkan + +By default, Unreal uses DirectX. Instead, you need to choose Vulkan as the default RHI: +1. Go to: + ``` + Project Settings > Platform > Windows > Targeted RHIs > Default RHI + ``` +2. Select **Vulkan**. +3. Restart Unreal Engine to apply the change. 
+
+![Project Settings with Vulkan selected as Default RHI under Targeted RHIs#center](./images/targeted_rhis.png "Figure 4: Set Vulkan as the default RHI.")
+
+
+## Add and enable the plugins
+
+1. Open your project directory in Windows Explorer, and create a new folder called `Plugins`.
+2. Copy the extracted plugin folders into the new directory:
+   - `NNERuntimeRDGMLExtensionsForVulkan`
+   - `NSS`
+3. Re-open Unreal Engine. When prompted, confirm plugin integration.
+4. Rebuild your project in Visual Studio from source.
+5. Verify the installation by opening the Plugins view in Unreal Engine, and making sure the checkbox is selected for both `NSS` and `NNERuntimeRDGMLExtensionsForVulkan` as shown. Restart Unreal Engine if prompted.
+
+![Unreal Engine plugins window showing NSS and NNERuntimeRDGMLExtensionsForVulkan enabled#center](./images/verify_plugin_enabled.png "Figure 5: Verify plugin installation in Unreal Engine.")
+
+With the emulation layers and plugins configured, you're ready to run Neural Super Sampling in Unreal Engine. Continue to the next section to test the integration.
\ No newline at end of file
diff --git a/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/3-run-example.md b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/3-run-example.md
new file mode 100644
index 0000000000..b1c3304a64
--- /dev/null
+++ b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/3-run-example.md
@@ -0,0 +1,42 @@
+---
+title: Run the example
+weight: 4
+
+### FIXED, DO NOT MODIFY
+layout: learningpathall
+---
+
+## Start the level and verify NSS
+
+Press the green **Play** button to start the level. To verify NSS is running, you can run this command in Unreal:
+ ```
+ ShowFlag.VisualizeTemporalUpscaler 1
+ ```
+You’ll see **NSS** listed in the rendering summary.
+
+{{% notice %}}
+In **Project Settings > Plugins > Neural Super Sampling**, you can view and configure the active neural network model being used.
+{{% /notice %}} + +Run `ShowFlag.VisualizeTemporalUpscaler 0` to disable the overview. To visualize the NSS model output in real-time, run the following command: + ``` + r.NSS.Debug 2 + ``` + +This will add real-time views showing the model’s processed outputs, such as predicted filter coefficients and feedback, as below. In the [Wrapping up section](/learning-paths/mobile-graphics-and-gaming/nss-unreal/6-wrapping-up), you will find links to learn more about what the debug outputs mean. + +![Debug view of Neural Super Sampling model output in Unreal Engine#center](./images/nss_debug.png "Figure 6: Visualize NSS model debug output in real time.") + +## NSS model on Hugging Face + +The model that powers NSS is published on Hugging Face in the [VGF format](https://github.com/arm/ai-ml-sdk-vgf-library). This format is optimized for inference via ML extensions for Vulkan. + +Visit the [NSS model page on Hugging Face](https://huggingface.co/Arm/neural-super-sampling/) + +On this landing page, you can read more about the model, and learn how to run a test case - a _scenario_ - using the ML SDK for Vulkan. + +## Result + +You now have Neural Super Sampling integrated and running inside Unreal Engine. This setup provides a real-time testbed for neural upscaling. + +Proceed to the next section to debug your frames using RenderDoc, or move on to the final section to explore more resources on the technology behind NSS. diff --git a/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/5-renderdoc.md b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/5-renderdoc.md new file mode 100644 index 0000000000..0a099cc209 --- /dev/null +++ b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/5-renderdoc.md @@ -0,0 +1,70 @@ +--- +title: Using RenderDoc for Debugging and Analysis +weight: 6 + +### FIXED, DO NOT MODIFY +layout: learningpathall +--- + +## Why use RenderDoc with Neural Super Sampling? 
+
+As you integrate neural upscaling techniques into your game, visual debugging and performance profiling become essential. RenderDoc is a powerful frame capture and analysis tool that allows you to step through a frame, inspect Vulkan API calls, view shader inputs and outputs, and understand the state of resources. Arm has released additional features, which are available in [RenderDoc for Arm GPUs](https://developer.arm.com/Tools%20and%20Software/RenderDoc%20for%20Arm%20GPUs).
+
+You might want to use RenderDoc when:
+
+- You see unexpected visual output or want to step through the frame rendering process.
+- You need to analyze the sequence of Vulkan API calls made by the engine.
+- You’re inspecting memory usage or the state of specific GPU resources.
+- You want to validate your data graph pipeline execution or identify synchronization issues.
+
+## Install Arm Performance Studio
+
+To get RenderDoc for Arm GPUs, including the added features for the ML extensions for Vulkan, install Arm Performance Studio. Download it from [Arm Performance Studio Downloads](https://developer.arm.com/Tools%20and%20Software/Arm%20Performance%20Studio#Downloads). The minimum version to use is `2025.4`.
+
+Refer to the [Arm Performance Studio install guide](/install-guides/ams) to set it up.
+
+After installation, you can launch RenderDoc for Arm GPUs from the Windows **Start** menu.
+
+## Use in Unreal Engine
+
+### 1. Configure the executable path
+
+To enable integration with Unreal Engine:
+
+1. Open your Unreal Engine project.
+2. Go to **Edit > Project Settings > Plugins > RenderDoc**.
+3. Under **Path to RenderDoc executable**, enter the full path to the directory where the `qrenderdoc.exe` binary is located.
+4. Restart Unreal Engine for the setting to take effect.
+
+![RenderDoc plugin path setup in Unreal Engine#center](./images/renderdoc_plugin_ue.png "Figure 7: Set the RenderDoc executable path in Unreal Engine plugin settings.")
+
+### 2. Ways to capture
+
+#### Option 1: Attach to the running editor
+
+1. Launch RenderDoc for Arm GPUs separately.
+2. Go to **File > Attach to Running Instance**.
+3. A list of running Vulkan-enabled applications will appear. Select the hostname that corresponds to the UE Editor session (with UI) or use Standalone Running App (see image below).
+4. Click **Connect to App**.
+5. Click **Capture Frame Immediately** or adjust the capture settings as needed.
+
+#### Option 2: Use the plugin inside Unreal Engine
+
+1. Open your project and scene where you want to perform a capture.
+2. Click the **RenderDoc Capture** button in the Level Viewport (see image below).
+
+![RenderDoc capture button in Unreal Engine Level Viewport, or Attach to Running Instance #center](./images/renderdoc.png "Figure 8: Two options to capture frames using RenderDoc with Unreal Engine.")
+
+### 3. Capture a frame
+
+1. Return to Unreal Engine and **Play in Editor** to launch your game level.
+2. In RenderDoc for Arm GPUs, click **Capture Frame Now** (camera icon) or press `F12` while the UE window is focused.
+3. Once captured, double-click the frame in RenderDoc to open a detailed breakdown of the GPU workload.
+
+You can now:
+
+- Step through draw calls and dispatches.
+- Inspect bound resources, descriptor sets, and shaders.
+- Explore the execution of your data graph pipeline frame-by-frame.
+
+If you want to learn more about RenderDoc for Arm GPUs, you can check out the [Debug With RenderDoc User Guide](https://developer.arm.com/documentation/109669/latest).
diff --git a/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/6-wrapping-up.md b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/6-wrapping-up.md new file mode 100644 index 0000000000..4aec40d945 --- /dev/null +++ b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/6-wrapping-up.md @@ -0,0 +1,25 @@ +--- +title: Wrapping up +weight: 7 + +### FIXED, DO NOT MODIFY +layout: learningpathall +--- + +With the NSS for UE plugin, you’re set up to explore real-time neural graphics with Neural Super Sampling. This toolchain gives you direct access to state-of-the-art upscaling powered by machine learning. + +You’ve covered: +- Understanding the role of **ML Extensions for Vulkan** and how emulation layers let you run everything without needing dedicated ML hardware +- Installing the **Vulkan SDK** and enabling ML Emulation Layer for Vulkan using Vulkan Configurator +- Setting up the **NSS for Unreal Engine** plugins, and visualizing the model output +- Inspecting the **NSS model** in VGF on Hugging Face + +This ecosystem is built for developers who want to push boundaries - whether on flagship mobile SoCs or desktop dev kits. NSS is designed to give you better image quality without the complexity of building custom ML infrastructure. + +To learn more about the different aspects in this Learning Path, check out the following resources: +- [Neural Graphics Development Kit landing page](https://developer.arm.com/mobile-graphics-and-gaming/neural-graphics-for-mobile) +- [NSS Use Case Guide](https://developer.arm.com/documentation/111009/latest/) +- [Debugging NSS content with RenderDoc](https://developer.arm.com/documentation/109669/latest) +- [Learning Path: Get started with neural graphics using ML Extensions for Vulkan](/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample) + +Happy building - and welcome to the future of neural upscaling in Unreal! 
diff --git a/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/_index.md b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/_index.md new file mode 100644 index 0000000000..9669ac579e --- /dev/null +++ b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/_index.md @@ -0,0 +1,63 @@ +--- +title: Neural Super Sampling in Unreal Engine + +minutes_to_complete: 30 + +who_is_this_for: This is an introductory topic for developers experimenting with neural graphics using Unreal Engine® and ML Extensions for Vulkan®. + + +learning_objectives: + - Understand how Arm enables neural graphics for game development + - Configure ML extensions for Vulkan emulation + - Enable Neural Super Sampling (NSS) in Unreal Engine + - Run and visualize real-time upscaling with NSS + + +prerequisites: + - Windows 11 + - Unreal Engine 5.5 (Templates and Feature Pack enabled) + - Visual Studio 2022 (with Desktop Development with C++ and .NET desktop build tools) + + +author: Annie Tallund + +### Tags +skilllevels: Introductory +subjects: ML +armips: + - Mali +tools_software_languages: + - Unreal Engine + - Vulkan SDK + - Visual Studio +operatingsystems: + - Windows + + + +further_reading: + - resource: + title: Neural Graphics Development Kit + link: https://developer.arm.com/mobile-graphics-and-gaming/neural-graphics-for-mobile + type: website + - resource: + title: NSS Use Case Guide + link: https://developer.arm.com/documentation/111009/latest/ + type: documentation + - resource: + title: RenderDoc for Arm GPUs + link: https://developer.arm.com/Tools%20and%20Software/RenderDoc%20for%20Arm%20GPUs + type: documentation + - resource: + title: How Arm Neural Super Sampling works + link: https://community.arm.com/arm-community-blogs/b/mobile-graphics-and-gaming-blog/posts/how-arm-neural-super-sampling-works + type: blog + + + +### FIXED, DO NOT MODIFY +# ================================================================================ +weight: 1 # _index.md always 
has weight of 1 to order correctly +layout: "learningpathall" # All files under learning paths have this same wrapper +learning_path_main_page: "yes" # This should be surfaced when looking for related content. Only set for _index.md of learning path content. +--- diff --git a/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/_next-steps.md b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/_next-steps.md new file mode 100644 index 0000000000..c3db0de5a2 --- /dev/null +++ b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/_next-steps.md @@ -0,0 +1,8 @@ +--- +# ================================================================================ +# FIXED, DO NOT MODIFY THIS FILE +# ================================================================================ +weight: 21 # Set to always be larger than the content in this path to be at the end of the navigation. +title: "Next Steps" # Always the same, html page title. +layout: "learningpathall" # All files under learning paths have this same wrapper for Hugo processing. 
+--- diff --git a/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/add_plugin_folder.png b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/add_plugin_folder.png new file mode 100644 index 0000000000..3bacc9afc5 Binary files /dev/null and b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/add_plugin_folder.png differ diff --git a/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/confirm_layers.png b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/confirm_layers.png new file mode 100644 index 0000000000..86ed7de624 Binary files /dev/null and b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/confirm_layers.png differ diff --git a/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/load_layers.png b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/load_layers.png new file mode 100644 index 0000000000..84c51856a9 Binary files /dev/null and b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/load_layers.png differ diff --git a/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/nss_debug.png b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/nss_debug.png new file mode 100644 index 0000000000..135a3ed33e Binary files /dev/null and b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/nss_debug.png differ diff --git a/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/renderdoc.png b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/renderdoc.png new file mode 100644 index 0000000000..8ddf9a322d Binary files /dev/null and b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/renderdoc.png differ diff --git a/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/renderdoc_plugin_ue.png b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/renderdoc_plugin_ue.png new file mode 
100644 index 0000000000..ec407c7646 Binary files /dev/null and b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/renderdoc_plugin_ue.png differ diff --git a/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/targeted_rhis.png b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/targeted_rhis.png new file mode 100644 index 0000000000..16d662ed96 Binary files /dev/null and b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/targeted_rhis.png differ diff --git a/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/unreal_startup.png b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/unreal_startup.png new file mode 100644 index 0000000000..6d2acfdce9 Binary files /dev/null and b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/unreal_startup.png differ diff --git a/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/verify_layers.png b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/verify_layers.png new file mode 100644 index 0000000000..51e7a45b62 Binary files /dev/null and b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/verify_layers.png differ diff --git a/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/verify_plugin_enabled.png b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/verify_plugin_enabled.png new file mode 100644 index 0000000000..96d8e1e267 Binary files /dev/null and b/content/learning-paths/mobile-graphics-and-gaming/nss-unreal/images/verify_plugin_enabled.png differ diff --git a/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/1-introduction.md b/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/1-introduction.md new file mode 100644 index 0000000000..133a0df124 --- /dev/null +++ b/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/1-introduction.md @@ -0,0 +1,41 @@ +--- +title: 
Run neural graphics workloads with ML Extensions for Vulkan +weight: 2 + +### FIXED, DO NOT MODIFY +layout: learningpathall +--- + +## What is neural graphics, and why does it matter for real-time rendering? + +Neural graphics combines real-time rendering with the power of machine learning to enhance visual quality and performance. By integrating ML techniques like neural upscaling directly into the GPU pipeline, developers can achieve next-gen fidelity and efficiency. This is especially valuable on mobile and embedded devices, where power efficiency is critical. + +## How do ML Extensions for Vulkan support neural graphics workloads? + +Vulkan's data graph pipelines, introduced through the `VK_ARM_data_graph` and `VK_ARM_tensors` extensions, bring structured compute graph execution to the Vulkan API by introducing support for processing tensors. These pipelines are designed to execute ML inference workloads efficiently using SPIR-V-defined graphs. + +To help developers adopt these features, the Tensor and Data Graph Vulkan Samples offer hands-on demonstrations. + +These samples address key challenges in ML integration, such as: + +- **Understanding graph-based compute with Vulkan**: See how compute workloads can be structured using explicit graph topologies. +- **Demystifying ML inference in real-time rendering**: Learn how ML fits into the graphics pipeline. + +Samples range from basic setups to more advanced features like tensor aliasing and compute shader integration. + +This Learning Path walks you through setting up and running the first sample, `simple_tensor_and_data_graph`. + + +### Why use ML Extensions for Vulkan for game and graphics development? + +As a game developer, you've probably noticed the rising demand for smarter, more immersive graphics — but also the increasing strain on GPU resources, especially on mobile. 
Vulkan's traditional pipelines give you fine-grained control, but finding the right tooling to integrate machine learning has been a challenge. That’s where the new ML extensions for Vulkan come in. + +Arm’s `VK_ARM_tensors` and `VK_ARM_data_graph` extensions give you native Vulkan support for executing neural networks on the GPU — using structured tensors and data graph pipelines. Instead of chaining compute shaders to simulate ML models, you can now express them as dataflow graphs in SPIR-V and run them more efficiently. This opens the door to using AI techniques right alongside the graphics pipeline. + +And while ML has found success in image classification and LLMs, these extensions are designed from the ground up for gaming and graphics workloads — prioritizing predictable execution, GPU compatibility, and memory efficiency. With built-in support for tensor formats and pipeline sessions, the extensions are optimized for developers looking to blend traditional rendering with machine learning on Vulkan. + +Arm provides emulation layers for development on any modern Vulkan-capable hardware, and PyTorch support is available for model conversion workflows. + +For an example of real-time upscaling, see the Learning Path [**Neural Super Sampling with Unreal Engine**](/learning-paths/mobile-graphics-and-gaming/nss-unreal/). + +With the Vulkan Samples, you can experiment directly with these ideas. Move on to the next section to set up your machine for running the samples. 
diff --git a/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/2-ml-ext-for-vulkan.md b/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/2-ml-ext-for-vulkan.md
new file mode 100644
index 0000000000..ac2f4cd193
--- /dev/null
+++ b/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/2-ml-ext-for-vulkan.md
@@ -0,0 +1,71 @@
+---
+title: Setting up the ML Emulation Layers for Vulkan
+weight: 3
+
+### FIXED, DO NOT MODIFY
+layout: learningpathall
+---
+
+## Overview
+
+To run the Vulkan Samples, you first need to set up your development environment.
+
+This setup involves two main steps:
+
+* Install the required tools on your development machine
+* Download the ML emulation layers for Vulkan, which simulate the `VK_ARM_data_graph` and `VK_ARM_tensors` extensions
+
+## Install required tools for development
+
+Before building and running the samples, ensure the following tools are installed on your development machine:
+
+- CMake (version 3.12 or later)
+- Python 3
+- Git
+
+To verify your installation, run the following commands:
+
+```bash
+cmake --version
+python3 --version
+git --version
+```
+
+Each command should print the installed version of the tool.
+
+### Install Vulkan Software Development Kit
+
+Go to the [Getting Started with the Windows Vulkan SDK](https://vulkan.lunarg.com/sdk/home) page and download the SDK Installer for Windows. This installs **Vulkan Configurator**, which is used to run the emulation layers.
+
+{{% notice Note %}}
+You must use Vulkan SDK version 1.4.321 or later.
+{{% /notice %}}
+
+## Download the emulation layers
+
+For this Learning Path, a pre-built package of the emulation layers is available:
+
+[ML Emulation Layer for Vulkan](https://www.arm.com/-/media/Files/developer/MLEmulationLayerForVulkan)
+
+Extract the downloaded file in a location of your choice. You’re now ready to enable the emulation layers in Vulkan Configurator.
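+The checks above can be collected into one sketch you run from a shell. This is illustrative only: it assumes `vulkaninfo` (installed with the Vulkan SDK) is on your `PATH`, and that the extracted layer folder is named `MLEmulationLayerForVulkan`; adjust `LAYER_DIR` to your actual location.
+
+```bash
+#!/usr/bin/env bash
+# Sketch: verify the Vulkan SDK tools are available and the extracted
+# emulation layer directory exists. LAYER_DIR is an assumed path - change it
+# to wherever you extracted the download.
+
+LAYER_DIR="${LAYER_DIR:-./MLEmulationLayerForVulkan}"
+
+check_setup() {
+  if command -v vulkaninfo >/dev/null 2>&1; then
+    echo "Vulkan SDK tools found"
+  else
+    echo "vulkaninfo not found - check your Vulkan SDK installation"
+  fi
+  if [ -d "$LAYER_DIR" ]; then
+    echo "Emulation layers found at $LAYER_DIR"
+  else
+    echo "Emulation layers not found - check the extracted path"
+  fi
+}
+
+check_setup
+```
+
+If either check fails, revisit the corresponding step before moving on to Vulkan Configurator.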
+ +## Enable the emulation layers in Vulkan Configurator + +Next, enable the emulation layers using the Vulkan Configurator to simulate the `VK_ARM_data_graph` and `VK_ARM_tensors` extensions. Open **Vulkan Configurator**. + +Under the **Vulkan Layers Location** tab, add the path to your `MLEmulationLayerForVulkan` folder. + +In the **Apply a Vulkan Loader Configuration** list, right-click and choose **Create a new Configuration**. You can give the new configuration any name, for example `tensor_and_data_graph`. + +![Screenshot of the Vulkan Configurator showing the Vulkan Layers Location tab, where the emulation layer path (MLEmulationLayerForVulkan) is added to enable VK_ARM_data_graph and VK_ARM_tensors alt-text#center](./images/load_layers.png "Add emulation layers in Vulkan Configurator") + +Ensure that the **Graph** layer is listed above the **Tensor** layer. + +![Screenshot showing the Graph layer listed above the Tensor layer in the Vulkan Configurator. alt-text#center](./images/verify_layers.png "Reorder layers in Vulkan Configurator") + +{{% notice Important %}} +Keep Vulkan Configurator running while you run the Vulkan samples. +{{% /notice %}} + +With the emulation layers configured, you're ready to build the Vulkan Samples. Continue to the next section to get started. + diff --git a/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/3-first-sample.md b/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/3-first-sample.md new file mode 100644 index 0000000000..697430837c --- /dev/null +++ b/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/3-first-sample.md @@ -0,0 +1,68 @@ +--- +title: Simple Tensor and Data Graph +weight: 4 + +### FIXED, DO NOT MODIFY +layout: learningpathall +--- + +## Understand how the Simple Tensor and Data Graph sample works + +The **Simple Tensor and Data Graph** sample is your starting point for working with the ML extensions for Vulkan. 
It demonstrates how to execute a simple neural network using a data graph pipeline — specifically, a 2D average pooling operation. + +## Clone the Vulkan Samples + +With the environment set up, you can now clone the sample code. These examples are maintained in a fork of the Khronos Group's repository. + +```bash +git clone --recurse-submodules https://github.com/ARM-software/Vulkan-Samples --branch tensor_and_data_graph +cd Vulkan-Samples +``` + +This repository includes the framework and samples showcasing the ML extensions for Vulkan. + +## Build the Vulkan Samples + +You're now ready to compile the project. From the root of the repository: + +{{% notice Note %}} +Make sure [Developer Mode](https://learn.microsoft.com/en-us/windows/apps/get-started/enable-your-device-for-development#activate-developer-mode) is enabled before running these commands, to avoid permission issues. +{{% /notice %}} + +Generate Visual Studio project files using CMake: + +```bash +cmake -G "Visual Studio 17 2022" -A x64 -S . -Bbuild/windows +``` + +Then compile the `vulkan_samples` target in Release mode: + +```bash +cmake --build build/windows --config Release --target vulkan_samples +``` + +## Run the Simple Tensor and Data Graph sample + +Run the built executable using the following command: + +```bash +build\windows\app\bin\Release\AMD64\vulkan_samples.exe sample simple_tensor_and_data_graph +``` + +This should open a new window visualizing the operation. In this sample, a minimal Vulkan application sets up a data graph pipeline configured to process a small neural network. + +The sample creates input and output tensors, binds them using descriptor sets and pipeline layouts, and supplies a SPIR-V module that defines the network operation. Finally, it records and dispatches commands to execute the pipeline — and visualizes the results in real time.
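To make it concrete, this is what a 2D average pooling operation computes, sketched in plain Python. This is a conceptual reference only: the sample executes the operation on the GPU through the data graph pipeline, and the 2x2 pool with stride 2 used here is an illustrative assumption, not necessarily the sample's exact configuration.

```python
def avg_pool_2d(image, pool=2, stride=2):
    """Average each pool x pool window of a 2D list of floats."""
    rows = (len(image) - pool) // stride + 1
    cols = (len(image[0]) - pool) // stride + 1
    out = []
    for r in range(rows):
        row = []
        for c in range(cols):
            window = [image[r * stride + i][c * stride + j]
                      for i in range(pool) for j in range(pool)]
            row.append(sum(window) / len(window))
        out.append(row)
    return out

image = [
    [1.0,  2.0,  3.0,  4.0],
    [5.0,  6.0,  7.0,  8.0],
    [9.0,  10.0, 11.0, 12.0],
    [13.0, 14.0, 15.0, 16.0],
]
print(avg_pool_2d(image))  # → [[3.5, 5.5], [11.5, 13.5]]
```

Each output element is simply the mean of a window of input elements, which makes pooling a good minimal operation for exercising a data graph pipeline end to end.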
For more detail about what happens under the hood, see the [documentation](https://arm-software.github.io/Vulkan-Samples/samples/extensions/tensor_and_data_graph/simple_tensor_and_data_graph/README.html). + +## Summary and next steps + +By running this sample, you’ve stepped through a complete Vulkan data graph pipeline powered by the ML extensions for Vulkan. You’ve created tensors, set up descriptors, built a SPIR-V-encoded ML graph, and dispatched inference — all without needing custom shaders. This sets the foundation for neural graphics. As you explore the remaining samples, you’ll see how this core pattern extends into real-world graphics scenarios. + +As a next step, explore the remaining samples for the data graph pipeline. Each sample’s documentation is in its own directory under `samples/extensions/tensor_and_data_graph/` in the repository. + +## Overview of additional samples + +| Sample Name | Description | Focus Area | +|-------------------------------------|----------------------------------------------------------------------------------------------|------------------------------------------| +| **Graph Constants** | Shows how to include constants such as weights and biases in the data graph pipeline using tensors | Constant tensor injection | +| **Compute Shaders with Tensors** | Demonstrates how to feed tensor data into or out of data graph pipelines using compute shaders | Shader interoperability | +| **Tensor Image Aliasing** | Demonstrates tensor aliasing with Vulkan images to enable zero-copy workflows | Memory-efficient data sharing | +| **Postprocessing with VGF** | Explores the VGF format, which contains the SPIR-V, input, output, and constant data used to run a data graph pipeline | Neural network model | + +Next, you'll review additional tools to help you work with ML extensions for Vulkan in your own development environment.
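Before exploring samples like **Tensor Image Aliasing**, it helps to recall how tensor dimensions map to memory, since tensors in the ML extensions are described by dimensions and strides. The helper below is an illustrative sketch of computing packed row-major strides in bytes; the function and names are invented for this example and are not part of the Vulkan API.

```python
def packed_strides(dimensions, element_size):
    """Compute packed row-major strides in bytes for the given dimensions."""
    strides = [0] * len(dimensions)
    acc = element_size
    # Walk from the innermost dimension outward, accumulating the byte span
    for i in reversed(range(len(dimensions))):
        strides[i] = acc
        acc *= dimensions[i]
    return strides

# Example: an NHWC float32 tensor of shape 1x4x4x3
print(packed_strides([1, 4, 4, 3], 4))  # → [192, 48, 12, 4]
```

When a tensor aliases an image for zero-copy sharing, the strides must match the image's actual memory layout, which may include padding and therefore differ from these packed values.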
\ No newline at end of file diff --git a/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/4-scenario-runner.md b/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/4-scenario-runner.md new file mode 100644 index 0000000000..f147f290cf --- /dev/null +++ b/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/4-scenario-runner.md @@ -0,0 +1,35 @@ +--- +title: Running a test with the Scenario Runner +weight: 5 + +### FIXED, DO NOT MODIFY +layout: learningpathall +--- + +## Overview + +In this section, you’ll explore how to run a complete inference test using the **Scenario Runner** from Arm’s ML SDK for Vulkan. You’ll also learn what’s provided on Arm’s Hugging Face page, including downloadable binaries and assets that demonstrate the ML extensions for Vulkan in action. + +## About the ML SDK for Vulkan + +The SDK provides a collection of tools and runtime components that help you integrate neural networks into Vulkan-based applications. While the ML extensions for Vulkan (`VK_ARM_data_graph` and `VK_ARM_tensors`) define the runtime interface, the SDK provides a practical workflow for converting, packaging, and deploying ML models in real-time applications such as games. + +### SDK Component Summary + +| Component | Description | Usage Context | GitHub link +|------------------|------------------------------------------------------------------------------------------------------|-------------------------------------|--------------| +| **Model Converter** | Converts TOSA IR into SPIR-V graphs and packages them into `.vgf` files for runtime execution. | Used in asset pipelines for model deployment | https://github.com/arm/ai-ml-sdk-model-converter | +| **VGF Library** | Lightweight runtime decoder for `.vgf` files containing graphs, constants, and shaders. 
| Integrate into a game engine to load and use graphs | https://github.com/arm/ai-ml-sdk-vgf-library | +| **Scenario Runner** | Executes ML workloads declaratively using JSON-based scenario descriptions. | Ideal for rapid prototyping and validation | https://github.com/arm/ai-ml-sdk-scenario-runner | +| **Emulation Layer** | Vulkan layer that emulates the data graph and tensor extensions using compute shaders. | For testing on devices without native support for the ML extensions for Vulkan | https://github.com/arm/ai-ml-emulation-layer-for-vulkan | + +## About the Hugging Face release + +Visit the [NSS model page on Hugging Face](https://huggingface.co/Arm/neural-super-sampling). + +The landing page contains a minimal example - a _scenario_ - that runs NSS on an actual frame. It includes a Windows-compatible Scenario Runner binary, the VGF model, and a single frame of input and expected output data. This allows you to run an end-to-end flow, and the landing page provides resources to explore the VGF model in more detail. + +## Next steps + +In the following section, you’ll explore how to debug and inspect the workloads in this Learning Path using RenderDoc. diff --git a/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/5-renderdoc.md b/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/5-renderdoc.md new file mode 100644 index 0000000000..569e22e741 --- /dev/null +++ b/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/5-renderdoc.md @@ -0,0 +1,66 @@ +--- +title: Use RenderDoc to debug and analyze workloads +weight: 6 + +### FIXED, DO NOT MODIFY +layout: learningpathall +--- + +## Debug and profile workloads with RenderDoc + +Integrating machine learning into real-time rendering makes frame-level inspection and performance analysis critical. RenderDoc helps you visualize and debug these workloads by letting you step through frames, examine tensors, and inspect Vulkan API calls.
+ +RenderDoc is a powerful GPU frame capture tool that lets you: + +- Step through a frame’s rendering process +- Inspect Vulkan API calls +- View shader inputs and outputs +- Examine GPU resource states and memory usage + +## When to use RenderDoc with the samples + +RenderDoc can help in scenarios such as: + +- Diagnosing unexpected visual output by stepping through draw calls +- Analyzing the order and behavior of Vulkan API calls +- Investigating memory consumption or GPU resource state +- Validating execution of data graph pipelines or identifying sync issues + +## Install Arm Performance Studio (includes RenderDoc) + +To use RenderDoc with the ML extensions, install the Arm-customized version via Performance Studio: + +1. **Download Arm Performance Studio** from the [Arm Developer website](https://developer.arm.com/Tools%20and%20Software/Arm%20Performance%20Studio#Downloads). The minimum version to use is `2025.4`. +2. Run the installer: + `Arm_Performance_Studio__windows_x86-64.exe` +3. Follow the installation instructions. + +Once installed, launch RenderDoc for Arm GPUs from the Windows Start menu. + +## Capture Vulkan frames with RenderDoc + +You can capture and inspect Vulkan Samples that use the ML extensions for Vulkan, including API calls such as `vkCreateTensorARM` and structures like `VK_STRUCTURE_TYPE_TENSOR_DESCRIPTION_ARM`. + +RenderDoc is especially useful for visualizing tensor operations, inspecting resource bindings, and verifying correct data graph pipeline execution. + +## Capture with RenderDoc + +1. **Open RenderDoc**, and in the main window, go to the **Launch Application** section. +2. Configure the following fields: + - **Executable Path**: Path to the built executable `vulkan_samples.exe`. + - **Working Directory**: Path to the root of the Vulkan Samples project. + - **Command-line Arguments**: + ``` + sample simple_tensor_and_data_graph + ``` + You can substitute `simple_tensor_and_data_graph` with any of the other sample names as needed.
+3. Click **Launch**. The selected sample starts running. +4. Once the application window is active, press **F12** to capture a frame. +5. After the frame is captured, it appears in RenderDoc’s capture list. Double-click it to explore the captured frame and inspect calls to the ML extensions for Vulkan in detail. + +## Learn more + +This workflow enables close inspection of how ML graphs are built and executed within Vulkan — an essential capability when optimizing pipelines or debugging integration issues. To learn more about RenderDoc for Arm GPUs, see the [Debug With RenderDoc User Guide](https://developer.arm.com/documentation/109669/latest). + +Move on to the next section for further resources on what’s new and what’s coming. \ No newline at end of file diff --git a/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/6-wrapping-up.md b/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/6-wrapping-up.md new file mode 100644 index 0000000000..15b64371cb --- /dev/null +++ b/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/6-wrapping-up.md @@ -0,0 +1,29 @@ +--- +title: Wrapping up +weight: 7 + +### FIXED, DO NOT MODIFY +layout: learningpathall +--- + +## What you’ve learned and what’s next + +With these tools and samples, you're ready to explore and experiment with neural graphics on Vulkan. Whether you're integrating neural super sampling into a full-scale game or learning to build ML pipelines from scratch, the Neural Graphics Development Kit provides practical, extensible building blocks for real-time workloads.
+ + +In this Learning Path, you’ve: + +- Explored the **Vulkan Samples** to understand data graph pipeline structure +- Reviewed ML integration workflows using the **ML SDK for Vulkan** and the **VGF** format +- Debugged and analyzed the workloads using **RenderDoc** + +These components are designed to accelerate your development, provide insight into neural upscaling pipelines, and support experimentation with cutting-edge GPU features. + +Explore more: + +- [Neural Graphics Development Kit landing page](https://developer.arm.com/mobile-graphics-and-gaming/neural-graphics-for-mobile) - overview of Arm's tools for neural graphics +- [Vulkan Samples](https://github.com/ARM-software/Vulkan-Samples) - demos for the ML Extensions for Vulkan +- [Building for Tomorrow: Try Arm Neural Super-Sampling Today with ML Extensions for Vulkan and Unreal](https://community.arm.com/arm-community-blogs/b/mobile-graphics-and-gaming-blog/posts/how-to-access-arm-neural-super-sampling) - getting started with the NSS use-case +- [How Arm Neural Super Sampling Works](https://community.arm.com/arm-community-blogs/b/mobile-graphics-and-gaming-blog/posts/how-arm-neural-super-sampling-works) - deep dive into the NSS use-case + +Happy coding, and welcome to the future of real-time neural graphics! diff --git a/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/_index.md b/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/_index.md new file mode 100644 index 0000000000..35ada0f44f --- /dev/null +++ b/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/_index.md @@ -0,0 +1,65 @@ +--- +title: Get started with neural graphics using ML Extensions for Vulkan® + +minutes_to_complete: 30 + +who_is_this_for: This is an advanced topic for engine developers interested in learning about neural graphics using ML Extensions for Vulkan. 
+ +learning_objectives: + - Explain the purpose of neural graphics and the role of ML Extensions for Vulkan + - Set up the ML Emulation Layers for Vulkan to enable the extensions + - Run a sample Vulkan application that uses the extensions + - Debug the flow using RenderDoc + +prerequisites: + - Windows 11 development machine + - Visual Studio 2022 + - Visual Studio workload - Desktop development with C++ + - Visual Studio workload - .NET desktop build tools + + + +author: Annie Tallund + +### Tags +skilllevels: Advanced +subjects: ML +armips: + - Mali +tools_software_languages: + - Vulkan + - RenderDoc +operatingsystems: + - Windows + + +further_reading: + - resource: + title: Neural Graphics Development Kit + link: https://developer.arm.com/mobile-graphics-and-gaming/neural-graphics-for-mobile + type: website + - resource: + title: ML SDK for Vulkan + link: https://github.com/arm/ai-ml-sdk-for-vulkan + type: website + - resource: + title: Vulkan Samples + link: https://github.com/ARM-software/Vulkan-Samples + type: website + - resource: + title: RenderDoc for Arm GPUs + link: https://developer.arm.com/Tools%20and%20Software/RenderDoc%20for%20Arm%20GPUs + type: documentation + - resource: + title: How Arm Neural Super Sampling works + link: https://community.arm.com/arm-community-blogs/b/mobile-graphics-and-gaming-blog/posts/how-arm-neural-super-sampling-works + type: blog + + + +### FIXED, DO NOT MODIFY +# ================================================================================ +weight: 1 # _index.md always has weight of 1 to order correctly +layout: "learningpathall" # All files under learning paths have this same wrapper +learning_path_main_page: "yes" # This should be surfaced when looking for related content. Only set for _index.md of learning path content. 
+--- diff --git a/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/_next-steps.md b/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/_next-steps.md new file mode 100644 index 0000000000..c3db0de5a2 --- /dev/null +++ b/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/_next-steps.md @@ -0,0 +1,8 @@ +--- +# ================================================================================ +# FIXED, DO NOT MODIFY THIS FILE +# ================================================================================ +weight: 21 # Set to always be larger than the content in this path to be at the end of the navigation. +title: "Next Steps" # Always the same, html page title. +layout: "learningpathall" # All files under learning paths have this same wrapper for Hugo processing. +--- diff --git a/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/images/load_layers.png b/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/images/load_layers.png new file mode 100644 index 0000000000..84c51856a9 Binary files /dev/null and b/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/images/load_layers.png differ diff --git a/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/images/verify_layers.png b/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/images/verify_layers.png new file mode 100644 index 0000000000..51e7a45b62 Binary files /dev/null and b/content/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/images/verify_layers.png differ diff --git a/data/stats_current_test_info.yml b/data/stats_current_test_info.yml index 95124a0643..d134aff4ab 100644 --- a/data/stats_current_test_info.yml +++ b/data/stats_current_test_info.yml @@ -1,5 +1,5 @@ summary: - content_total: 393 + content_total: 396 content_with_all_tests_passing: 0 content_with_tests_enabled: 61 sw_categories: diff --git a/data/stats_weekly_data.yml b/data/stats_weekly_data.yml index 
f396902423..573d166365 100644 --- a/data/stats_weekly_data.yml +++ b/data/stats_weekly_data.yml @@ -7131,3 +7131,124 @@ avg_close_time_hrs: 0 num_issues: 26 percent_closed_vs_total: 0.0 +- a_date: '2025-08-11' + content: + automotive: 3 + cross-platform: 34 + embedded-and-microcontrollers: 44 + install-guides: 105 + iot: 6 + laptops-and-desktops: 38 + mobile-graphics-and-gaming: 35 + servers-and-cloud-computing: 131 + total: 396 + contributions: + external: 98 + internal: 522 + github_engagement: + num_forks: 30 + num_prs: 19 + individual_authors: + adnan-alsinan: 2 + alaaeddine-chakroun: 2 + albin-bernhardsson: 1 + albin-bernhardsson,-julie-gaskin: 1 + alex-su: 1 + alexandros-lamprineas: 1 + andrew-choi: 2 + andrew-kilroy: 1 + annie-tallund: 4 + arm: 3 + arnaud-de-grandmaison: 5 + aude-vuilliomenet: 1 + avin-zarlez: 1 + barbara-corriero: 1 + basma-el-gaabouri: 1 + ben-clark: 1 + bolt-liu: 2 + brenda-strech: 1 + bright-edudzi-gershon-kordorwu: 1 + chaodong-gong: 1 + chen-zhang: 1 + chenying-kuo: 1 + christophe-favergeon: 1 + christopher-seidl: 7 + cyril-rohr: 1 + daniel-gubay: 1 + daniel-nguyen: 2 + david-spickett: 2 + dawid-borycki: 33 + diego-russo: 2 + dominica-abena-o.-amanfo: 1 + elham-harirpoush: 2 + florent-lebeau: 5 + "fr\xE9d\xE9ric--lefred--descamps": 2 + gabriel-peterson: 5 + gayathri-narayana-yegna-narayanan: 2 + georgios-mermigkis: 1 + geremy-cohen: 3 + gian-marco-iodice: 1 + graham-woodward: 1 + han-yin: 1 + iago-calvo-lista: 1 + james-whitaker: 1 + jason-andrews: 106 + jeff-young: 1 + joana-cruz: 1 + joe-stech: 6 + johanna-skinnider: 2 + jonathan-davies: 2 + jose-emilio-munoz-lopez: 1 + julie-gaskin: 5 + julien-jayat: 1 + julien-simon: 1 + julio-suarez: 6 + jun-he: 1 + kasper-mecklenburg: 1 + kieran-hejmadi: 12 + koki-mitsunami: 2 + konstantinos-margaritis: 8 + kristof-beyls: 1 + leandro-nunes: 1 + liliya-wu: 1 + mark-thurman: 1 + martin-ma: 1 + masoud-koleini: 1 + mathias-brossard: 1 + michael-hall: 5 + na-li: 1 + nader-zouaoui: 2 + nikhil-gupta: 1 
+ nina-drozd: 1 + nobel-chowdary-mandepudi: 6 + odin-shen: 9 + owen-wu: 2 + pareena-verma: 46 + paul-howard: 3 + peter-harris: 1 + pranay-bakre: 5 + preema-merlin-dsouza: 1 + przemyslaw-wirkus: 2 + qixiang-xu: 1 + rani-chowdary-mandepudi: 1 + rin-dobrescu: 1 + roberto-lopez-mendez: 2 + ronan-synnott: 45 + shuheng-deng: 1 + thirdai: 1 + tianyu-li: 2 + tom-pilar: 1 + uma-ramalingam: 1 + varun-chari: 2 + visualsilicon: 1 + waheed-brown: 1 + willen-yang: 1 + william-liang: 1 + ying-yu: 3 + yiyang-fan: 1 + zach-lasiuk: 2 + zhengjun-xing: 2 + issues: + avg_close_time_hrs: 0 + num_issues: 28 + percent_closed_vs_total: 0.0