
Augmented & Virtual Reality (AR & VR)

Arno Hartholt edited this page Feb 24, 2026 · 5 revisions

The VHToolkit can be used to interact with life-sized virtual humans in AR and VR. The provided sample targets the Meta Quest 3 headset in Passthrough mode. The User Setup section below details how to load and run the binary application on the headset. The Developer Setup section details how to set up the Unity project for further development.

User Setup

Requirements

  • Meta Quest 3 headset, set up and connected to your Meta Horizon mobile app
  • Quest headset linked to Windows development PC with USB cable
  • Real world environment scanned and saved as a Space (Settings > Environment setup > Space setup)
  • Internet connection

Install Example / Side Load APK

To run the provided VHToolkit sample on the Quest, the APK file (the binary executable) needs to be sideloaded onto the headset:

  • Ensure your Quest has Developer Mode enabled via the Quest Horizon mobile app
    • Note: you may need to upgrade the Meta account associated with the app and headset to a developer account.
  • Download the VHToolkit APK from the Release page
  • Download and install SideQuest Advanced Installer onto your PC
    • Note: you can decline the creation of a SideQuest account
    • Note: relaunch the app if it hangs on update with first launch
  • Complete all the connection requirements given by the SideQuest app for a successful “green” status.
    • Note: the Developer Setup steps described below fulfill the requirements
    • Note: the in-headset prompts may not appear until the headset is restarted, and they may appear twice in a row before the settings take effect
  • Click the folder icon to select the APK to stage it and install it on the device
  • In the headset, navigate to the Library > Unknown sources to find and launch the APK
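
If you prefer the command line over SideQuest, the same sideload can be done with adb from the Android platform-tools. Below is a minimal sketch in Python, assuming adb is on your PATH; the APK filename is a placeholder for the file downloaded from the Release page:

```python
"""Sideload the sample APK with adb, as an alternative to SideQuest."""
import shutil
import subprocess

APK = "VHToolkitSample.apk"  # placeholder; use the file from the Release page

def install_cmd(apk: str) -> list[str]:
    # "install -r" replaces an existing installation while keeping its data
    return ["adb", "install", "-r", apk]

if shutil.which("adb") is None:
    print("adb not found: install Android platform-tools first")
else:
    subprocess.run(["adb", "devices"])   # headset should list in the "device" state
    subprocess.run(install_cmd(APK))
```

An app installed this way shows up under Library > Unknown sources, the same as with the SideQuest route.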

If you run into permission issues, it may help to set up the headset fully for development; see the Developer Setup section below.

Run Example

  • On first use, grant permissions for Spatial Data and the Microphone.
  • The sample starts with a laser pointer from the right controller. Point it at any flat real world surface and the pointer should turn green. Pull the trigger to spawn the character.
    • Note: hold the A button while pulling the trigger to use ElevenLabs TTS instead of AWS Polly.
  • With the right controller, point at any of the pre-populated questions on the left, and pull the trigger to ask that question.
  • To talk to the character directly using speech recognition, either 1) point at the lowest button, Talk to Character, and pull the trigger, or 2) press the Y button on the left controller. You don’t need to keep holding the trigger or button; the system automatically detects when you have finished speaking. Speech recognition remains active until you press the button again.
  • To move the character in the environment, hold both the trigger and the grip button on the right controller, point towards the new location, and release both.
  • To select a different character, hold up the left controller, point towards one of the character portraits with the right controller, and pull the right trigger. To spawn the Army version of Kevin instead of the civilian version, press the X button on the left controller while selecting him.

Known Issues

  • The microphone may have difficulty picking up your speech; if speech recognition results take more than 5 seconds, ask the question again.

Developer Setup

This section describes how to set up the Quest for development with the VHToolkit. While Meta Quest development is possible on macOS, the main development environment is Windows, using the Meta Horizon Link app. Meta Horizon Link allows you to run VR software from your PC on your headset. VHToolkit development has been tested with the Quest 3.

Requirements

  • Windows OS
  • Unity with Android Build Support, OpenJDK, Android SDK & NDK Tools installed
  • Quest headset set up and connected to your Meta Horizon mobile app
  • Quest headset linked to development PC with USB cable
  • Real world environment scanned and saved as a Space (Settings > Environment setup > Space setup)
  • Internet connection

Installation and Setup

  • Turn on the Quest development mode in your Meta Horizon mobile app:
    • Menu > Devices > Quest 3 > Headset Settings > Developer Mode > Enable Developer Mode
  • Download and install Meta Horizon Link for Windows
  • In Meta Horizon Link:
    • Run through the first time setup steps for your Quest device
    • In Settings > General:
      • Enable Unknown sources
      • Set the OpenXR Runtime to Meta Horizon Link
        • Note, this may require Admin rights
    • In Settings > Developer:
      • Turn on Developer Runtime Features; the VHToolkit sample scene requires:
        • Passthrough over Meta Horizon Link
        • Spatial Data over Meta Horizon Link
  • In the Quest headset, linked to the PC with a USB cable:
    • On the “Allow USB Debugging?” pop-up, click “Always allow from this computer”
      • Note: if this pop-up does not appear, try restarting the headset and/or the Meta Horizon Link Windows application, or unplugging and re-plugging the USB cable
    • In Settings > Developer, turn on Physical Space Features
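
To confirm the USB debugging handshake succeeded, `adb devices` should list the headset in the `device` state rather than `unauthorized`. A small sketch that parses that output, assuming adb is on your PATH:

```python
"""Check that the Quest is visible to adb after enabling USB debugging."""
import shutil
import subprocess

def parse_devices(output: str) -> dict[str, str]:
    # Skip the "List of devices attached" header; remaining
    # non-blank lines have the form "<serial>\t<state>".
    devices = {}
    for line in output.splitlines()[1:]:
        parts = line.split()
        if len(parts) == 2:
            devices[parts[0]] = parts[1]
    return devices

if shutil.which("adb") is None:
    print("adb not found: install Android platform-tools first")
else:
    out = subprocess.run(["adb", "devices"], capture_output=True, text=True).stdout
    for serial, state in parse_devices(out).items():
        # "device" is good; "unauthorized" means the in-headset
        # "Allow USB Debugging?" prompt has not been accepted yet
        print(serial, state)
```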

Run Example

Initial Unity setup:

  • Get the VHUnityAR project
  • In File > Build Profiles:
    • Set the profile to Android or Meta Quest and switch platform
    • Select your connected headset as the Run Device
      • Note: if your Quest does not show up here, double check all previous steps, primarily the Meta Horizon Link and USB Debugging ones.
  • Open Scenes > SampleScene

There are three ways to run the scene:

  • Play from the Editor, useful for real-time debugging:
    • Hit the Unity Play button at the top; this should launch the Link environment in the headset and then the sample scene
    • The sample starts with a laser pointer from the right controller. Point it at any flat real world surface and the pointer should turn green. Pull the trigger to spawn the character.
    • Troubleshooting:
      • If the Link menu overlays stay visible, click the menu button on the right controller to disable them and the sample scene should get focus and run
      • If the laser pointer remains red, it can't find the real world spatial data. Make sure you have the space scanned and that the Windows Meta Horizon Link app has the "Spatial Data over Meta Horizon Link" option enabled in Settings > Developer.
      • If the laser pointer still remains red, first try restarting the Unity project, as the previous run may not have exited cleanly. If the issue persists, quit the Link environment on the headset and return to the main Quest menu. Then hit Play in the Unity Editor again. Alternatively, first manually load the Link environment from the Quest Quick Settings menu, and then hit Play in the Unity Editor.
  • Build and deploy to Quest headset, useful for testing the app running natively on the device:
    • File > Build Profiles > Build and Run
      • Note: Unity may warn about needing a higher Android API level; if so, click Update Android SDK
    • This will build the APK, deploy it to device, and run it automatically
    • Once deployed, it will stay on the device like any other app. To run it again, select it from the recently launched apps in the main menu, or go to Library > Unknown Sources.
    • Note that every time you rebuild the Unity project through Build and Run, it will automatically overwrite the APK on the device, even if you rename the APK itself
  • Run in Meta XR Simulator, useful for when you don’t have a headset:
    • Click the Meta XR Simulator button, found at the top of the Unity editor to the left of the Play button, to set the Play mode to the simulator
      • Note: this may require installing a separate package automatically
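
Once the APK is on the device, it can also be launched or removed from a command prompt. The sketch below is hedged: the package name is a placeholder (list installed third-party packages with `adb shell pm list packages -3` to find the real one), and Unity builds typically launch through `com.unity3d.player.UnityPlayerActivity`, though that should be verified for this APK:

```python
"""Launch or uninstall the deployed sample via adb."""
import shutil
import subprocess

PACKAGE = "com.example.vhtoolkit"  # placeholder package name

def launch_cmd(package: str) -> list[str]:
    # Unity builds usually expose this launch activity; verify for the real APK
    return ["adb", "shell", "am", "start", "-n",
            f"{package}/com.unity3d.player.UnityPlayerActivity"]

def uninstall_cmd(package: str) -> list[str]:
    return ["adb", "uninstall", package]

if shutil.which("adb") is None:
    print("adb not found: install Android platform-tools first")
else:
    subprocess.run(launch_cmd(PACKAGE))
```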

Notes for Further Development

  • The VHToolkit uses the Meta XR All-in-One SDK
  • The VHUnityAR Unity project is set up as follows:
    • Flow:
      • ScenePlacementController.cs enables the user to point and click on a surface and then spawns the DemoObjects prefab, which contains all characters, UI, and supporting systems
      • Once DemoObjects is spawned, main control is governed by the DemoControllerAR script, which inherits from DemoControllerBase, where all VH systems are set up
    • Locations:
      • The ScenePlacementController game object can be found in the Hierarchy under XR > [BuildingBlock] Camera Rig > [BuildingBlock] OVRInteractionComprehensive > RightInteractions > Interactors > Controller > ControllerRayInteractor > ControllerPointerPose
      • The character selection buttons are under XR > [BuildingBlock] Camera Rig > [BuildingBlock] OVRInteractionComprehensive > OVRLeftControllerVisual > OVRControllerPrefab > Canvas_CharacterSelection
      • The DemoObjects prefab is not in the hierarchy, but gets spawned; it can be found in Assets\Prefabs
      • All characters are part of the DemoObjects prefab
      • The main XR C# scripts are in Assets\Scripts; shared VH scripts are in Assets\VHShared\Scripts
      • Core VH AI systems are part of the RIDE Cognition and RIDE VH packages
  • For logging and debugging through logs, either use:
    • For when running from the Unity Editor:
      • Enable Android Logcat: top menu > Window > Analysis > Android Logcat
    • For when running on the device:
      • Run ADB from a command prompt; for example, to capture only the logs associated with a Unity process and redirect the results to a text file, type: adb logcat -s Unity > adb_log_1.txt
  • For optimization, the following tools are helpful:
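
The adb logcat command above can also be wrapped in a small script, which makes it easy to reuse the same Unity-line filter on saved log files. A sketch assuming adb is on your PATH:

```python
"""Capture Unity logs from the device to a file via adb logcat."""
import shutil
import subprocess

def is_unity_line(line: str) -> bool:
    # Brief-format logcat lines look like "I/Unity   ( 1234): message"
    return "/Unity" in line

if shutil.which("adb") is None:
    print("adb not found: install Android platform-tools first")
else:
    # Equivalent to: adb logcat -s Unity > adb_log_1.txt
    proc = subprocess.Popen(["adb", "logcat", "-s", "Unity"],
                            stdout=subprocess.PIPE, text=True)
    with open("adb_log_1.txt", "w", encoding="utf-8") as log:
        for line in proc.stdout:  # stop capturing with Ctrl+C
            if is_unity_line(line):
                log.write(line)
```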
