Isaac Teleop: the unified standard for high-fidelity egocentric and robot data collection. It addresses the data bottleneck in robot learning by streamlining device integration, standardizing high-fidelity human demonstration data collection, and fostering device and data interoperability.
- A single framework that works seamlessly across simulated and real-world robots, ensuring streamlined device workflow and consistent data schemas.
- Currently supported robotics stacks:
- ROS2: Widely adopted middleware for robot software integration and communication
- Isaac Sim: Simulation platform to develop, test, and train AI-powered robots
- Isaac Lab: Unified framework for robot learning designed to help train robot policies
- Upcoming robotics stacks:
- Isaac OS: Enterprise-ready robotics operating system (in early access)
- Isaac Arena: Isaac Lab extension for large-scale evaluation and resource orchestration
- Provides a standardized interface for teleoperation devices, removing the need for custom
device integrations and ongoing maintenance.
- Currently supported device categories:
- XR Headsets (with spatial controllers): Vision Pro, Pico, Quest
- Gloves: Manus
- Foot Pedals (hands-free lower body control): Logitech Rudder Pedal
- Body Trackers: Pico Motion Tracker
- Upcoming device categories:
- Master Manipulators: TBA
- Exoskeletons: TBA
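All of the device categories above feed downstream consumers through a common record shape. As a purely illustrative sketch (these class and field names are assumptions, not the actual Isaac Teleop schema), a unified device sample might look like:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a standardized device sample. Field names are
# illustrative only; the real Isaac Teleop schema may differ.
@dataclass
class PoseSample:
    timestamp_ns: int   # capture time in nanoseconds
    position: tuple     # (x, y, z) in meters
    orientation: tuple  # quaternion (w, x, y, z)

@dataclass
class DeviceSample:
    device_type: str                             # e.g. "xr_headset", "glove"
    poses: dict = field(default_factory=dict)    # named tracked points
    buttons: dict = field(default_factory=dict)  # discrete inputs

# A headset and a glove emit the same record shape, so downstream
# retargeters can consume either without device-specific code.
headset = DeviceSample(
    device_type="xr_headset",
    poses={"head": PoseSample(0, (0.0, 1.6, 0.0), (1.0, 0.0, 0.0, 0.0))},
)
print(headset.device_type)  # xr_headset
```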
- Easily extend support for additional devices through a plugin system, enabling quick integration of new hardware.
- Retarget the standardized device outputs to different embodiments.
- Reference retargeter implementations, including popular embodiments such as Unitree G1.
- Retargeter tuning UI to facilitate live retargeter tuning.
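Conceptually, a retargeter consumes the standardized device output and produces embodiment-specific commands. Below is a minimal hypothetical sketch, assuming a normalized pinch signal in [0, 1] and a parallel gripper; it is not the actual Isaac Teleop retargeter API, just an illustration of the mapping step:

```python
# Hypothetical retargeter sketch: maps a normalized pinch value from a
# standardized device sample onto a parallel-gripper width command.
# Class and parameter names here are assumptions for illustration.
class GripperRetargeter:
    def __init__(self, open_width: float = 0.08, closed_width: float = 0.0):
        self.open_width = open_width      # gripper fully open, in meters
        self.closed_width = closed_width  # gripper fully closed, in meters

    def retarget(self, pinch: float) -> float:
        """Map pinch in [0, 1] (0 = open hand) to a gripper width target."""
        pinch = min(max(pinch, 0.0), 1.0)  # clamp out-of-range sensor noise
        return self.open_width + pinch * (self.closed_width - self.open_width)

rt = GripperRetargeter()
print(rt.retarget(0.5))  # 0.04
```

Keeping the mapping in one small class like this is what makes live tuning practical: a tuning UI only needs to adjust `open_width`/`closed_width`-style parameters at runtime.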
- Currently supported use cases:
- Use XR headsets for gripper / tri-finger hand manipulation
- Use XR headsets with gloves for dex-hand manipulation
- Seated full body loco-manipulation (Homie)
- Tracking based full body loco-manipulation (Sonic)
- Upcoming use cases:
- Egocentric data collection (aka “no-robot”)
- Teleoperate cloud based robotics simulations
- Remote teleoperation with immersive camera streaming to XR headsets
- CPU: x86-64 (ARM support coming soon)
- GPU: NVIDIA GPU required
- CPU: AMD Ryzen Threadripper 7960X (recommended)
- GPU: 1x RTX 6000 Pro (Blackwell) or 2x RTX 6000 (Ada)
- OS: Ubuntu 22.04 or 24.04
- Python: 3.11 or newer (version configured in the root CMakeLists.txt)
- CUDA: 12.8 (recommended)
- NVIDIA Driver: 580.95.05 (recommended)
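The requirements above can be partially sanity-checked before building. Here is a small illustrative script (not part of Isaac Teleop; the Python threshold assumes the default version configured in the root CMakeLists.txt):

```python
import shutil
import sys

# Quick sanity check of the host environment before building.
# Thresholds mirror the system requirements listed above.
def check_environment() -> list:
    problems = []
    if sys.version_info < (3, 11):
        problems.append(f"Python 3.11+ required, found {sys.version.split()[0]}")
    if shutil.which("nvidia-smi") is None:
        problems.append("nvidia-smi not found: NVIDIA driver may be missing")
    return problems

for problem in check_environment():
    print("WARNING:", problem)
```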
- Request CloudXR SDK Early Access
- Install Docker by following the public guide
- Install the NGC CLI tool
- Configure your NGC API key
- Verify you have access to all the artifacts:
  ```bash
  ngc registry resource list "nvidia/cloudxr-js-early-access"
  ngc registry image list "nvidia/cloudxr-runtime-early-access"
  ```
- Install uv (if not already installed):
  We strongly recommend using uv for dependency management and the Python virtual environment. Other solutions should also work, but your mileage may vary.
  ```bash
  curl -LsSf https://astral.sh/uv/install.sh | sh
  ```
- Create a uv virtual environment:
  ```bash
  uv venv --python 3.11 venv_isaacteleop
  source venv_isaacteleop/bin/activate
  ```
- Clone the repository:
  ```bash
  git clone git@github.com:NVIDIA/IsaacTeleop.git
  cd IsaacTeleop
  ```
  Note: Dependencies (OpenXR SDK, pybind11, yaml-cpp) are automatically downloaded during CMake configuration using FetchContent. No manual dependency installation or git submodule initialization is required.
- (Optional) Pre-download the CloudXR Runtime SDK
  If you are using the default flow, skip this step and go to step 6; `./scripts/run_cloudxr.sh` automatically downloads both CloudXR SDKs. Use this step only if you want to pre-download the Runtime SDK manually.
  - Download via script (NGC default):
    ```bash
    ./scripts/download_cloudxr_runtime_sdk.sh
    ```
  - Or place a local tarball named `CloudXR-<version>-Linux-<arch>-sdk.tar.gz` in `deps/cloudxr/`. The version is controlled by `CXR_RUNTIME_SDK_VERSION` in `deps/cloudxr/.env.default`.
- (Optional) Pre-download the CloudXR Web SDK
  If you are using the default flow, skip this step and go to step 6; `./scripts/run_cloudxr.sh` automatically downloads both CloudXR SDKs. Use this step only if you want to pre-download the Web SDK manually.
  - Download via script (NGC default):
    ```bash
    ./scripts/download_cloudxr_sdk.sh
    ```
  - Or place a local tarball named `cloudxr-web-sdk-<version>.tar.gz` in `deps/cloudxr/`. The version is controlled by `CXR_WEB_SDK_VERSION` in `deps/cloudxr/.env.default`.
- Run CloudXR
  ```bash
  ./scripts/run_cloudxr.sh
  ```
  The `run_cloudxr.sh` script will automatically:
  - Download the CloudXR Runtime SDK (if not already present)
  - Download the CloudXR Web SDK (if not already present)
  - Build the necessary Docker containers (wss-proxy, web-app)
  - Start all CloudXR services
  Note: The first run may take a few minutes to download the SDKs and build the containers. Subsequent runs will be faster because these steps are cached.
- Whitelist ports in the firewall
  Open the required CloudXR ports and the web server ports for the WebXR client:
  ```bash
  sudo ufw allow 47998/udp && sudo ufw allow 49100/tcp && sudo ufw allow 48322/tcp && sudo ufw allow 8080/tcp && sudo ufw allow 8443/tcp
  ```
- WebXR Client Setup
  The previous step starts several Docker containers, one of which is the WebXR server. It can be accessed via the browser on a supported HMD (Quest 3 or Pico 4 Ultra):
  - Local: `https://localhost:8443` or `http://localhost:8080`
  - Network: `https://<server-ip>:8443` or `http://<server-ip>:8080`
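After opening the firewall ports and starting the services, a small probe like the following can confirm that the TCP ports are reachable. This helper is illustrative, not part of Isaac Teleop, and the UDP streaming port (47998) cannot be checked this way:

```python
import socket

# Returns True if a TCP connection to host:port succeeds within `timeout`.
def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0  # 0 means connected

# Probe the TCP ports used by CloudXR and the WebXR client.
for port in (49100, 48322, 8080, 8443):
    print(port, "open" if port_open("127.0.0.1", port) else "closed")
```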
Tips:
- For rapid development and debugging, we recommend testing your CloudXR.js application on a desktop browser before deploying to XR headsets.
- For Pico 4 Ultra, Pico OS 15.4.4U or later is required.
- HTTP mode is easier to use, but currently is not supported by
You can override CloudXR configurations by creating a `.env` file and placing it next to `deps/cloudxr/.env.default`. The folder structure should look like:
```
$ tree -a deps/cloudxr/
deps/cloudxr/
├── CLOUDXR_LICENSE
├── .env
├── .env.default
└── .gitignore
```

Isaac Teleop Core is designed to work side by side with NVIDIA Isaac Lab. We recommend the "Installation using Isaac Sim Pip Package" method for Isaac Lab; please refer to Isaac Lab's installation guide for other advanced methods. Here are the quick steps:
- Install dependencies
  ```bash
  source venv_isaacteleop/bin/activate
  uv pip install "isaacsim[all,extscache]==5.1.0" --extra-index-url https://pypi.nvidia.com
  uv pip install -U torch==2.7.0 torchvision==0.22.0 --index-url https://download.pytorch.org/whl/cu128
  ```
- Clone & install Isaac Lab
  Run this outside of the IsaacTeleop code base.
  ```bash
  # In a separate folder outside of Isaac Teleop:
  git clone git@github.com:isaac-sim/IsaacLab.git
  # Run the install command
  cd IsaacLab
  ./isaaclab.sh --install
  # Set ISAACLAB_PATH, which will be used later in `run_isaac_lab.sh`.
  export ISAACLAB_PATH=$(pwd)
  ```
- Build & install Teleop Python packages
  Build with default settings; see BUILD.md for advanced build instructions.
  ```bash
  cmake -B build
  cmake --build build --parallel
  cmake --install build
  ```
  Install the Python package:
  ```bash
  uv pip install --find-links=install/wheels isaacteleop
  ```
  Validate that the Python package has been built and installed successfully:
  ```bash
  python -c "import isaacteleop.deviceio"
  ```
  Run a quick test:
  ```bash
  source scripts/setup_cloudxr_env.sh
  python ./examples/oxr/python/test_extensions.py
  ```
- Run teleoperation with Isaac Lab
  ```bash
  # In the IsaacTeleop repo:
  ./scripts/run_isaac_lab.sh
  ```
- Build Instructions - CMake build options, troubleshooting
- Contributing Guide - Code style, PR process, DCO
- Core Modules - OXR, DeviceIO, Python bindings
- Retargeting Engine - Hand retargeters
- Teleop Session Manager - Session API
- Manus Gloves - Manus SDK integration
- OAK-D Camera - DepthAI camera plugin
- Synthetic Hands - Controller-based hand simulation
- OpenXR Examples - C++ and Python OpenXR tracking
- LeRobot Integration - Dataset recording/visualization
- Camera Streaming - GStreamer OAK-D pipeline
- Teleop Session - Session API usage
- Dependencies Overview - Dependency management