# Validating Isaac ROS Visual SLAM on a Robot

## Overview

This document describes a procedure for validating the installation and configuration of Isaac ROS Visual SLAM on real hardware.

## Table of Contents

- [Validating Isaac ROS Visual SLAM on a Robot](#validating-isaac-ros-visual-slam-on-a-robot)
  - [Overview](#overview)
  - [Table of Contents](#table-of-contents)
  - [Prerequisites](#prerequisites)
  - [Prepare Physical Environment](#prepare-physical-environment)
  - [Download and Configure Packages](#download-and-configure-packages)
  - [Build ROS Workspace](#build-ros-workspace)
  - [Select and Start ROS Visualization Tool](#select-and-start-ros-visualization-tool)
  - [Run Isaac ROS Visual SLAM and Monitor Results](#run-isaac-ros-visual-slam-and-monitor-results)
    - [Test 1: Visual Odometry](#test-1-visual-odometry)
    - [Test 2: Loop Closure](#test-2-loop-closure)
  - [Next Steps](#next-steps)
    - [Troubleshooting](#troubleshooting)
    - [Integrating Isaac ROS Visual SLAM](#integrating-isaac-ros-visual-slam)

## Prerequisites

- [ ] NVIDIA Jetson
- [ ] Standalone computer with a ROS data visualization tool (RViz, Foxglove, etc.)
- [ ] Compatible stereo camera
- [ ] Enough space to navigate (~40 m loop)
- [ ] Tape measure

## Prepare Physical Environment

1. Examine the designated testing area and identify a single, non-intersecting loop with a path length of approximately 40 meters. This is the path along which the camera will be moved and tracked using Isaac ROS Visual SLAM.
2. Define a starting pose along the identified loop. Label this pose with masking tape or another marker.

   One possible approach is to mark an L-shape on the floor and align the camera's position and orientation with the corner of the shape.
3. Using a measuring tool, mark a secondary pose along the loop exactly 1 meter along the forward axis from the starting pose. Label this pose in the same manner.

<div align="center"><img src="../resources/validation_tape.jpg" width="600px"/></div>

## Download and Configure Packages

1. On both the Jetson and the standalone computer, set up your development environment by following the instructions [here](https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_common/blob/main/docs/dev-env-setup.md).

   > Note: `${ISAAC_ROS_WS}` is defined to point to either `/ssd/workspaces/isaac_ros-dev/` or `~/workspaces/isaac_ros-dev/`.
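
   For example, to set this variable persistently — a minimal sketch assuming the SSD-mounted workspace layout; use the home-directory path if that matches your setup:

   ```bash
   # Persist the workspace path for future terminals; adjust if your workspace lives elsewhere.
   echo "export ISAAC_ROS_WS=/ssd/workspaces/isaac_ros-dev/" >> ~/.bashrc
   source ~/.bashrc
   ```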
|
2. Set up your stereo camera and connect it to the Jetson.

   - For HAWK cameras, complete the [HAWK setup tutorial](https://github.com/NVIDIA-ISAAC-ROS/.github/blob/main/profile/hawk-setup.md).
   - For RealSense cameras, complete the [RealSense setup tutorial](https://github.com/NVIDIA-ISAAC-ROS/.github/blob/main/profile/realsense-setup.md).

3. On both devices, clone this repository and its dependencies under `${ISAAC_ROS_WS}/src`:

   ```bash
   cd ${ISAAC_ROS_WS}/src && \
   git clone https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_visual_slam && \
   git clone https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_common && \
   git clone https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_nitros
   ```

## Build ROS Workspace

1. On the Jetson, launch the Docker container in a new terminal:

   ```bash
   cd ${ISAAC_ROS_WS}/src/isaac_ros_common && \
   ./scripts/run_dev.sh ${ISAAC_ROS_WS}
   ```

2. Inside the container, build and source the workspace:

   ```bash
   cd /workspaces/isaac_ros-dev && \
   colcon build --symlink-install && \
   source install/setup.bash
   ```

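   As an optional sanity check — a quick lookup from the same sourced shell — confirm that the Visual SLAM package is now visible to ROS:

   ```bash
   # The package should appear in the index after a successful build and source.
   ros2 pkg list | grep isaac_ros_visual_slam
   ```
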
## Select and Start ROS Visualization Tool

1. Ensure that the standalone computer and the Jetson are on the same network.
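
   To confirm that the two machines can discover each other over DDS, you can run a minimal publish/echo check from a terminal with a sourced ROS 2 environment on each device (for example, inside the `run_dev.sh` container). The topic name `/connectivity_check` is arbitrary, used here for illustration only. On the Jetson:

   ```bash
   # Publish a throwaway message once per second.
   ros2 topic pub --rate 1 /connectivity_check std_msgs/msg/String "{data: 'hello'}"
   ```

   On the standalone computer:

   ```bash
   # If networking and discovery are working, the message should stream here.
   ros2 topic echo /connectivity_check
   ```
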
2. On the standalone computer, start RViz from the provided configuration file:

   First, start the Docker container:

   ```bash
   cd ${ISAAC_ROS_WS}/src/isaac_ros_common && \
   ./scripts/run_dev.sh ${ISAAC_ROS_WS}
   ```

   > **(Optional)** Configure the `ROS_DOMAIN_ID` environment variable on both devices to eliminate any cross-talk from other ROS nodes on the same network.
   >
   > In each terminal on each device:
   >
   > ```bash
   > export ROS_DOMAIN_ID=5  # Any number between 1-101; use the same number for each terminal
   > ```
   >
   > **Note**: This environment variable must be set in each terminal, on both the Jetson and the standalone computer.

   From this terminal, run `rviz2` to display the output:

   ```bash
   source /workspaces/isaac_ros-dev/install/setup.bash && \
   rviz2 -d src/isaac_ros_visual_slam/isaac_ros_visual_slam/rviz/default.cfg.rviz
   ```

3. On the standalone computer, start logging the estimated transform:

   Attach another terminal to the running container for `tf2_echo`:

   ```bash
   cd ${ISAAC_ROS_WS}/src/isaac_ros_common && \
   ./scripts/run_dev.sh ${ISAAC_ROS_WS}
   ```

   From this terminal, run `tf2_echo` to display the transform between the `/map` frame and the appropriate camera frame:

   - HAWK: `/camera`
   - RealSense: `/camera_link`

   ```bash
   source /workspaces/isaac_ros-dev/install/setup.bash && \
   ros2 run tf2_ros tf2_echo map camera_link
   ```

   The example above targets a RealSense camera; for HAWK, substitute `camera` for `camera_link`.

## Run Isaac ROS Visual SLAM and Monitor Results

### Test 1: Visual Odometry

The purpose of this test is to validate that the system provides accurate estimates of the robot's local motion.

1. Initialize the camera at the marked starting pose.
2. On the Jetson, run the Isaac ROS Visual SLAM launch file appropriate for your camera:

   - HAWK:

     ```bash
     ros2 launch isaac_ros_visual_slam isaac_ros_visual_slam_hawk.launch.py
     ```

   - RealSense:

     ```bash
     ros2 launch isaac_ros_visual_slam isaac_ros_visual_slam_realsense.launch.py
     ```

   At this point, the `/map`, `/odom`, and camera frames will all be initialized to the starting pose.
3. Precisely move the camera straight forward to the marked secondary pose.

4. On the standalone computer, in the `tf2_echo` terminal, note the latest transform reported between `/map` and the appropriate camera frame.

   <div align="center"><img src="../resources/validation_test1_tf2.png" width="600px"/></div>

   1. Validate Orientation:

      The orientation error between the measured rotation unit quaternion and the expected rotation unit quaternion (`[0 0 0 1]`, the identity) should be less than 15 degrees. This error can be calculated using the process explained [here](https://gitlab.tudelft.nl/-/snippets/190):

      $$q_{measured} = w_m + x_m i + y_m j + z_m k$$
      $$q_{expected} = w_e + x_e i + y_e j + z_e k$$
      $$2 \arccos (|w_m w_e + x_m x_e + y_m y_e + z_m z_e|) \leq 15^\circ$$

      (The absolute value accounts for the fact that $q$ and $-q$ represent the same rotation.)

   2. Validate Translation:

      The magnitude of the measured translation should be within 5 centimeters of the expected translation (1 meter):

      $$0.95 \text{ m} \leq \|t_{measured}\| \leq 1.05 \text{ m}$$

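Both checks can be computed directly from the numbers reported by `tf2_echo`. The following is a minimal sketch; the quaternion and translation values are placeholders to be replaced with your own measurements:

```bash
python3 - <<'EOF'
import math

# Placeholder values copied from the tf2_echo output; replace with your own.
qx, qy, qz, qw = 0.010, -0.020, 0.005, 0.9997  # measured rotation (x, y, z, w)
tx, ty, tz = 0.98, 0.03, -0.01                 # measured translation (meters)

# Expected rotation is the identity [0 0 0 1], so the quaternion dot product reduces to qw.
angle = math.degrees(2 * math.acos(min(1.0, abs(qw))))
dist = math.hypot(tx, ty, tz)

print(f"orientation error: {angle:.2f} deg (should be < 15)")
print(f"translation magnitude: {dist:.3f} m (should be 0.95-1.05)")
EOF
```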
|
If all real-world measurements correspond to the reported estimates within the expected tolerances, the Visual Odometry test is a success.

### Test 2: Loop Closure

The purpose of this test is to validate that the system correctly realigns the estimated pose path to previously seen features.

1. Initialize the camera at the marked starting pose.
2. On the Jetson, run the Isaac ROS Visual SLAM launch file appropriate for your camera:

   - HAWK:

     ```bash
     ros2 launch isaac_ros_visual_slam isaac_ros_visual_slam_hawk.launch.py
     ```

   - RealSense:

     ```bash
     ros2 launch isaac_ros_visual_slam isaac_ros_visual_slam_realsense.launch.py
     ```

   At this point, the `/map`, `/odom`, and camera frames will all be initialized to the starting pose.

3. Precisely move the camera along the predetermined 40-meter path.

   > **Note**: To maximize the accuracy of this test, ensure that dynamic obstacles are removed from the robot's field of view. If a nontrivial dynamic obstacle is introduced while the robot is performing the loop, restart this test.

4. As the camera nears the completion of its first lap, bring it to rest, taking care to align it precisely with the floor marker indicating the starting pose.
5. On the standalone computer, in the RViz window, ensure that a nontrivial number of visual features now appear red. Additionally, ensure that the camera frame appears nearly coincident with the `/map` frame.

   > **Note**: It is acceptable for the `/odom` frame to jump around significantly with respect to the `/map` frame. For the purposes of this test, only the transform between the `/map` and camera frames is relevant. For additional information about the design intent behind these frames, please see [REP-0105](https://www.ros.org/reps/rep-0105.html).

   <div align="center"><img src="../resources/validation_test2_rviz.png" width="600px"/></div>

6. On the standalone computer, in the `tf2_echo` terminal, note the latest transform reported between `/map` and the appropriate camera frame.

   <div align="center"><img src="../resources/validation_test2_tf2.png" width="600px"/></div>

   1. Validate Orientation:

      The orientation error between the measured rotation unit quaternion and the expected rotation unit quaternion (`[0 0 0 1]`, since the path is a closed loop) should be less than 15 degrees. This error can be calculated using the process explained [here](https://gitlab.tudelft.nl/-/snippets/190):

      $$q_{measured} = w_m + x_m i + y_m j + z_m k$$
      $$q_{expected} = w_e + x_e i + y_e j + z_e k$$
      $$2 \arccos (|w_m w_e + x_m x_e + y_m y_e + z_m z_e|) \leq 15^\circ$$

   2. Validate Translation:

      The magnitude of the measured translation should be within 2 meters (5% of the 40-meter path length) of the expected translation (0 meters, since the path is a closed loop):

      $$0 \text{ m} \leq \|t_{measured}\| \leq 2 \text{ m}$$
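
The check script from Test 1 can be reused here; only the translation tolerance changes. A compact placeholder version (substitute your own measured translation from `tf2_echo`):

```bash
# Placeholder translation values; replace with your own measurement.
python3 -c "import math; print(f'residual drift: {math.hypot(0.40, -0.20, 0.05):.3f} m (should be <= 2)')"
```
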
If all real-world measurements correspond to the reported estimates within the expected tolerances, the Loop Closure test is a success.

## Next Steps

### Troubleshooting

If any step of the above tests did not succeed, there may be a configuration error in your system. Please follow these [troubleshooting steps](https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_visual_slam#troubleshooting).

If the prepared troubleshooting steps do not resolve the test failure, please contact your NVIDIA support representative and file an issue report with all relevant details.

### Integrating Isaac ROS Visual SLAM

While the Isaac ROS Visual SLAM node exposes a [multitude of interfaces](https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_visual_slam#isaac_ros_visual_slam), the core way to consume the node's output is through the reported transform between the `/map` and camera frames.

Based on your robot's precise specifications, you can set up a static transform publisher or URDF-based publisher that connects the camera frame to a relevant planning frame on the robot (typically named `/base_link`), as sketched below. Your navigation stack can then use the transform between `/base_link` and `/map` to plan movements within the local space.
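
For example, a static transform publisher could be launched as follows. This is a minimal sketch: the offsets and the `camera_link` frame name are placeholders for a hypothetical camera mount, so substitute your robot's measured values and actual camera frame:

```bash
# Arguments: x y z yaw pitch roll parent_frame child_frame (placeholder mounting offset)
ros2 run tf2_ros static_transform_publisher 0.1 0.0 0.2 0.0 0.0 0.0 base_link camera_link
```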

Consider further augmenting your robot's navigation stack with other Isaac ROS packages, such as [Isaac ROS Nvblox](https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_nvblox) for 3D reconstruction.