# Tutorial for Visual SLAM using a RealSense camera with integrated IMU

<div align="center"><img src="../resources/realsense.gif" width="600px"/></div>

## Overview

This tutorial walks you through setting up [Isaac ROS Visual SLAM](https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_visual_slam) with a [RealSense camera](https://www.intel.com/content/www/us/en/architecture-and-technology/realsense-overview.html).

> **Note**: The [launch file](../isaac_ros_visual_slam/launch/isaac_ros_visual_slam_realsense.launch.py) provided in this tutorial is designed for a RealSense camera with an integrated IMU. If you want to run this tutorial with a RealSense camera that has no IMU (such as the RealSense D435), change the `enable_imu` param in the launch file to `False`.
<!-- Split blockquote -->
> **Note**: This tutorial requires a compatible RealSense camera from the list available [here](https://github.com/NVIDIA-ISAAC-ROS/.github/blob/main/profile/realsense-setup.md#camera-compatibility).

## Tutorial Walkthrough - VSLAM execution

1. Complete the [RealSense setup tutorial](https://github.com/NVIDIA-ISAAC-ROS/.github/blob/main/profile/realsense-setup.md).

2. Complete the [Quickstart section](../README.md#quickstart) in the main README.

3. \[Terminal 1\] Run the `realsense-camera` and `visual_slam` nodes

    Make sure you have your RealSense camera attached to the system, and then start the Isaac ROS container.

    ```bash
    isaac_ros_container
    ```

    > Or, if you did not add the command in [step 1-3 of the quickstart section](../README.md#quickstart):
    >
    > ```bash
    > cd ${ISAAC_ROS_WS}/src/isaac_ros_common && \
    >   ./scripts/run_dev.sh ${ISAAC_ROS_WS}
    > ```

4. \[Terminal 1\] Inside the container, build and source the workspace:

    ```bash
    cd /workspaces/isaac_ros-dev && \
      colcon build --symlink-install && \
      source install/setup.bash
    ```

5. \[Terminal 1\] Run the launch file, which launches the example, and wait for 5 seconds:

    ```bash
    ros2 launch isaac_ros_visual_slam isaac_ros_visual_slam_realsense.launch.py
    ```

6. \[Terminal 2\] Attach a second terminal to check the operation.

    Attach another terminal to the running container for issuing other ROS 2 commands.

    ```bash
    isaac_ros_container
    ```

    First, check that you can see all the expected ROS 2 topics.

    ```bash
    ros2 topic list
    ```

    > Output example:
    >
    > ```bash
    > /camera/accel/imu_info
    > /camera/accel/metadata
    > /camera/accel/sample
    > /camera/extrinsics/depth_to_accel
    > /camera/extrinsics/depth_to_gyro
    > /camera/extrinsics/depth_to_infra1
    > /camera/extrinsics/depth_to_infra2
    > /camera/gyro/imu_info
    > /camera/gyro/metadata
    > /camera/gyro/sample
    > /camera/imu
    > /camera/infra1/camera_info
    > /camera/infra1/image_rect_raw
    > /camera/infra1/image_rect_raw/compressed
    > /camera/infra1/image_rect_raw/compressedDepth
    > /camera/infra1/image_rect_raw/theora
    > /camera/infra1/metadata
    > /camera/infra2/camera_info
    > /camera/infra2/image_rect_raw
    > /camera/infra2/image_rect_raw/compressed
    > /camera/infra2/image_rect_raw/compressedDepth
    > /camera/infra2/image_rect_raw/theora
    > /camera/infra2/metadata
    > /parameter_events
    > /rosout
    > /tf
    > /tf_static
    > /visual_slam/imu
    > /visual_slam/status
    > /visual_slam/tracking/odometry
    > /visual_slam/tracking/slam_path
    > /visual_slam/tracking/vo_path
    > /visual_slam/tracking/vo_pose
    > /visual_slam/tracking/vo_pose_covariance
    > /visual_slam/vis/gravity
    > /visual_slam/vis/landmarks_cloud
    > /visual_slam/vis/localizer
    > /visual_slam/vis/localizer_loop_closure_cloud
    > /visual_slam/vis/localizer_map_cloud
    > /visual_slam/vis/localizer_observations_cloud
    > /visual_slam/vis/loop_closure_cloud
    > /visual_slam/vis/observations_cloud
    > /visual_slam/vis/pose_graph_edges
    > /visual_slam/vis/pose_graph_edges2
    > /visual_slam/vis/pose_graph_nodes
    > /visual_slam/vis/velocity
    > ```
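
    If a topic is missing, the corresponding node is not publishing it. You can script this check; the helper below is a sketch (the `check_topics` function and `expected_topics.txt` file are hypothetical names, not part of the package). Save the list above to `expected_topics.txt`, then pipe the live listing through the function.

    ```bash
    # Hypothetical helper: print every expected topic that is absent from
    # the topic listing read on stdin.
    check_topics() {
        # $1: file listing the expected topics, one per line
        sort "$1" > /tmp/expected_sorted.txt
        sort /dev/stdin > /tmp/actual_sorted.txt
        # comm -23 keeps lines unique to the first (expected) list
        comm -23 /tmp/expected_sorted.txt /tmp/actual_sorted.txt
    }

    # Usage: ros2 topic list | check_topics expected_topics.txt
    ```

    An empty result means every expected topic is present.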

    Check the output frequency of the `realsense-camera` node.

    ```bash
    ros2 topic hz /camera/infra1/image_rect_raw --window 20
    ```

    > Example output:
    >
    > ```bash
    > average rate: 89.714
    > min: 0.011s max: 0.011s std dev: 0.00025s window: 20
    > average rate: 90.139
    > min: 0.010s max: 0.012s std dev: 0.00038s window: 20
    > average rate: 89.955
    > min: 0.011s max: 0.011s std dev: 0.00020s window: 20
    > average rate: 89.761
    > min: 0.009s max: 0.013s std dev: 0.00074s window: 20
    > ```
    >
    > Press `Ctrl+C` to stop the output.
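
    To summarize a longer run, you can redirect the `ros2 topic hz` output to a file and average the reported rates with `awk`. This is an optional convenience; `hz_output.txt` is an assumed capture file, not something the tutorial creates.

    ```bash
    # Average the "average rate" lines from a captured `ros2 topic hz` log.
    awk '/average rate/ { sum += $3; n++ } END { if (n) printf "%.3f\n", sum / n }' hz_output.txt
    ```

    For the four windows shown above, this prints a value close to the nominal 90 Hz of the infrared streams.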

    You can also check the frequency of the IMU topic.

    ```bash
    ros2 topic hz /camera/imu --window 20
    ```

    > Example output:
    >
    > ```bash
    > average rate: 199.411
    > min: 0.004s max: 0.006s std dev: 0.00022s window: 20
    > average rate: 199.312
    > min: 0.004s max: 0.006s std dev: 0.00053s window: 20
    > average rate: 200.409
    > min: 0.005s max: 0.005s std dev: 0.00007s window: 20
    > average rate: 200.173
    > min: 0.004s max: 0.006s std dev: 0.00028s window: 20
    > ```

    Lastly, check that you are getting output from the `visual_slam` node at the same rate as the input.

    ```bash
    ros2 topic hz /visual_slam/tracking/odometry --window 20
    ```

    > Example output:
    >
    > ```bash
    > average rate: 58.086
    > min: 0.002s max: 0.107s std dev: 0.03099s window: 20
    > average rate: 62.370
    > min: 0.001s max: 0.109s std dev: 0.03158s window: 20
    > average rate: 90.559
    > min: 0.009s max: 0.013s std dev: 0.00066s window: 20
    > average rate: 85.612
    > min: 0.002s max: 0.100s std dev: 0.02079s window: 20
    > average rate: 90.032
    > min: 0.010s max: 0.013s std dev: 0.00059s window: 20
    > ```
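
    Note that the odometry rate is less steady than the camera rate; some windows show a large worst-case gap between messages (for example the `max: 0.107s` above). If you want to flag such windows in a captured log, a small `awk` filter works; here `hz_output.txt` is an assumed capture file and the 0.05 s threshold is an arbitrary example value.

    ```bash
    # Flag hz windows whose worst-case inter-message gap exceeds 50 ms.
    awk '/^min:/ { m = $4; sub(/s$/, "", m); if (m + 0 > 0.05) print "large gap: max interval " m "s" }' hz_output.txt
    ```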

## Tutorial Walkthrough - Visualization

At this point, you have two options for checking the `visual_slam` output.

- **Live visualization**: Run RViz2 live alongside the `realsense-camera` and `visual_slam` nodes.
- **Offline visualization**: Record a rosbag file and check the recorded data offline (possibly on a different machine).

Running RViz2 on a remote PC over the network is difficult, especially when subscribing to image topics, because of the added burden on the ROS 2 network transport.

Working with RViz2 in an X11-forwarded window is also difficult because of network speed limitations.

Therefore, if you are running `visual_slam` on a Jetson, it is generally recommended **NOT** to evaluate with live visualization (the first option).

### Live visualization

1. \[Terminal 2\] Open RViz2 from the second terminal:

    ```bash
    rviz2 -d src/isaac_ros_visual_slam/isaac_ros_visual_slam/rviz/realsense.cfg.rviz
    ```

    As you move the camera, the position and orientation of the frames should correspond to how the camera moved relative to its starting pose.
    <div align="center"><img src="../resources/realsense.gif" width="600px"/></div>

### Offline visualization

1. \[Terminal 2\] Save a rosbag file

    Record the output to a rosbag file (along with the input data for later visual inspection).

    ```bash
    export ROSBAG_NAME=courtyard-d435i
    ros2 bag record -o ${ROSBAG_NAME} \
      /camera/imu /camera/accel/metadata /camera/gyro/metadata \
      /camera/infra1/camera_info /camera/infra1/image_rect_raw \
      /camera/infra1/metadata \
      /camera/infra2/camera_info /camera/infra2/image_rect_raw \
      /camera/infra2/metadata \
      /tf_static /tf \
      /visual_slam/status \
      /visual_slam/tracking/odometry \
      /visual_slam/tracking/slam_path /visual_slam/tracking/vo_path \
      /visual_slam/tracking/vo_pose /visual_slam/tracking/vo_pose_covariance \
      /visual_slam/vis/landmarks_cloud /visual_slam/vis/loop_closure_cloud \
      /visual_slam/vis/observations_cloud \
      /visual_slam/vis/pose_graph_edges /visual_slam/vis/pose_graph_edges2 \
      /visual_slam/vis/pose_graph_nodes
    ros2 bag info ${ROSBAG_NAME}
    ```
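
    Recording the two infrared image streams at ~90 Hz consumes disk space quickly, so it can be worth checking the free space on the recording volume before a long session. This is an optional precaution; the actual data rate depends on resolution and recording duration.

    ```bash
    # Show free space on the filesystem backing the current directory.
    df -h .
    ```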

    If you plan to play back the rosbag on a remote machine (PC) for evaluation, send the rosbag file to that machine.

    ```bash
    export IP_PC=192.168.1.100
    export PC_USER=<your-username-on-the-PC>
    scp -r ${ROSBAG_NAME} ${PC_USER}@${IP_PC}:/home/${PC_USER}/workspaces/isaac_ros-dev/
    ```

2. \[Terminal A\] Launch RViz2

    > If you are SSH-ing into the Jetson from your PC, make sure you have enabled X forwarding by adding the `-X` option to the SSH command.
    >
    > ```bash
    > ssh -X ${USERNAME_ON_JETSON}@${IP_JETSON}
    > ```

    Launch the Isaac ROS container.

    ```bash
    isaac_ros_container
    ```

    Run RViz2 with a configuration file for visualizing the set of messages from the Visual SLAM node.

    ```bash
    cd /workspaces/isaac_ros-dev
    rviz2 -d src/isaac_ros_visual_slam/isaac_ros_visual_slam/rviz/vslam_keepall.cfg.rviz
    ```

3. \[Terminal B\] Play back the recorded rosbag

    Attach another terminal to the running container.

    ```bash
    isaac_ros_container
    ```

    Play the recorded rosbag file.

    ```bash
    ros2 bag play ${ROSBAG_NAME}
    ```

    RViz2 should start showing a visualization like the following.
    <div align="center"><img src="../resources/RViz_0217-cube_vslam-keepall.png" width="600px"/></div>