Commit 90f8890

Merge pull request #79 from NVIDIA-ISAAC-ROS/release-dp3
Isaac ROS 0.30.0 (DP3)
2 parents 711e844 + 0e81a94 commit 90f8890

30 files changed: +2003 −176 lines changed

.gitattributes

Lines changed: 3 additions & 0 deletions
@@ -1,3 +1,6 @@
+# Ignore Python files in linguist
+*.py linguist-detectable=false
+
 # Images
 *.gif filter=lfs diff=lfs merge=lfs -text
 *.jpg filter=lfs diff=lfs merge=lfs -text

README.md

Lines changed: 108 additions & 66 deletions
Large diffs are not rendered by default.

docs/elbrus-slam.md

Lines changed: 11 additions & 2 deletions
@@ -18,6 +18,15 @@ At this moment, a connection is added to the PoseGraph which makes a loop from t
 
 The procedure for adding landmarks is designed such that if we do not see a landmark in the place where it was expected, then such landmarks are marked for eventual deletion. This allows you to use Elbrus over changing terrain.
 
+Along with visual data, Elbrus can use Inertial Measurement Unit (IMU) measurements. It automatically switches to the IMU when VO is unable to estimate a pose – for example, when there is dark lighting or long solid surfaces in front of the camera. Using an IMU usually leads to a significant performance improvement in cases of poor visual conditions.
+
+In case of severe degradation of the image input (lights being turned off, dramatic motion blur on a bump while driving, and other possible scenarios), the motion estimation algorithms below ensure acceptable quality for pose tracking:
+
+* The IMU readings integrator provides acceptable pose tracking quality for about 1 second.
+
+* In case of IMU failure, the constant velocity integrator continues to use the last linear and angular velocities reported by Stereo VIO before the failure.
+  This provides acceptable pose tracking quality for about 0.5 seconds.
+
 ## List of Useful Visualizations
 
 * `visual_slam/vis/observations_cloud` - Point cloud for 2D Features
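To make the constant-velocity fallback described in the added lines concrete, here is an illustrative integrator sketch. This is a planar simplification for intuition only, not Elbrus code; all names are hypothetical, and the real integrator works on full 3D poses.

```python
import numpy as np

def constant_velocity_step(position, yaw, body_vel, yaw_rate, dt):
    """Propagate a planar pose by holding the last velocities reported by VIO."""
    # Rotate the body-frame linear velocity into the world frame.
    c, s = np.cos(yaw), np.sin(yaw)
    world_vel = np.array([c * body_vel[0] - s * body_vel[1],
                          s * body_vel[0] + c * body_vel[1]])
    # Integrate position and heading over one time step.
    return position + world_vel * dt, yaw + yaw_rate * dt

# Hold the last reported velocities for 0.5 s at 100 Hz -- the window in which
# the text above says pose tracking quality stays acceptable.
pos, yaw = np.zeros(2), 0.0
for _ in range(50):
    pos, yaw = constant_velocity_step(pos, yaw, np.array([0.5, 0.0]), 0.1, dt=0.01)
print(pos, yaw)
```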
@@ -28,10 +37,10 @@ The procedure for adding landmarks is designed such that if we do not see a land
 
 ## Saving the map
 
-Naturally, we would like to save the stored landmarks and pose graph in a map. We have implemented a ROS2 action to save the map to the disk called `SaveMap`.
+Naturally, we would like to save the stored landmarks and pose graph in a map. We have implemented a ROS 2 action called `SaveMap` to save the map to disk.
 
 ## Loading and Localization in the Map
 
-Once the map has been saved to the disk, it can be used later to localize the robot. To load the map into the memory, we have made a ROS2 action called `LoadMapAndLocalize`. It requires a map file path and a prior pose, which is an initial guess of where the robot is in the map. Given the prior pose and current set of camera frames, Elbrus tries to find the pose of the landmarks in the requested map that matches the current set. If the localization is successful, Elbrus will load the map in the memory. Otherwise, it will continue building a new map.
+Once the map has been saved to disk, it can be used later to localize the robot. To load the map into memory, we have made a ROS 2 action called `LoadMapAndLocalize`. It requires a map file path and a prior pose, which is an initial guess of where the robot is in the map. Given the prior pose and the current set of camera frames, Elbrus tries to find the pose of the landmarks in the requested map that matches the current set. If the localization is successful, Elbrus will load the map into memory. Otherwise, it will continue building a new map.
 
 Both `SaveMap` and `LoadMapAndLocalize` can take some time to complete. Hence, they are designed to be asynchronous to avoid interfering with odometry calculations.
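Because the actions are asynchronous, a client should send its goal without blocking. Below is a minimal rclpy sketch for calling `SaveMap`; the action type and server name match the tutorials later in this commit, and the `map_url` goal field follows the CLI call shown there.

```python
import rclpy
from rclpy.action import ActionClient
from rclpy.node import Node

from isaac_ros_visual_slam_interfaces.action import SaveMap

class SaveMapClient(Node):
    def __init__(self):
        super().__init__('save_map_client')
        self._client = ActionClient(self, SaveMap, '/visual_slam/save_map')

    def send_goal(self, map_url):
        goal = SaveMap.Goal()
        goal.map_url = map_url
        self._client.wait_for_server()
        # send_goal_async returns a future, so odometry keeps running meanwhile;
        # the accepted goal handle can be awaited the same way for the result.
        return self._client.send_goal_async(goal)

def main():
    rclpy.init()
    node = SaveMapClient()
    future = node.send_goal('/path/to/save/the/map')
    rclpy.spin_until_future_complete(node, future)
    node.destroy_node()
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```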

docs/tutorial-isaac-sim.md

Lines changed: 16 additions & 14 deletions
@@ -4,7 +4,9 @@
 
 ## Overview
 
-This tutorial walks you through a pipeline to estimate 3D pose of the camera with [Visual SLAM](https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_visual_slam) using images from Isaac Sim.
+This tutorial walks you through a graph to estimate the 3D pose of the camera with [Visual SLAM](https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_visual_slam) using images from Isaac Sim.
+
+Last validated with [Isaac Sim 2022.2.1](https://docs.omniverse.nvidia.com/app_isaacsim/app_isaacsim/release_notes.html#id1).
 
 ## Tutorial Walkthrough
 
@@ -25,19 +27,19 @@ This tutorial walks you through a pipeline to estimate 3D pose of the camera wit
 ```
 
 4. Install and launch Isaac Sim following the steps in the [Isaac ROS Isaac Sim Setup Guide](https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_common/blob/main/docs/isaac-sim-sil-setup.md)
-5. Open up the Isaac ROS Common USD scene (using the "content" window) located at:
+5. Open up the Isaac ROS Common USD scene (using the *Content* tab) located at:
 
-   `omniverse://localhost/NVIDIA/Assets/Isaac/2022.1/Isaac/Samples/ROS2/Scenario/carter_warehouse_apriltags_worker.usd`.
+   ```text
+   http://omniverse-content-production.s3-us-west-2.amazonaws.com/Assets/Isaac/2022.2.1/Isaac/Samples/ROS2/Scenario/carter_warehouse_apriltags_worker.usd
+   ```
 
    And wait for it to load completely.
-   > **Note:** To use a different server, replace `localhost` with `<your_nucleus_server>`
-6. Go to the stage tab and select `/World/Carter_ROS/ROS_Cameras/ros2_create_camera_right_info`, then in properties tab -> Compute Node -> Inputs -> stereoOffset X change `0` to `-175.92`.
+6. Go to the *Stage* tab and select `/World/Carter_ROS/ROS_Cameras/ros2_create_camera_right_info`, then in the *Property* tab *-> OmniGraph Node -> Inputs -> stereoOffset X* change `0` to `-175.92`.
   <div align="center"><img src="../resources/Isaac_sim_set_stereo_offset.png" width="500px"/></div>
-
-7. Enable the right camera for a stereo image pair. Go to the stage tab and select `/World/Carter_ROS/ROS_Cameras/enable_camera_right`, then tick the `Condition` checkbox.
+7. Enable the right camera for a stereo image pair. Go to the *Stage* tab and select `/World/Carter_ROS/ROS_Cameras/enable_camera_right`, then tick the *Condition* checkbox.
   <div align="center"><img src="../resources/Isaac_sim_enable_stereo.png" width="500px"/></div>
 8. Press **Play** to start publishing data from the Isaac Sim application.
-   <div align="center"><img src="../resources/Isaac_sim_visual_slam.png" width="800px"/></div>
+   <div align="center"><img src="../resources/Isaac_sim_play.png" width="800px"/></div>
 9. In a separate terminal, start `isaac_ros_visual_slam` using the launch files:
 
    ```bash
@@ -67,17 +69,17 @@ This tutorial walks you through a pipeline to estimate 3D pose of the camera wit
 
 ## Saving and using the map
 
-As soon as you start the visual SLAM node, it starts storing the landmarks and the pose graph. You can save them in a map and store the map onto a disk. Make a call to the `SaveMap` ROS2 Action with the following command:
+As soon as you start the visual SLAM node, it starts storing the landmarks and the pose graph. You can save them in a map and store the map on disk. Make a call to the `SaveMap` ROS 2 Action with the following command:
 
 ```bash
 ros2 action send_goal /visual_slam/save_map isaac_ros_visual_slam_interfaces/action/SaveMap "{map_url: /path/to/save/the/map}"
 ```
 
-</br>
+<br>
 <div align="center"><img src="../resources/Save_map.png" width="400px"/></div>
-</br>
+<br>
 <div align="center"><img src="../resources/RViz_isaac_sim_mapping.png" width="800px" alt="Sample run before saving the map" title="Sample run before saving the map."/></div>
-</br>
+<br>
 
 Now, you will try to load and localize in the previously saved map. First, stop the `visual_slam` node launched for creating and saving the map, then relaunch it.
 
@@ -88,7 +90,7 @@ ros2 action send_goal /visual_slam/load_map_and_localize isaac_ros_visual_slam_i
 ```
 
 <div align="center"><img src="../resources/Load_and_localize.png" width="400px"/></div>
-</br>
+<br>
 
 Once the above step returns success, you have successfully loaded and localized your robot in the map. If it results in failure, it is possible that the current landmarks from the approximate start location do not match the stored landmarks, and you need to provide another valid value.
 
@@ -98,7 +100,7 @@ Once the above step returns success, you have successfully loaded and localized
 <figcaption>Before Localization</figcaption>
 </figure>
 </div>
-</br>
+<br>
 <div align="center">
 <figure class="image">
 <img src="../resources/After_localization.png" width="600px" alt="After localization" title="After localization."/>

docs/tutorial-realsense.md

Lines changed: 267 additions & 0 deletions
@@ -0,0 +1,267 @@
# Tutorial for Visual SLAM using a RealSense camera with integrated IMU

<div align="center"><img src="../resources/realsense.gif" width="600px"/></div>

## Overview

This tutorial walks you through setting up [Isaac ROS Visual SLAM](https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_visual_slam) with a [RealSense camera](https://www.intel.com/content/www/us/en/architecture-and-technology/realsense-overview.html).

> **Note**: The [launch file](../isaac_ros_visual_slam/launch/isaac_ros_visual_slam_realsense.launch.py) provided in this tutorial is designed for a RealSense camera with an integrated IMU. If you want to run this tutorial with a RealSense camera without an IMU (like the RealSense D435), change the `enable_imu` param in the launch file to `False`.

<!-- Split blockquote -->
> **Note**: This tutorial requires a compatible RealSense camera from the list available [here](https://github.com/NVIDIA-ISAAC-ROS/.github/blob/main/profile/realsense-setup.md#camera-compatibility).
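For readers who want to see what flipping that parameter looks like, here is a minimal ROS 2 launch sketch. The package and parameter names come from this tutorial; the executable name is a placeholder, and the shipped launch file composes the node differently.

```python
# Minimal sketch of toggling `enable_imu` in a ROS 2 launch file.
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    visual_slam_node = Node(
        package='isaac_ros_visual_slam',
        executable='visual_slam_node',  # placeholder; see the shipped launch file
        parameters=[{
            'enable_imu': False,  # False for cameras without an IMU (e.g. D435)
        }],
    )
    return LaunchDescription([visual_slam_node])
```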
## Tutorial Walkthrough - VSLAM execution

1. Complete the [RealSense setup tutorial](https://github.com/NVIDIA-ISAAC-ROS/.github/blob/main/profile/realsense-setup.md).

2. Complete the [Quickstart section](../README.md#quickstart) in the main README.

3. \[Terminal 1\] Run the `realsense-camera` node and the `visual_slam` node.

   Make sure you have your RealSense camera attached to the system, and then start the Isaac ROS container.

   ```bash
   isaac_ros_container
   ```

   > Or if you did not add the command in [step 1-3 of the quickstart section](../README.md#quickstart):
   >
   > ```bash
   > cd ${ISAAC_ROS_WS}/src/isaac_ros_common && \
   >   ./scripts/run_dev.sh ${ISAAC_ROS_WS}
   > ```

4. \[Terminal 1\] Inside the container, build and source the workspace:

   ```bash
   cd /workspaces/isaac_ros-dev && \
     colcon build --symlink-install && \
     source install/setup.bash
   ```

5. \[Terminal 1\] Run the launch file, which starts the example, and wait 5 seconds:

   ```bash
   ros2 launch isaac_ros_visual_slam isaac_ros_visual_slam_realsense.launch.py
   ```

6. \[Terminal 2\] Attach a second terminal to check the operation.

   Attach another terminal to the running container for issuing other ROS 2 commands.

   ```bash
   isaac_ros_container
   ```

   First, check that you can see all the expected ROS 2 topics.

   ```bash
   ros2 topic list
   ```

   > Output example:
   >
   > ```bash
   > /camera/accel/imu_info
   > /camera/accel/metadata
   > /camera/accel/sample
   > /camera/extrinsics/depth_to_accel
   > /camera/extrinsics/depth_to_gyro
   > /camera/extrinsics/depth_to_infra1
   > /camera/extrinsics/depth_to_infra2
   > /camera/gyro/imu_info
   > /camera/gyro/metadata
   > /camera/gyro/sample
   > /camera/imu
   > /camera/infra1/camera_info
   > /camera/infra1/image_rect_raw
   > /camera/infra1/image_rect_raw/compressed
   > /camera/infra1/image_rect_raw/compressedDepth
   > /camera/infra1/image_rect_raw/theora
   > /camera/infra1/metadata
   > /camera/infra2/camera_info
   > /camera/infra2/image_rect_raw
   > /camera/infra2/image_rect_raw/compressed
   > /camera/infra2/image_rect_raw/compressedDepth
   > /camera/infra2/image_rect_raw/theora
   > /camera/infra2/metadata
   > /parameter_events
   > /rosout
   > /tf
   > /tf_static
   > /visual_slam/imu
   > /visual_slam/status
   > /visual_slam/tracking/odometry
   > /visual_slam/tracking/slam_path
   > /visual_slam/tracking/vo_path
   > /visual_slam/tracking/vo_pose
   > /visual_slam/tracking/vo_pose_covariance
   > /visual_slam/vis/gravity
   > /visual_slam/vis/landmarks_cloud
   > /visual_slam/vis/localizer
   > /visual_slam/vis/localizer_loop_closure_cloud
   > /visual_slam/vis/localizer_map_cloud
   > /visual_slam/vis/localizer_observations_cloud
   > /visual_slam/vis/loop_closure_cloud
   > /visual_slam/vis/observations_cloud
   > /visual_slam/vis/pose_graph_edges
   > /visual_slam/vis/pose_graph_edges2
   > /visual_slam/vis/pose_graph_nodes
   > /visual_slam/vis/velocity
   > ```

   Check the output frequency of the `realsense-camera` node.

   ```bash
   ros2 topic hz /camera/infra1/image_rect_raw --window 20
   ```

   > Example output:
   >
   > ```bash
   > average rate: 89.714
   >   min: 0.011s max: 0.011s std dev: 0.00025s window: 20
   > average rate: 90.139
   >   min: 0.010s max: 0.012s std dev: 0.00038s window: 20
   > average rate: 89.955
   >   min: 0.011s max: 0.011s std dev: 0.00020s window: 20
   > average rate: 89.761
   >   min: 0.009s max: 0.013s std dev: 0.00074s window: 20
   > ```
   >
   > Press `Ctrl` + `C` to stop the output.

   You can also check the frequency of the IMU topic.

   ```bash
   ros2 topic hz /camera/imu --window 20
   ```

   > Example output:
   >
   > ```bash
   > average rate: 199.411
   >   min: 0.004s max: 0.006s std dev: 0.00022s window: 20
   > average rate: 199.312
   >   min: 0.004s max: 0.006s std dev: 0.00053s window: 20
   > average rate: 200.409
   >   min: 0.005s max: 0.005s std dev: 0.00007s window: 20
   > average rate: 200.173
   >   min: 0.004s max: 0.006s std dev: 0.00028s window: 20
   > ```

   Lastly, check that you are getting output from the `visual_slam` node at the same rate as the input.

   ```bash
   ros2 topic hz /visual_slam/tracking/odometry --window 20
   ```

   > Example output:
   >
   > ```bash
   > average rate: 58.086
   >   min: 0.002s max: 0.107s std dev: 0.03099s window: 20
   > average rate: 62.370
   >   min: 0.001s max: 0.109s std dev: 0.03158s window: 20
   > average rate: 90.559
   >   min: 0.009s max: 0.013s std dev: 0.00066s window: 20
   > average rate: 85.612
   >   min: 0.002s max: 0.100s std dev: 0.02079s window: 20
   > average rate: 90.032
   >   min: 0.010s max: 0.013s std dev: 0.00059s window: 20
   > ```
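If you would rather check the odometry rate programmatically, the following rclpy sketch approximates `ros2 topic hz ... --window 20`. It assumes `/visual_slam/tracking/odometry` publishes `nav_msgs/msg/Odometry`, which this tutorial does not state explicitly.

```python
import time

import rclpy
from rclpy.node import Node
from nav_msgs.msg import Odometry  # assumed message type for the odometry topic

class RateChecker(Node):
    """Rough equivalent of `ros2 topic hz <topic> --window 20`."""

    def __init__(self, topic='/visual_slam/tracking/odometry', window=20):
        super().__init__('rate_checker')
        self._stamps = []
        self._window = window
        self.create_subscription(Odometry, topic, self._callback, 10)

    def _callback(self, _msg):
        # Keep the arrival times of the last `window` messages.
        self._stamps.append(time.monotonic())
        self._stamps = self._stamps[-self._window:]
        if len(self._stamps) == self._window:
            span = self._stamps[-1] - self._stamps[0]
            self.get_logger().info(f'average rate: {(self._window - 1) / span:.3f}')

def main():
    rclpy.init()
    rclpy.spin(RateChecker())

if __name__ == '__main__':
    main()
```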
## Tutorial Walkthrough - Visualization

At this point, you have two options for checking the `visual_slam` output.

- **Live visualization**: Run RViz2 live while the `realsense-camera` and `visual_slam` nodes are running.
- **Offline visualization**: Record a rosbag file and check the recorded data offline (possibly on a different machine).

Running RViz2 on a remote PC over the network is tricky, and it becomes very difficult when you have image message topics to subscribe to, due to the added burden on the ROS 2 network transport.

Working with RViz2 in an X11-forwarded window is also difficult because of network speed limitations.

Therefore, if you are running `visual_slam` on a Jetson, it is generally recommended **NOT** to evaluate with live visualization (the first option).

### Live visualization

1. \[Terminal 2\] Open RViz2 from the second terminal:

   ```bash
   rviz2 -d src/isaac_ros_visual_slam/isaac_ros_visual_slam/rviz/realsense.cfg.rviz
   ```

   As you move the camera, the position and orientation of the frames should correspond to how the camera moved relative to its starting pose.
   <div align="center"><img src="../resources/realsense.gif" width="600px"/></div>

### Offline visualization

1. \[Terminal 2\] Save a rosbag file.

   Record the output in your rosbag file (along with the input data) for later visual inspection.

   ```bash
   export ROSBAG_NAME=courtyard-d435i
   ros2 bag record -o ${ROSBAG_NAME} \
     /camera/imu /camera/accel/metadata /camera/gyro/metadata \
     /camera/infra1/camera_info /camera/infra1/image_rect_raw \
     /camera/infra1/metadata \
     /camera/infra2/camera_info /camera/infra2/image_rect_raw \
     /camera/infra2/metadata \
     /tf_static /tf \
     /visual_slam/status \
     /visual_slam/tracking/odometry \
     /visual_slam/tracking/slam_path /visual_slam/tracking/vo_path \
     /visual_slam/tracking/vo_pose /visual_slam/tracking/vo_pose_covariance \
     /visual_slam/vis/landmarks_cloud /visual_slam/vis/loop_closure_cloud \
     /visual_slam/vis/observations_cloud \
     /visual_slam/vis/pose_graph_edges /visual_slam/vis/pose_graph_edges2 \
     /visual_slam/vis/pose_graph_nodes
   ros2 bag info ${ROSBAG_NAME}
   ```

   If you plan to run the rosbag on a remote machine (PC) for evaluation, you can send the rosbag file to that machine.

   ```bash
   export IP_PC=192.168.1.100
   export PC_USER=<user_name_on_pc>
   scp -r ${ROSBAG_NAME} ${PC_USER}@${IP_PC}:/home/${PC_USER}/workspaces/isaac_ros-dev/
   ```

2. \[Terminal A\] Launch RViz2.

   > If you are SSH-ing into the Jetson from your PC, make sure you enable X forwarding by adding the `-X` option to the SSH command.
   >
   > ```bash
   > ssh -X ${USERNAME_ON_JETSON}@${IP_JETSON}
   > ```

   Launch the Isaac ROS container.

   ```bash
   isaac_ros_container
   ```

   Run RViz2 with a configuration file for visualizing the set of messages from the Visual SLAM node.

   ```bash
   cd /workspaces/isaac_ros-dev
   rviz2 -d src/isaac_ros_visual_slam/isaac_ros_visual_slam/rviz/vslam_keepall.cfg.rviz
   ```

3. \[Terminal B\] Play back the recorded rosbag.

   Attach another terminal to the running container.

   ```bash
   isaac_ros_container
   ```

   Play the recorded rosbag file.

   ```bash
   ros2 bag play ${ROSBAG_NAME}
   ```

   RViz should start showing a visualization like the following.
   <div align="center"><img src="../resources/RViz_0217-cube_vslam-keepall.png" width="600px"/></div>
