# Isaac ROS Visual SLAM

- Hardware-accelerated, simultaneous localization and mapping (SLAM) using stereo visual inertial odometry (SVIO).
+ NVIDIA-accelerated, simultaneous localization and mapping (SLAM) using stereo
+ visual inertial odometry (SVIO).

<div align="center"><a class="reference internal image-reference" href="https://media.githubusercontent.com/media/NVIDIA-ISAAC-ROS/.github/main/resources/isaac_ros_docs/repositories_and_packages/isaac_ros_visual_slam/cuvslam_ros_3.gif/"><img alt="image" src="https://media.githubusercontent.com/media/NVIDIA-ISAAC-ROS/.github/main/resources/isaac_ros_docs/repositories_and_packages/isaac_ros_visual_slam/cuvslam_ros_3.gif/" width="800px"/></a></div>

@@ -16,32 +17,32 @@ Jetson](https://gateway.on24.com/wcc/experience/elitenvidiabrill/1407606/3998202

## Overview

- [Isaac ROS Visual SLAM](https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_visual_slam) provides a high-performance, best-in-class ROS 2 package
+ [Isaac ROS Visual SLAM](https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_visual_slam)
+ provides a high-performance, best-in-class ROS 2 package
for VSLAM (visual simultaneous localization and mapping). This package
- uses a stereo camera with an IMU to estimate odometry as an input to
- navigation. It is GPU accelerated to provide real-time, low-latency
- results in a robotics application. VSLAM provides an additional odometry
- source for mobile robots (ground based) and can be the primary odometry
- source for drones.
+ uses one or more stereo cameras and optionally an IMU to estimate
+ odometry as an input to navigation. It is GPU accelerated to provide
+ real-time, low-latency results in a robotics application. VSLAM
+ provides an additional odometry source for mobile robots
+ (ground-based) and can be the primary odometry source for drones.

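The package is consumed like any other ROS 2 component, so a short launch file is usually all that is needed to bring it up next to a camera driver. The sketch below is illustrative only: the plugin name `nvidia::isaac_ros::visual_slam::VisualSlamNode` and the parameter and topic names shown are assumptions that should be checked against the Isaac ROS Visual SLAM documentation for your release and camera setup.

```python
# Illustrative launch sketch, not the official launch file. The plugin name,
# parameters, and topic remappings below are assumptions; verify them against
# the Isaac ROS Visual SLAM documentation for your release.
import launch
from launch_ros.actions import ComposableNodeContainer
from launch_ros.descriptions import ComposableNode


def generate_launch_description():
    visual_slam_node = ComposableNode(
        package='isaac_ros_visual_slam',
        plugin='nvidia::isaac_ros::visual_slam::VisualSlamNode',  # assumed plugin name
        name='visual_slam_node',
        parameters=[{
            'enable_imu_fusion': True,  # assumed parameter: fuse IMU with stereo VO
            'num_cameras': 2,           # assumed parameter: one stereo pair = 2 cameras
        }],
        remappings=[
            # Assumed topic names; remap to match your camera driver.
            ('visual_slam/image_0', '/left/image_rect'),
            ('visual_slam/camera_info_0', '/left/camera_info_rect'),
            ('visual_slam/image_1', '/right/image_rect'),
            ('visual_slam/camera_info_1', '/right/camera_info_rect'),
            ('visual_slam/imu', '/imu'),
        ],
    )

    container = ComposableNodeContainer(
        name='visual_slam_launch_container',
        namespace='',
        package='rclcpp_components',
        executable='component_container_mt',  # multithreaded component container
        composable_node_descriptions=[visual_slam_node],
        output='screen',
    )

    return launch.LaunchDescription([container])
```

The resulting visual odometry can then be fed to Nav2 or any other consumer of odometry in the navigation stack.
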
VSLAM provides a method for visually estimating the position of a robot
relative to its start position, known as VO (visual odometry). This is
particularly useful in environments where GPS is not available (such as
indoors) or intermittent (such as urban locations with structures
blocking line of sight to GPS satellites). This method is designed to
- use left and right stereo camera frames and an IMU (inertial measurement
- unit) as input. It uses input stereo image pairs to find matching key
- points in the left and right images; using the baseline between the left
- and right camera, it can estimate the distance to the key point. Using
- two consecutive input stereo image pairs, VSLAM can track the 3D motion
- of key points between the two consecutive images to estimate the 3D
- motion of the camera-which is then used to compute odometry as an output
+ use multiple stereo camera frames and an IMU (inertial measurement
+ unit) as input. It uses stereo image pairs to find matching key
+ points. Using the baseline of each stereo pair, it can estimate
+ the distance to the key point. Using consecutive images, VSLAM
+ can track the motion of key points to estimate the 3D motion of the
+ camera, which is then used to compute odometry as an output
to navigation. Compared to the classic approach to VSLAM, this method
uses GPU acceleration to find and match more key points in real-time,
with fine tuning to minimize overall reprojection error.
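The geometric idea in the paragraph above (depth of each key point from the stereo baseline, then camera motion from how the triangulated points move between consecutive frames) can be made concrete with a short sketch. This is a simplified rectified pinhole-stereo illustration plus a least-squares rigid alignment, not the cuVSLAM implementation, and the numbers in the example are made up.

```python
# Simplified illustration of the geometry behind stereo VO; not the cuVSLAM code.
import numpy as np


def triangulate(u_left, u_right, v, fx, fy, cx, cy, baseline_m):
    """Back-project a matched key point from a rectified stereo pair.

    disparity = u_left - u_right (pixels); depth Z = fx * baseline / disparity.
    """
    disparity = u_left - u_right
    z = fx * baseline_m / disparity
    x = (u_left - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])


# Example: a key point with 100 px disparity, 12 cm baseline, fx = fy = 800 px
p_prev = triangulate(740.0, 640.0, 360.0, 800.0, 800.0, 640.0, 360.0, 0.12)
print(p_prev)  # about 0.96 m in front of the camera


def estimate_motion(points_prev, points_curr):
    """Rigid transform (R, t) aligning two sets of tracked 3D key points.

    Tracking the same points across two consecutive frames gives two point
    clouds; the transform that best aligns them (Kabsch/Procrustes) is the
    relative motion between the two camera poses.
    """
    c_prev, c_curr = points_prev.mean(axis=0), points_curr.mean(axis=0)
    h = (points_prev - c_prev).T @ (points_curr - c_curr)
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))  # guard against reflections
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = c_curr - r @ c_prev
    return r, t  # accumulate these per frame to get visual odometry
```

Minimizing reprojection error, which the GPU-accelerated pipeline does over many key points in real time, amounts to refining the estimated motion and 3D points so that reprojected key points land as close as possible to their measured pixel locations.
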
- Key points depend on distinctive features in the left and right camera
- image that can be repeatedly detected with changes in size, orientation,
+ Key points depend on distinctive features in the images
+ that can be repeatedly detected with changes in size, orientation,
perspective, lighting, and image noise. In some instances, the number of
key points may be limited or entirely absent; for example, if the camera
field of view is only looking at a large solid colored wall, no key
@@ -100,15 +101,19 @@ outdoor scenes.

## Performance

- | Sample Graph | Input Size | AGX Orin | Orin NX | Orin Nano 8GB | x86_64 w/ RTX 4060 Ti |
- | ------------ | ---------- | -------- | ------- | ------------- | --------------------- |
- | [Visual SLAM Node](https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_benchmark/blob/main/scripts/isaac_ros_visual_slam_node.py) | 720p | [228 fps](https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_benchmark/blob/main/results/isaac_ros_visual_slam_node-agx_orin.json)<br />40 ms | [127 fps](https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_benchmark/blob/main/results/isaac_ros_visual_slam_node-orin_nx.json)<br />74 ms | [113 fps](https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_benchmark/blob/main/results/isaac_ros_visual_slam_node-orin_nano.json)<br />65 ms | [456 fps](https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_benchmark/blob/main/results/isaac_ros_visual_slam_node-nuc_4060ti.json)<br />37 ms |
+ | Sample Graph | Input Size | Nova Carter |
+ | ------------ | ---------- | ----------- |
+ | [Multicam Visual SLAM Live Graph](https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_benchmark/blob/main/benchmarks/isaac_ros_perceptor_nova_benchmark/scripts/isaac_ros_visual_slam_graph.py)<br />4 Hawk Cameras | 1200p | [30.0 fps](https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_benchmark/blob/main/results/isaac_ros_4_hawk_vslam_graph-carter_v2.4.json) |
+
+ > [!Note]
+ > This benchmark can only be run on a [Nova Orin](https://developer.nvidia.com/isaac/nova-orin) compatible system.

---

## Documentation

- Please visit the [Isaac ROS Documentation](https://nvidia-isaac-ros.github.io/repositories_and_packages/isaac_ros_visual_slam/index.html) to learn how to use this repository.
+ Please visit the [Isaac ROS Documentation](https://nvidia-isaac-ros.github.io/repositories_and_packages/isaac_ros_visual_slam/index.html) to learn how to use
+ this repository.

---

@@ -124,4 +129,4 @@ Please visit the [Isaac ROS Documentation](https://nvidia-isaac-ros.github.io/re

## Latest

- Update 2023-10-18: Improved stability.
+ Update 2024-05-30: Add support for multi-cam VIO