A Collection of LiDAR-Camera-Calibration Papers, Toolboxes and Notes.
Outline
For applications such as autonomous driving, robotics, navigation systems, and 3-D scene reconstruction, the same scene is often captured with both a lidar and a camera. To interpret the objects in a scene accurately, the lidar and camera outputs must be fused. Lidar-camera calibration estimates a rigid transformation (the extrinsics: rotation + translation, 6 DoF) that establishes correspondences between points in the lidar's 3-D coordinate frame and pixels in the camera's image plane.
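Once the extrinsics (R, t) are known, they are combined with the camera intrinsic matrix K to project lidar points onto the image. A minimal numpy sketch, assuming an ideal pinhole camera with no lens distortion (the function name and argument shapes are illustrative):

```python
import numpy as np

def project_lidar_to_image(points_lidar, R, t, K):
    """Project Nx3 lidar points into pixel coordinates.

    R (3x3), t (3,): extrinsics mapping the lidar frame to the camera frame.
    K (3x3): pinhole intrinsic matrix; lens distortion is ignored here.
    Returns pixel coordinates for points in front of the camera,
    plus a boolean mask marking which input points those are."""
    pts_cam = points_lidar @ R.T + t      # lidar frame -> camera frame
    in_front = pts_cam[:, 2] > 0          # keep points ahead of the image plane
    uvw = pts_cam[in_front] @ K.T         # homogeneous pixel coordinates
    uv = uvw[:, :2] / uvw[:, 2:3]         # perspective division
    return uv, in_front
```

A point on the optical axis lands at the principal point (K[0, 2], K[1, 2]); points behind the camera are masked out rather than projected.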
C: camera, L: LiDAR, a: automatic, m: manual

| Paper | Feature | Optimization | Toolbox | Note |
|---|---|---|---|---|
| LiDAR and Camera Calibration Using Motions Estimated by Sensor Fusion Odometry, 2018 | C: motion (VO), L: motion (ICP) | hand-eye calibration | * | * |
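Motion-based methods like the paper above need no calibration target: each sensor estimates its own sequence of relative motions, and the extrinsic X is recovered from the hand-eye constraint A X = X B, where A and B are paired camera and lidar motions. A minimal numpy sketch of the classic quaternion-SVD solution, under simplifying assumptions (all helper names are illustrative, motions are treated as noise-free and equally weighted, and the quaternion extraction assumes rotation angles well below pi; real pipelines refine this estimate):

```python
import numpy as np

def mat_to_quat(R):
    # Rotation matrix -> unit quaternion [w, x, y, z] with w > 0.
    # Simple trace formula; assumes the rotation angle is well below pi.
    w = 0.5 * np.sqrt(1.0 + R[0, 0] + R[1, 1] + R[2, 2])
    return np.array([w, (R[2, 1] - R[1, 2]) / (4 * w),
                        (R[0, 2] - R[2, 0]) / (4 * w),
                        (R[1, 0] - R[0, 1]) / (4 * w)])

def quat_to_mat(q):
    w, x, y, z = q / np.linalg.norm(q)
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])

def qmat_left(q):
    # L(q) p = q (x) p, quaternion product as a matrix on the right operand
    w, x, y, z = q
    return np.array([[w, -x, -y, -z], [x,  w, -z,  y],
                     [y,  z,  w, -x], [z, -y,  x,  w]])

def qmat_right(q):
    # R(q) p = p (x) q, quaternion product as a matrix on the left operand
    w, x, y, z = q
    return np.array([[w, -x, -y, -z], [x,  w,  z, -y],
                     [y, -z,  w,  x], [z,  y, -x,  w]])

def hand_eye(motions_a, motions_b):
    """Solve A X = X B for X = (R_X, t_X) from paired relative
    motions (R, t), e.g. camera VO motions and lidar ICP motions."""
    # Rotation: stack (L(q_A) - R(q_B)) q_X = 0, take the SVD null vector.
    M = np.vstack([qmat_left(mat_to_quat(Ra)) - qmat_right(mat_to_quat(Rb))
                   for (Ra, _), (Rb, _) in zip(motions_a, motions_b)])
    Rx = quat_to_mat(np.linalg.svd(M)[2][-1])
    # Translation: (R_A - I) t_X = R_X t_B - t_A, least squares over all pairs.
    C = np.vstack([Ra - np.eye(3) for Ra, _ in motions_a])
    d = np.concatenate([Rx @ tb - ta
                        for (_, ta), (_, tb) in zip(motions_a, motions_b)])
    tx = np.linalg.lstsq(C, d, rcond=None)[0]
    return Rx, tx
```

At least two motion pairs with non-parallel rotation axes are needed, otherwise the rotation null space is not unique and the translation system is rank-deficient.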

| Toolbox | Introduction | Note |
|---|---|---|
| Apollo sensor calibration tools | targetless method, no source code | CN |
| Autoware camera lidar calibrator | pick points manually, PnP | * |
| Autoware calibration camera lidar | checkerboard, similar to LCCT | CN |
| livox_camera_lidar_calibration | pick points manually, PnP | * |
| OpenCalib | target-based, target-less, manual | OpenCalib: A Multi-sensor Calibration Toolbox for Autonomous Driving |
| tier4/CalibrationTools | target-based, manual | * |
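Several of the toolboxes above (Autoware camera lidar calibrator, livox_camera_lidar_calibration) work from manually picked 3-D/2-D point pairs and then solve a PnP problem for the extrinsics. A minimal Direct Linear Transform sketch of that step, assuming exact correspondences, known intrinsics K, and no distortion; production code usually calls OpenCV's cv2.solvePnP, which also refines the pose iteratively:

```python
import numpy as np

def pnp_dlt(object_pts, image_pts, K):
    """Recover the pose (R, t) from n >= 6 non-degenerate 3-D/2-D
    correspondences via the Direct Linear Transform (illustrative sketch)."""
    n = len(object_pts)
    A = np.zeros((2 * n, 12))
    for i, ((x, y, z), (u, v)) in enumerate(zip(object_pts, image_pts)):
        X = np.array([x, y, z, 1.0])
        A[2 * i, 0:4] = X          # u * (p3 . X) = p1 . X
        A[2 * i, 8:12] = -u * X
        A[2 * i + 1, 4:8] = X      # v * (p3 . X) = p2 . X
        A[2 * i + 1, 8:12] = -v * X
    P = np.linalg.svd(A)[2][-1].reshape(3, 4)  # null vector -> projection matrix
    B = np.linalg.inv(K) @ P                   # B = s [R | t], unknown scale s
    s = np.cbrt(np.linalg.det(B[:, :3]))       # signed scale: det(s R) = s^3
    B /= s
    U, _, Vt = np.linalg.svd(B[:, :3])         # re-orthonormalize the rotation
    return U @ Vt, B[:, 3]
```

The signed cube root fixes the sign ambiguity of the SVD null vector, and the final SVD projects the noisy 3x3 block back onto the rotation group.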
