# Hardware Docker images

Stop recompiling every LiDAR or camera driver in every workspace. Build once, deploy the Docker container anywhere, and move on.
When we want to publish sensor data to ROS2, sometimes there is a ready-made package in the OS repositories. When there is not, we must build vendor drivers and ROS2 wrappers from source. That is slow, fragile, and hard to reproduce across laptops, CI, and robots.
The approach here is to package each driver (code + native deps + ROS2 wrapper) inside its own Docker image. That way a project only needs to build (or pull) a known image and run it.
Example docker-compose service to launch a RoboSense LiDAR:
```yaml
services:
  robosense_srvc:
    image: ${IMG_ID}
    network_mode: host
    privileged: false
    ipc: host
    volumes:
      - ./example_1.front_robosense_helios_16p_config.yaml:/tmp/config.yaml:ro
      # - ./example_2.front_back_robosense_helios_16p_config.yaml:/tmp/config.yaml:ro
      - ./cyclonedds_config.xml:/tmp/cyclonedds_config.xml:ro
    environment:
      TERM: xterm-256color
      RCUTILS_LOGGING_BUFFERED_STREAM: "0"
      RCUTILS_LOGGING_USE_STDOUT: "1"
      RCUTILS_COLORIZED_OUTPUT: "1"
      RCUTILS_CONSOLE_OUTPUT_FORMAT: "[{severity} {time}] [{name}]: {message} ({file_name}:L{line_number})"
      RMW_IMPLEMENTATION: rmw_cyclonedds_cpp
      CYCLONEDDS_URI: file:///tmp/cyclonedds_config.xml
      ROS_DOMAIN_ID: "11"
      NAMESPACE: test                # Optional
      ROBOT_NAME: robot              # Required
      CONFIG_FILE: /tmp/config.yaml  # Required
      NODE_OPTIONS: "name=robosense_lidar_ros2_handler,output=screen,emulate_tty=True,respawn=False,respawn_delay=0.0"
      LOGGING_OPTIONS: "log-level=info,disable-stdout-logs=true,disable-rosout-logs=false,disable-external-lib-logs=true"
    command: ["ros2", "launch", "rslidar_sdk", "sensor.launch.py"]
```

This repository is organized to provide a modular approach to building Docker images for various hardware sensors in ROS2 environments. Here's a conceptual overview of what you'll find:
- Base Docker Files: Core scripts and configurations for setting up the foundational Docker environment, including system dependencies, ROS2 installation, and build tools. These are reusable across different sensor types.
- Example Multi-Sensor: A demonstration of how to combine multiple sensors into a single Docker image, showing best practices for multi-sensor setups and custom Dockerfile creation.
- Sensors Directory: A collection of sensor-specific implementations, categorized by sensor type (cameras, IMUs, lidars). Each sensor folder contains:
  - Installation scripts for vendor drivers and dependencies
  - Compilation scripts for building ROS2 wrappers
  - Standardized launch files for consistent sensor integration
  - Example configurations and Docker setups to get started quickly
ROS2 distributions currently supported (see base_docker_files/ros_distros.yaml):
| ROS2 codename | Ubuntu base | Notes |
|---|---|---|
| Humble Hawksbill | 22.04 LTS | EOL May 2027 |
| Jazzy Jalisco | 24.04 LTS | EOL May 2029 |
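The distro-to-base pairing from the table above can be expressed as a small lookup, as a build script might use it. This is an illustrative sketch only; the names below are hypothetical and not taken from `base_docker_files/ros_distros.yaml`:

```python
# Illustrative mapping of supported ROS2 distros to their Ubuntu base
# image tags, mirroring the table above. Function and variable names
# are hypothetical, not copied from this repository.
ROS_UBUNTU_BASES = {
    "humble": "ubuntu:22.04",
    "jazzy": "ubuntu:24.04",
}

def base_image_for(distro: str) -> str:
    """Return the Ubuntu base image tag for a supported ROS2 distro."""
    if distro not in ROS_UBUNTU_BASES:
        raise ValueError(f"unsupported ROS2 distro: {distro}")
    return ROS_UBUNTU_BASES[distro]

print(base_image_for("jazzy"))  # ubuntu:24.04
```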
Each sensor folder includes an `examples/` directory with a Dockerfile and a `build.py` that builds a single-driver image using the scripts under that sensor folder.

Examples:

```shell
python3 sensors/cameras/realsense/examples/build.py jazzy
python3 sensors/lidars/robosense/examples/build.py jazzy
python3 sensors/lidars/livox_gen2/examples/build.py jazzy
python3 sensors/imus/umx/examples/build.py jazzy
```

See `example_multi_sensor/README.md`. The example shows how a user can create a custom Dockerfile that
clones this repository and reuses existing setup.sh and compile.sh scripts for multiple sensors.
```shell
python3 example_multi_sensor/build.py ubuntu:24.04 jazzy multi_sensor:jazzy
```

Each image follows a common structure:
- Base system install (`base_docker_files/install_base_system.sh`).
- ROS2 install (`base_docker_files/install_ros.sh`).
- Sensor setup (`sensors/<type>/<name>/setup.sh`).
- rosdep + colcon mixin/metadata (`base_docker_files/rosdep_init_update_install.sh` and `base_docker_files/colcon_mixin_metadata.sh`).
- Compile (`sensors/<type>/<name>/compile.sh`).
- Install entrypoint (`base_docker_files/entrypoint.sh`).
The example Dockerfiles in examples/ follow this pattern and can be copied to new projects.
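As a rough sketch, a single-sensor Dockerfile following this pattern could look like the one below. The ARG names and copy destinations are illustrative assumptions, not copied from this repository; prefer the real Dockerfiles under `examples/`:

```dockerfile
# Hypothetical sketch of the build pattern described above.
ARG BASE_IMAGE=ubuntu:24.04
FROM ${BASE_IMAGE}
ARG ROS_DISTRO=jazzy

# Bring in the shared base scripts and one sensor's scripts.
COPY base_docker_files/ /opt/hw/base_docker_files/
COPY sensors/lidars/robosense/ /opt/hw/sensor/

# Run the stages in the order listed above.
RUN /opt/hw/base_docker_files/install_base_system.sh \
 && /opt/hw/base_docker_files/install_ros.sh "${ROS_DISTRO}" \
 && /opt/hw/sensor/setup.sh \
 && /opt/hw/base_docker_files/rosdep_init_update_install.sh \
 && /opt/hw/base_docker_files/colcon_mixin_metadata.sh \
 && /opt/hw/sensor/compile.sh

ENTRYPOINT ["/opt/hw/base_docker_files/entrypoint.sh"]
```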
The entrypoint script sources `${IMAGE_MAIN_USER_WORKSPACE}/install/setup.bash` so ROS2 packages can be found:

```shell
#!/usr/bin/env bash
ws="${IMAGE_MAIN_USER_WORKSPACE:-}"
[ -n "${ws}" ] && [ -r "${ws}/install/setup.bash" ] && . "${ws}/install/setup.bash"
exec "$@"
```

Each sensor provides a sensor.launch.py that wraps the vendor driver in a consistent launch entrypoint.
The Dockerfiles install that launch file into the driver package so it can be run with ros2 launch.
Check each sensor folder for examples of how to use the launch file and which parameters it supports, since these vary slightly between sensors.
This repository targets ROS2 only.
To connect an Ethernet LiDAR for the first time and discover its IP address:

- Connect the sensor to your computer with an RJ45 cable (directly or through a hub/switch).
- Run `ifconfig` and identify the network interface connected to the sensor.
- Note the MAC address of that interface (example: `enx00e04c68020d` with MAC `00:e0:4c:68:02:0d`).
- Open the network manager and select that interface.
- Optionally verify you selected the correct interface by checking that its MAC address matches the one from `ifconfig`.
- Set that interface to Automatic (DHCP).
- Open a terminal and start Wireshark. If needed, install it first: `sudo apt-get install -y wireshark`.
- In Wireshark, select the interface and start traffic capture.
- Inspect captured packets and find messages coming from the LiDAR that show its IP address.
- Configure your computer interface with an IP in the same subnet/range as the LiDAR.
- Verify connectivity with `ping <LiDAR_IP>`.
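Before reaching for `ping`, a quick way to sanity-check that your interface address and the LiDAR address actually share a subnet is Python's standard-library `ipaddress` module. The addresses below are examples, not defaults from any vendor driver:

```python
# Check that a host interface address (in CIDR form) and a LiDAR IP
# fall inside the same subnet. Addresses are example values only.
import ipaddress

def same_subnet(host_cidr: str, lidar_ip: str) -> bool:
    iface = ipaddress.ip_interface(host_cidr)
    return ipaddress.ip_address(lidar_ip) in iface.network

print(same_subnet("192.168.1.102/24", "192.168.1.200"))  # True
print(same_subnet("192.168.1.102/24", "10.5.5.20"))      # False
```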
