Mazen_4dw is a ROS 2 package designed for ROS 2 Foxy and later distributions. It simulates and controls a 4-wheel-drive (4WD) robot equipped with various sensors, such as a 2D LiDAR, camera, and other modules for perception and navigation.
The package supports Gazebo simulation and RViz visualization, along with YOLOv8-based object detection, making it suitable for research and development in autonomous robotics, SLAM, and AI-based perception.
- 4WD robot model with realistic physics in Gazebo
- Integrated camera for image processing and computer vision tasks
- 2D LiDAR support for mapping, obstacle avoidance, and SLAM
- Object recognition using YOLOv8 (Ultralytics)
- RViz support for visualizing robot states and sensor data
- Modular launch files for easy simulation and testing
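As an illustration of how a 4WD (skid-steer) base like this one is typically driven, the sketch below converts a commanded body velocity into per-side wheel speeds. It is a generic kinematics sketch, not code from this package; the track width and wheel radius values are hypothetical:

```python
# Illustrative skid-steer kinematics for a 4WD base (not from this package).
# Converts a body velocity command (v, omega) into left/right wheel angular
# velocities, as a velocity controller for such a robot typically would.

TRACK_WIDTH = 0.4   # metres between left and right wheels (hypothetical)
WHEEL_RADIUS = 0.1  # wheel radius in metres (hypothetical)

def wheel_speeds(v: float, omega: float) -> tuple[float, float]:
    """Return (left, right) wheel angular velocities in rad/s."""
    v_left = v - omega * TRACK_WIDTH / 2.0
    v_right = v + omega * TRACK_WIDTH / 2.0
    return v_left / WHEEL_RADIUS, v_right / WHEEL_RADIUS

# Driving straight at 0.5 m/s: both sides spin equally.
print(wheel_speeds(0.5, 0.0))   # (5.0, 5.0)
# Turning in place: the two sides spin in opposite directions.
print(wheel_speeds(0.0, 1.0))   # (-2.0, 2.0)
```

In a real ROS 2 node this conversion would sit in the callback of a `cmd_vel` (`geometry_msgs/Twist`) subscriber.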
```bash
# Create the project directory
mkdir -p ~/mazen_ws/src
cd ~/mazen_ws/src

# Clone the project
git clone https://github.com/mazen-daghari/Mazen_4dw.git

# Build the project (from the workspace root)
cd ~/mazen_ws
colcon build --symlink-install

# Source the project
source install/setup.bash

# Launch the Gazebo simulation
ros2 launch mazen_4wd gazebo_model.launch.py

# Launch YOLOv8
ros2 launch recognition launch_yolov8.launch.py
```

- Create robot URDF
- Add sensors
- Add extended Kalman filter
- Add YOLOv8 model to the simulation
- Add teleop twist keyboard script
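The "extended Kalman filter" item above refers to state estimation that fuses noisy sensor readings into a smoothed estimate. As a rough illustration of the idea only (a generic 1-D textbook sketch, not this package's EKF), with hypothetical noise values:

```python
# Minimal 1-D Kalman filter cycle, illustrating the state-estimation idea
# behind the roadmap's "extended Kalman filter" item. The real package
# would fuse richer data (odometry, IMU); q and r here are assumed values.

def kf_step(x: float, p: float, z: float, q: float = 0.01, r: float = 0.1):
    """One predict/update cycle for a static-state model.

    x, p : prior state estimate and its variance
    z    : new measurement
    q, r : process and measurement noise variances (hypothetical)
    """
    # Predict: state unchanged, uncertainty grows by the process noise.
    p = p + q
    # Update: blend prediction and measurement using the Kalman gain.
    k = p / (p + r)
    x = x + k * (z - x)
    p = (1.0 - k) * p
    return x, p

x, p = 0.0, 1.0            # poor initial guess, high uncertainty
for z in (0.9, 1.1, 1.0):  # noisy measurements of a true value near 1.0
    x, p = kf_step(x, p, z)
print(x, p)  # estimate converges toward ~1.0, variance shrinks
```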
- Make sure all dependencies (e.g., YOLOv8, camera drivers) are correctly installed.
- Tested on Ubuntu 20.04 with ROS 2 Foxy. Later ROS 2 distributions such as Humble and Iron should also work with minor adjustments.
- Copy the models (e.g., person, SUV, stop sign, bus) to your `~/.gazebo/models` directory. This step is only required once. (Note: this directory may be hidden; enable "show hidden files" if needed.)
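The model-copying step can also be scripted. The sketch below is a generic helper, and the example source path assumes the repository keeps its Gazebo models in a `models/` folder, which may not match the actual repo layout:

```python
# Copy Gazebo model folders into ~/.gazebo/models so the simulation can
# find them. The example source path is an assumption about the repo layout.
import shutil
from pathlib import Path

def install_models(src_dir: str, dst_dir: str = "~/.gazebo/models") -> list[str]:
    """Copy each model folder from src_dir into dst_dir; return names copied."""
    src = Path(src_dir).expanduser()
    dst = Path(dst_dir).expanduser()
    dst.mkdir(parents=True, exist_ok=True)
    copied = []
    for model in sorted(p for p in src.iterdir() if p.is_dir()):
        shutil.copytree(model, dst / model.name, dirs_exist_ok=True)
        copied.append(model.name)
    return copied

# Hypothetical usage (folder name assumed):
# install_models("~/mazen_ws/src/Mazen_4dw/models")
```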
- For any further help, contact me at dagmazen@gmail.com or via LinkedIn.
- The Gazebo simulation and the recognition package must run simultaneously; start their respective launch files in separate terminal instances.
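If you prefer to script that two-terminal workflow, a small launcher like the sketch below can start both launch files as concurrent processes (assuming the workspace has already been sourced in the environment it runs in):

```python
# Start the Gazebo simulation and the YOLOv8 recognition launch files as
# concurrent subprocesses, mirroring the "two separate terminals" workflow.
import subprocess

def launch_all(commands):
    """Start every command concurrently, then wait; return the exit codes."""
    procs = [subprocess.Popen(cmd) for cmd in commands]
    return [p.wait() for p in procs]

if __name__ == "__main__":
    launch_all([
        ["ros2", "launch", "mazen_4wd", "gazebo_model.launch.py"],
        ["ros2", "launch", "recognition", "launch_yolov8.launch.py"],
    ])
```

Running the two launch files in separate terminals remains the simplest option; this is only a convenience wrapper.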

