Autonomous drone navigation using optical flow, IMU, and lidar when GPS is unavailable.
This project enables drones to navigate autonomously in GPS-denied environments with centimeter-level accuracy using multi-sensor fusion and SLAM algorithms. The system is designed for warehouse inspection, search & rescue, and mapping missions.
- Multi-sensor data acquisition (Optical Flow, IMU, Lidar)
- Real-time SLAM with occupancy grid mapping
- Local & global path planning with obstacle avoidance
- PX4 offboard integration via MAVSDK
- Vision-based position hold and waypoint navigation
- Multi-session mapping (Map Persistence)
- Dynamic object filtering
- Vision pose estimation for EKF2
- Fail-safe automatic landing
- Trajectory tracking with Pure Pursuit
| Feature ID | Feature Name | Description | Priority |
|---|---|---|---|
| 1.1.1 | Optical Flow Integration | Real-time velocity estimation from downward-facing camera | P0 |
| 1.1.2 | IMU Data Processing | High-frequency pose estimation and vibration filtering | P0 |
| 1.1.3 | 2D Lidar SLAM | Horizontal plane mapping and localization | P0 |
| 1.1.4 | Sensor Synchronization | Microsecond-level timestamp alignment across sensors | P1 |
| Feature ID | Feature Name | Description | Priority |
|---|---|---|---|
| 1.1.6 | Graph-Based SLAM | Loop closure and pose graph optimization | P0 |
| 1.1.7 | Occupancy Grid Mapping | 2D/2.5D environment representation | P0 |
| 1.1.8 | Dynamic Object Filtering | Remove moving obstacles from static map | P1 |
| 1.1.9 | Map Persistence | Save/load maps for repeated missions | P1 |
| Feature ID | Feature Name | Description | Priority |
|---|---|---|---|
| 1.1.11 | Local Path Planner | Dynamic window approach for real-time avoidance | P0 |
| 1.1.12 | Global Path Planner | A* algorithm on occupancy grid | P0 |
| 1.1.13 | Position Hold Mode | Stable hover without GPS drift | P0 |
| 1.1.14 | Waypoint Navigation | Execute pre-defined waypoints | P1 |
| Feature ID | Feature Name | Description | Priority |
|---|---|---|---|
| 1.1.16 | MAVSDK Offboard Mode | Send position setpoints via MAVLink | P0 |
| 1.1.17 | Vision Pose Estimation | Inject SLAM pose into EKF2 | P0 |
| 1.1.18 | Fail-Safe Logic | Automatic landing on SLAM failure | P0 |
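The A* global planner listed above (feature 1.1.12) searches the occupancy grid for the shortest free path. A dependency-free sketch of that idea, illustrative rather than the project's actual implementation, using 4-connected moves and a Manhattan heuristic:

```python
import heapq

def astar(grid, start, goal):
    """A* over a 2D occupancy grid (0 = free, 1 = occupied).

    Illustrative sketch of the global_planner idea; returns a list of
    (row, col) cells from start to goal, or None if no path exists.
    """
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), start)]   # heap of (f-cost, cell)
    came_from = {}
    g_cost = {start: 0}
    closed = set()
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:
            # Walk parents back from the goal to reconstruct the path
            path = [cur]
            while path[-1] != start:
                path.append(came_from[path[-1]])
            return path[::-1]
        if cur in closed:
            continue
        closed.add(cur)
        r, c = cur
        for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nb
            if not (0 <= nr < rows and 0 <= nc < cols) or grid[nr][nc]:
                continue  # out of bounds or occupied
            ng = g_cost[cur] + 1
            if ng < g_cost.get(nb, float('inf')):
                g_cost[nb] = ng
                came_from[nb] = cur
                heapq.heappush(open_set, (ng + h(nb), nb))
    return None  # goal unreachable
```

The real planner would additionally inflate obstacles by the drone's radius and convert cells back to metric waypoints.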
```
1. SENSOR ACQUISITION (Epic 1)
   ├── optical_flow_integration → Visual velocity
   ├── imu_processing           → Orientation & acceleration
   └── lidar_slam_2d            → Laser scans
        ↓
2. SENSOR FUSION (Epic 1)
   └── sensor_synchronization   → Time-aligned data
        ↓
3. LOCALIZATION & MAPPING (Epic 2)
   ├── graph_slam               → Pose estimation with loop closure
   ├── occupancy_grid_mapping   → Environment representation
   ├── dynamic_filter           → Remove moving objects
   └── map_persistence          → Save/load maps
        ↓
4. PATH PLANNING (Epic 3)
   ├── global_planner (A*)      → Optimal waypoint paths
   └── local_planner (DWA)      → Real-time obstacle avoidance
        ↓
5. TRAJECTORY EXECUTION (Epic 3)
   ├── trajectory_tracker (Pure Pursuit) → Smooth path following
   ├── position_hold (PID)      → Stable hovering
   └── waypoint_navigation      → Mission execution
        ↓
6. FLIGHT CONTROL (Epic 4)
   ├── vision_pose_estimator    → SLAM → EKF2 fusion
   ├── mavsdk_offboard          → Position setpoints via MAVLink
   └── failsafe_controller      → Emergency landing on failure
        ↓
7. VEHICLE EXECUTION
   └── PX4 Flight Controller    → Motor commands
```
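The sensor-fusion stage above boils down to pairing messages whose timestamps fall within a common window. A ROS-free sketch of that pairing (a ROS 2 node would typically use `message_filters.ApproximateTimeSynchronizer` instead; the function below and its stream shapes are illustrative assumptions):

```python
from bisect import bisect_left

def align(stream_a, stream_b, slop=0.05):
    """Pair each (t, msg) in stream_a with the nearest-in-time message in
    stream_b, keeping only pairs within `slop` seconds (50 ms, matching the
    sensor_synchronization time window stated above).

    Both streams are assumed sorted by timestamp.
    """
    times_b = [t for t, _ in stream_b]
    pairs = []
    for t_a, msg_a in stream_a:
        # bisect gives the insertion point; the nearest neighbour is at
        # that index or the one just before it
        i = bisect_left(times_b, t_a)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(stream_b)]
        if not candidates:
            continue
        j = min(candidates, key=lambda j: abs(times_b[j] - t_a))
        if abs(times_b[j] - t_a) <= slop:
            pairs.append((msg_a, stream_b[j][1]))
    return pairs
```

Messages with no partner inside the window are dropped rather than paired with stale data, which is the usual choice when feeding a fusion filter.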
| Feature | Inputs | Outputs | Integrates With |
|---|---|---|---|
| optical_flow_integration | Camera frames | /optical_flow/velocity | sensor_synchronization |
| imu_processing | IMU raw data | /imu/data | sensor_synchronization, graph_slam |
| lidar_slam_2d | Laser scans | /scan | sensor_synchronization, graph_slam |
| sensor_synchronization | All sensor topics | /fused/odom | graph_slam |
| graph_slam | Fused odometry | /slam/pose, /odom | occupancy_grid_mapping, vision_pose_estimator |
| occupancy_grid_mapping | SLAM pose + scans | /map | global_planner, local_planner |
| dynamic_filter | /map + velocities | /static_map | map_persistence |
| map_persistence | Filtered map | Saved maps | occupancy_grid_mapping |
| global_planner | /static_map, goal | /global_plan | trajectory_tracker |
| local_planner | /map, /odom | /local_plan | trajectory_tracker |
| trajectory_tracker | Plans + odometry | /cmd_vel | mavsdk_offboard |
| position_hold | Target pose | /cmd_vel_hold | mavsdk_offboard |
| waypoint_navigation | Mission file | /goal_pose | global_planner |
| vision_pose_estimator | /odom | /mavros/vision_pose/pose | PX4 EKF2 |
| mavsdk_offboard | /cmd_vel topics | /mavros/setpoint_raw/local | PX4 |
| failsafe_controller | /odom, /mavros/state | /failsafe/active, LAND mode | PX4 |
For Localization:
- optical_flow_integration + imu_processing + lidar_slam_2d → sensor_synchronization → graph_slam

For Navigation:
- graph_slam → occupancy_grid_mapping → global_planner + local_planner → trajectory_tracker

For Flight Control:
- trajectory_tracker OR position_hold → mavsdk_offboard → PX4
- graph_slam → vision_pose_estimator → PX4 EKF2

For Safety:
- graph_slam → failsafe_controller → PX4 (emergency landing)
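The safety chain hinges on detecting SLAM dropout quickly. A minimal watchdog sketch of the failsafe_controller idea, assuming a monotonic-clock heartbeat and a pluggable landing callback (class and parameter names are illustrative; the real node would switch the PX4 flight mode to LAND via MAVSDK/MAVROS):

```python
import time

class SlamWatchdog:
    """Trigger an emergency-landing callback when no SLAM pose has
    arrived within `timeout` seconds.

    `clock` is injectable so the logic can be tested without sleeping.
    """
    def __init__(self, timeout=1.0, land_cb=None, clock=time.monotonic):
        self.timeout = timeout
        self.land_cb = land_cb or (lambda: None)
        self.clock = clock
        self.last_pose_time = self.clock()
        self.failsafe_active = False

    def on_slam_pose(self, pose):
        # Called from the /slam/pose subscription; any pose counts as a heartbeat
        self.last_pose_time = self.clock()
        self.failsafe_active = False

    def check(self):
        # Called from a periodic timer; returns whether failsafe is engaged
        stale = self.clock() - self.last_pose_time > self.timeout
        if stale and not self.failsafe_active:
            self.failsafe_active = True
            self.land_cb()  # commanded exactly once per dropout
        return self.failsafe_active
```

In a ROS 2 node, `on_slam_pose` would be a subscription callback and `check` would run on a timer at a few hertz.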
| Risk | Probability | Impact | Mitigation |
|---|---|---|---|
| SLAM drift in featureless areas | High | Critical | Add AprilTag landmarks, fuse multiple sensor modalities |
| Computational overload | Medium | High | Offload to Coral TPU, optimize with C++ nodes |
| Magnetic interference | High | Medium | Use optical flow as primary yaw source, disable mag fusion |
| Lidar motion distortion | Medium | Medium | Implement motion compensation using IMU pre-integration |
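The last mitigation, motion compensation via IMU pre-integration, amounts to rotating each lidar return back by the yaw the vehicle accumulated since the sweep began. A simplified yaw-only sketch (assumes planar motion during one sweep; function names and shapes are illustrative, not the project's code):

```python
import math

def deskew_scan(points, t_points, yaw_samples):
    """Undo rotational motion distortion in a single 2D lidar sweep.

    points      -- list of (x, y) returns in the sensor frame
    t_points    -- capture time of each return
    yaw_samples -- sorted (t, yaw) pairs from IMU pre-integration
    """
    def yaw_at(t):
        # Linear interpolation between IMU samples, clamped at the ends
        if t <= yaw_samples[0][0]:
            return yaw_samples[0][1]
        for (t0, y0), (t1, y1) in zip(yaw_samples, yaw_samples[1:]):
            if t0 <= t <= t1:
                a = (t - t0) / (t1 - t0)
                return y0 + a * (y1 - y0)
        return yaw_samples[-1][1]

    yaw0 = yaw_at(t_points[0])
    out = []
    for (x, y), t in zip(points, t_points):
        dyaw = yaw_at(t) - yaw0            # yaw accumulated since scan start
        c, s = math.cos(-dyaw), math.sin(-dyaw)  # rotate back to the start frame
        out.append((c * x - s * y, s * x + c * y))
    return out
```

A full implementation would also compensate translation, but on a slow-moving multirotor the rotational term dominates the distortion.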
- ✅ Drone hovers stably for 2 minutes in a 5m×5m room (<20 cm drift)
- ✅ Successfully navigates a 4-waypoint mission with obstacle avoidance
- ✅ SLAM relocalizes after being picked up and moved (kidnapped-robot problem)
- ✅ All safety tests pass in simulation + real-world validation
- ✅ Documentation includes calibration procedures and tuning guide
- PX4-compatible flight controller (Pixhawk 4, Cube Orange)
- Raspberry Pi 4 (4GB+)
- RPLidar A1/A2, Optical Flow sensor (PMW3901)
- ROS2 Humble, MAVSDK-Python
```bash
# Navigate to project directory
cd "Autonomous Navigation & Control"

# Build all packages (16 ROS2 packages)
./setup_workspace.sh

# Source workspace
source install/setup.bash
```

```bash
# 1. Launch sensor fusion (Terminal 1)
ros2 launch sensor_synchronization sync.launch.py

# 2. Launch SLAM system (Terminal 2)
ros2 launch graph_slam graph_slam.launch.py

# 3. Launch occupancy grid mapping (Terminal 3)
ros2 launch occupancy_grid_mapping mapping.launch.py

# 4. Launch MAVROS - connect to PX4 (Terminal 4)
ros2 launch mavros px4.launch fcu_url:=udp://:14540@

# 5. Launch vision pose estimator (Terminal 5)
ros2 launch vision_pose_estimator vision_pose.launch.py

# 6. Launch failsafe controller (Terminal 6)
ros2 launch failsafe_controller failsafe.launch.py

# 7. Launch navigation stack (Terminal 7)
ros2 launch global_planner planner.launch.py &
ros2 launch local_planner planner.launch.py &
ros2 launch trajectory_tracker tracker.launch.py

# 8. Launch offboard control (Terminal 8)
ros2 launch mavsdk_offboard offboard.launch.py

# 9. Execute waypoint mission (Terminal 9)
ros2 launch waypoint_navigation waypoint.launch.py \
    mission_file:=missions/example_mission.yaml \
    auto_start:=true
```

```bash
# Minimal setup for testing position hold
./setup_workspace.sh
source install/setup.bash

# Launch SLAM + Position Hold
ros2 launch graph_slam graph_slam.launch.py &
ros2 launch mavros px4.launch fcu_url:=udp://:14540@ &
ros2 launch vision_pose_estimator vision_pose.launch.py &
ros2 launch position_hold hold.launch.py &
ros2 launch failsafe_controller failsafe.launch.py
```
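The mission launch above points at `missions/example_mission.yaml`, which is not reproduced here. As a rough sketch, a waypoint mission file for this kind of node might look like the following; every field name below is an assumption for illustration, not the package's actual schema:

```yaml
# Hypothetical mission schema -- field names are illustrative only
mission:
  name: warehouse_inspection
  waypoints:                       # metric positions in the map frame
    - {x: 0.0, y: 0.0, z: 1.5, yaw: 0.0}
    - {x: 3.0, y: 0.0, z: 1.5, yaw: 0.0}
    - {x: 3.0, y: 3.0, z: 1.5, yaw: 1.57}
    - {x: 0.0, y: 3.0, z: 1.5, yaw: 3.14}
  tolerance_m: 0.2                 # acceptance radius per waypoint
  land_on_complete: true
```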
- `optical_flow_integration` - Visual odometry from camera
  - 20 Hz velocity estimation
  - Covariance support
  - Simulation mode
- `imu_processing` - High-rate orientation data
  - 100 Hz IMU processing
  - Complementary filter
  - Calibration support
- `lidar_slam_2d` - 2D laser SLAM
  - Real-time scan matching
  - TF broadcasting
  - Loop closure detection
- `sensor_synchronization` - Multi-sensor fusion
  - 50 ms time window
  - Kalman filter fusion
  - Synchronized odometry
- `graph_slam` - Graph optimization SLAM
  - Pose graph optimization
  - Loop closure detection
  - g2o integration
- `occupancy_grid_mapping` - Probabilistic mapping
  - Log-odds occupancy
  - Dynamic updates
  - Configurable resolution
- `dynamic_filter` - Motion filtering
  - Velocity-based detection
  - Static map generation
  - Threshold tuning
- `map_persistence` - Map save/load
  - YAML/PGM format
  - Multi-session support
  - Automatic saving
- `local_planner` - Dynamic Window Approach
  - Real-time obstacle avoidance
  - Velocity space sampling
  - Collision checking
- `global_planner` - A* path planning
  - Optimal path finding
  - Occupancy grid search
  - Waypoint generation
- `position_hold` - PID hovering
  - 3D position control (XY + Z)
  - 50 Hz control rate
  - Drift compensation
- `trajectory_tracker` - Pure Pursuit
  - Smooth path following
  - Adaptive lookahead
  - Curvature-based steering
- `waypoint_navigation` - Mission execution
  - YAML mission files
  - Sequential execution
  - Visual markers
- `mavsdk_offboard` - MAVLink control
  - Position setpoint streaming (20 Hz)
  - Offboard mode management
  - Auto-arm capability
- `vision_pose_estimator` - EKF2 integration
  - SLAM pose injection
  - 30 Hz publishing
  - Covariance support
- `failsafe_controller` - Safety monitor
  - SLAM health checking
  - Automatic emergency landing
  - Multi-stage warnings
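The position-hold idea above can be sketched as one axis of a PID loop on position error, clamped to a velocity limit. This is a minimal illustration, assuming the controller outputs velocity commands; the gains are placeholders, not the package's tuned values:

```python
class AxisPID:
    """Single-axis PID on position error, producing a clamped velocity
    command (run one instance each for X, Y, and Z)."""
    def __init__(self, kp=1.2, ki=0.05, kd=0.3, vmax=0.5):
        self.kp, self.ki, self.kd, self.vmax = kp, ki, kd, vmax
        self.integral = 0.0
        self.prev_err = None

    def update(self, target, current, dt):
        err = target - current
        self.integral += err * dt
        # No derivative kick on the first sample
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        v = self.kp * err + self.ki * self.integral + self.kd * deriv
        return max(-self.vmax, min(self.vmax, v))  # clamp to velocity limit
```

A production controller would also limit the integral term (anti-windup) while the output is saturated, which matters most during the initial approach to a hold point.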
1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
