This is a hand-gesture-based drone control project built with MediaPipe and a CNN.
The system detects hand gestures in a live camera feed and converts them into drone control commands.
A dummy drone controller simulates the drone's movement, so no real hardware is required.
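A dummy controller like this can be sketched as a plain class that tracks a simulated position and logs each command. The class and method names here are assumptions for illustration; the actual `drone_controller.py` may differ.

```python
class DummyDrone:
    """Simulated drone: tracks a position instead of sending real commands."""

    def __init__(self):
        self.x = self.y = self.z = 0  # simulated position

    def move(self, dx=0, dy=0, dz=0):
        # Update the simulated position and log the resulting state.
        self.x += dx
        self.y += dy
        self.z += dz
        print(f"Drone at ({self.x}, {self.y}, {self.z})")

    def hold(self):
        # No gesture detected: keep the current position.
        print("Holding position")
```

Swapping this class for a real SDK wrapper (e.g. a Tello client) would leave the gesture-detection code untouched.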
- UP – Move drone up
- DOWN – Move drone down
- LEFT – Move drone left
- RIGHT – Move drone right
- CIRCLE (V sign) – Drone circles around the person
If no gesture is detected, the drone holds its position.
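The gesture-to-command mapping above, including the hold-position default, can be sketched as a small lookup. The label strings and action tuples are assumptions, not the project's exact API.

```python
def command_for(gesture):
    """Map a recognized gesture label to a (action, kwargs) drone command."""
    commands = {
        "UP":     ("move", {"dz": 1}),
        "DOWN":   ("move", {"dz": -1}),
        "LEFT":   ("move", {"dx": -1}),
        "RIGHT":  ("move", {"dx": 1}),
        "CIRCLE": ("circle", {}),  # V sign: circle around the person
    }
    # Unknown or missing gesture falls through to holding position.
    return commands.get(gesture, ("hold", {}))
```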
Camera → MediaPipe Hand Landmarks → CNN / Rule Logic → Drone Commands
MediaPipe detects 21 hand landmarks, each with x, y, and z coordinates (63 values in total).
These values are used to detect gestures either:
- directly using rules (MediaPipe only), or
- using a CNN trained on a custom dataset.
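Both paths start from the same flattened landmark vector. The sketch below shows the flattening and one example of a rule, assuming MediaPipe's standard hand-landmark indexing (8 = index fingertip, 6 = index PIP joint); the project's actual rules may be different.

```python
def to_feature_vector(landmarks):
    """Flatten 21 (x, y, z) landmark tuples into the 63-value vector
    fed to the rules or the CNN."""
    return [coord for point in landmarks for coord in point]

def index_finger_up(landmarks):
    """Example rule: the index fingertip sits above its PIP joint.
    In image coordinates, smaller y means higher in the frame."""
    return landmarks[8][1] < landmarks[6][1]
```

The CNN variant would pass the 63-value vector (or a reshaped form of it) to the trained model instead of applying rules.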
- collect_data.py – Collect gesture data
- train_cnn.py – Train CNN model
- live_cnn_gesture.py – Live gesture detection (CNN)
- mediapipe_only_control.py – Gesture detection using only MediaPipe
- drone_controller.py – Dummy drone control
- circle_motion.py – Circling logic
Activate your virtual environment (e.g. `source venv/bin/activate` on Linux/macOS, or `venv\Scripts\activate` on Windows), then run the live detection script with `python live_cnn_gesture.py`.