pratyakshbhandwal/Gesture-Drone-Controller

Hand Gesture Controlled Drone

This is a hand-gesture-based drone control project built with MediaPipe and a CNN.

The system detects hand gestures from a live camera feed and converts them into drone control commands.
A dummy drone controller simulates the drone's movement in place of real hardware.
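The repository's drone_controller.py is not shown here, but a dummy controller of this kind can be sketched as a class that tracks a simulated 3D position; the class and method names below are assumptions for illustration, not the repository's actual API.

```python
# Hypothetical sketch of a dummy drone controller; names are assumptions.
class DummyDrone:
    """Simulates drone movement by tracking a 3D position."""

    def __init__(self):
        self.x, self.y, self.z = 0.0, 0.0, 0.0

    def move(self, command, step=1.0):
        # Translate a gesture command into a position update.
        if command == "UP":
            self.z += step
        elif command == "DOWN":
            self.z -= step
        elif command == "LEFT":
            self.x -= step
        elif command == "RIGHT":
            self.x += step
        # Any other command (e.g. "HOLD") leaves the position unchanged.
        print(f"Drone at ({self.x:.1f}, {self.y:.1f}, {self.z:.1f})")


drone = DummyDrone()
drone.move("UP")
drone.move("RIGHT")
```

Because movement is just state updates plus a print, the gesture pipeline can be tested end to end without any flight hardware.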


Gestures Supported

  • UP – Move drone up
  • DOWN – Move drone down
  • LEFT – Move drone left
  • RIGHT – Move drone right
  • CIRCLE (V sign) – Drone circles around the person

If no gesture is detected, the drone holds its position.
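The gesture-to-command mapping with a hold-position default can be sketched as a dictionary lookup; the command names below are illustrative assumptions, not the repository's actual identifiers.

```python
# Sketch of mapping a detected gesture label to a drone command,
# defaulting to holding position when no gesture is detected.
# Command names are assumptions for illustration.
GESTURE_TO_COMMAND = {
    "UP": "move_up",
    "DOWN": "move_down",
    "LEFT": "move_left",
    "RIGHT": "move_right",
    "CIRCLE": "circle_person",
}


def command_for(gesture):
    # None (no gesture detected) falls through to the hold default.
    return GESTURE_TO_COMMAND.get(gesture, "hold_position")


print(command_for("UP"))    # move_up
print(command_for(None))    # hold_position
```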


How It Works

Camera → MediaPipe Hand Landmarks → CNN / Rule Logic → Drone Commands

MediaPipe detects 21 hand landmarks, each with x, y, z coordinates (21 × 3 = 63 values).
Gestures are detected from these values either:

  • directly using rules (MediaPipe only), or
  • using a CNN trained on a custom dataset.
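The rule-based path can be sketched as below, assuming a flattened 63-value landmark vector. Landmark indices follow MediaPipe's hand model (0 = wrist, 8 = index fingertip, 12 = middle fingertip), but the specific thresholds and rules are illustrative assumptions, not the repository's actual logic.

```python
# Hedged sketch of rule-based gesture detection over MediaPipe landmarks.
# Input: 63 floats, flattened as [x0, y0, z0, x1, y1, z1, ...] in
# normalized image coordinates. Thresholds here are assumptions.
def classify(landmarks):
    assert len(landmarks) == 63

    def point(i):
        # Return the (x, y) of landmark i; z is ignored by these rules.
        return landmarks[3 * i], landmarks[3 * i + 1]

    wrist_x, wrist_y = point(0)    # landmark 0: wrist
    index_x, index_y = point(8)    # landmark 8: index fingertip
    middle_x, middle_y = point(12)  # landmark 12: middle fingertip

    # V sign: index and middle fingertips both raised and spread apart.
    if index_y < wrist_y and middle_y < wrist_y and abs(index_x - middle_x) > 0.1:
        return "CIRCLE"
    # Image y grows downward, so a raised fingertip has a smaller y.
    if index_y < wrist_y - 0.2:
        return "UP"
    if index_y > wrist_y + 0.2:
        return "DOWN"
    if index_x < wrist_x - 0.2:
        return "LEFT"
    if index_x > wrist_x + 0.2:
        return "RIGHT"
    return None  # no gesture: the drone holds its position
```

The CNN path replaces these hand-written rules with a model trained on the same 63-value vectors collected via collect_data.py.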

Files

  • collect_data.py – Collect gesture data
  • train_cnn.py – Train CNN model
  • live_cnn_gesture.py – Live gesture detection (CNN)
  • mediapipe_only_control.py – Gesture detection using only MediaPipe
  • drone_controller.py – Dummy drone control
  • circle_motion.py – Circling logic

How to Run

Activate the virtual environment, then start live detection:

python live_cnn_gesture.py
