Hand-Gesture-Recognition

Master's mini project.

Project members: [Srishty Singh](https://www.github.com/srishty-singh2k), [Raj Pratap Singh](https://www.github.com/workingpayload)

We perform hand gesture recognition using Google's MediaPipe framework in Python, without any physical aid affixed to the hands.

MediaPipe offers customizable ML solutions. MediaPipe Hands is a hand and finger tracking solution that works by inferring 21 3D landmarks of a hand. Its pipeline consists of two models: a palm detection model followed by a hand landmark model. The palm detector achieves an average precision of 95.7%; the hand landmark model then performs keypoint localization of all 21 hand-knuckle coordinates within the hand regions provided by the palm detector, using regression.
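The 21 landmarks follow a fixed index layout (the wrist, then four joints per finger). A minimal sketch of that layout in plain Python, with names matching the MediaPipe Hands landmark map:

```python
# The 21 hand landmarks returned by MediaPipe Hands, indexed 0-20.
# Layout: wrist first, then four joints per finger (base to tip).
HAND_LANDMARKS = [
    "WRIST",
    "THUMB_CMC", "THUMB_MCP", "THUMB_IP", "THUMB_TIP",
    "INDEX_FINGER_MCP", "INDEX_FINGER_PIP", "INDEX_FINGER_DIP", "INDEX_FINGER_TIP",
    "MIDDLE_FINGER_MCP", "MIDDLE_FINGER_PIP", "MIDDLE_FINGER_DIP", "MIDDLE_FINGER_TIP",
    "RING_FINGER_MCP", "RING_FINGER_PIP", "RING_FINGER_DIP", "RING_FINGER_TIP",
    "PINKY_MCP", "PINKY_PIP", "PINKY_DIP", "PINKY_TIP",
]

# Fingertip indices, useful for finger counting and gesture logic.
FINGER_TIPS = [4, 8, 12, 16, 20]
```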

We capture a live video feed from the camera using the OpenCV library and pass each frame to MediaPipe, which returns the landmark details. We then operate on those landmarks to implement several simple use cases: by computing relative positioning, distances between landmarks, and linear alignment, we implement a virtual mouse, a finger counter, a volume controller, and a rock-paper-scissors game. The required GUI is rendered using OpenCV.
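To illustrate the kind of landmark arithmetic involved, here is a hedged sketch of a finger counter working on normalized (x, y) landmark coordinates. The helper names and the tip-above-PIP heuristic are illustrative assumptions, not the repository's actual code:

```python
import math

def distance(a, b):
    """Euclidean distance between two normalized (x, y) landmark points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def count_raised_fingers(lm):
    """Count raised fingers from a list of 21 (x, y) landmark tuples.

    Heuristic sketch: a finger is "up" when its tip lies above its PIP
    joint (smaller y in image coordinates); the thumb is "up" when its
    tip is farther from the pinky-side MCP joint than the thumb IP is.
    """
    tips, pips = [8, 12, 16, 20], [6, 10, 14, 18]
    count = sum(1 for tip, pip in zip(tips, pips) if lm[tip][1] < lm[pip][1])
    # Thumb: compare distances to the pinky MCP (landmark index 17).
    if distance(lm[4], lm[17]) > distance(lm[3], lm[17]):
        count += 1
    return count
```

In practice the coordinate list would be filled from the per-frame hand landmarks MediaPipe returns; the same distance and relative-position computations also drive the virtual mouse and volume controller.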

LANDMARKS TRACKING

Sample.py

(demo GIF: landmark tracking)

VIRTUAL MOUSE POINTER

MenuScreen.py

(demo GIF: virtual mouse pointer)

FINGER COUNTER

FingerCounter.py

(demo GIF: finger counting)

VOLUME CONTROLLER

VolumeControl.py

(demo GIF: volume control)

ROCK-PAPER-SCISSORS GAME

RockPaperScissorsGame.py

(demo GIF: rock-paper-scissors)

SNAKE GAME

SnakeGame.py

(demo GIF: snake game)
