Hand Sign Detection and Classification

This project uses computer vision to detect and classify hand gestures captured via a webcam. It consists of two main components: a data-collection script for building a custom dataset of hand gestures, and real-time classification of those gestures using a pre-trained model.


Table of Contents

  • Overview
  • Features
  • Dependencies
  • Setup Instructions
  • Usage

Overview

This project provides tools to:

  1. Collect images of hand gestures for training a machine learning model.
  2. Detect hand gestures in real-time and classify them into predefined categories using a deep learning model.

Features

  • Hand Detection: Detects hands in the video feed using OpenCV and cvzone's HandDetector.
  • Dataset Creation: Saves cropped hand images for creating a custom dataset.
  • Gesture Classification: Classifies gestures into categories (e.g., "A", "B", "C", "OK") using a pre-trained Keras model.
  • Real-Time Feedback: Displays predictions and bounding boxes around detected hands on the live feed.
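
A common step behind both collection and classification is fitting the cropped hand onto a fixed white square before it is saved or fed to the model. A minimal NumPy-only sketch of that padding step (the 300×300 white canvas and the function name are assumptions, not taken from the repository's code):

```python
import numpy as np

def pad_to_square(crop, size=300):
    """Place a cropped hand image onto a white size x size canvas,
    preserving aspect ratio (nearest-neighbour resize, NumPy only)."""
    h, w = crop.shape[:2]
    canvas = np.full((size, size, 3), 255, dtype=np.uint8)
    if h >= w:   # tall crop: fit the height, centre horizontally
        new_h, new_w = size, max(1, round(size * w / h))
    else:        # wide crop: fit the width, centre vertically
        new_h, new_w = max(1, round(size * h / w)), size
    rows = np.arange(new_h) * h // new_h   # nearest-neighbour row indices
    cols = np.arange(new_w) * w // new_w   # nearest-neighbour column indices
    resized = crop[rows][:, cols]
    y0 = (size - new_h) // 2
    x0 = (size - new_w) // 2
    canvas[y0:y0 + new_h, x0:x0 + new_w] = resized
    return canvas
```

Feeding the classifier a canvas of constant size keeps the model input shape fixed regardless of how large the hand appears in the frame.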

Dependencies

The project requires the following Python libraries:

  • cv2 (OpenCV)
  • cvzone
  • numpy
  • math (part of the Python standard library; no installation needed)

Install the third-party dependencies using pip:

pip install opencv-python cvzone numpy

Note: cvzone's classification module loads the Keras model (Model/keras_model.h5) through TensorFlow, so TensorFlow must also be installed (pip install tensorflow).

Setup Instructions

  1. Clone the repository:

    git clone https://github.com/VRP-github/HandSign-Detection.git
  2. Ensure the following folders and files are present:

    • Images/ for storing collected gesture images.
    • Model/keras_model.h5 for the pre-trained classification model.
    • Model/labels.txt for gesture labels.
  3. Connect a webcam to your system.
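
After these steps the working directory should look roughly like the tree below (assuming the two scripts from the Usage section live at the repository root):

```
HandSign-Detection/
├── dataCollection.py
├── test.py
├── Images/
└── Model/
    ├── keras_model.h5
    └── labels.txt
```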


Usage

Data Collection

  1. Run the dataCollection.py script to collect gesture images:
    python dataCollection.py
  2. Make gestures in front of the camera. Press s to save images of your gesture into the Images/ folder.
  3. Use the saved images to train your custom gesture classification model.
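
The collection loop behind dataCollection.py can be sketched as follows. This is an assumed structure based on the steps above, not the repository's exact code: the timestamped file-naming helper and the 20-pixel crop margin are illustrative choices.

```python
import time

def image_path(folder, prefix="Image"):
    # Hypothetical naming scheme: timestamped names avoid overwriting files.
    return f"{folder}/{prefix}_{time.time():.3f}.jpg"

def collect(folder="Images/A", offset=20):
    """Minimal collection loop (assumed structure; requires a webcam)."""
    import cv2
    from cvzone.HandTrackingModule import HandDetector

    cap = cv2.VideoCapture(0)
    detector = HandDetector(maxHands=1)
    counter = 0
    while True:
        ok, img = cap.read()
        if not ok:
            break
        hands, img = detector.findHands(img)
        if hands:
            # Crop the hand with a small margin around its bounding box.
            x, y, w, h = hands[0]['bbox']
            crop = img[max(0, y - offset):y + h + offset,
                       max(0, x - offset):x + w + offset]
        cv2.imshow("Image", img)
        key = cv2.waitKey(1)
        if key == ord('s') and hands and crop.size:
            counter += 1
            cv2.imwrite(image_path(folder), crop)
            print(counter)
        elif key == ord('q'):
            break
    cap.release()
```

Saving into one folder per gesture (Images/A, Images/B, ...) keeps the dataset ready for label-per-directory training later.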

Hand Gesture Classification

  1. Ensure the pre-trained model and labels file are present in the Model/ folder.
  2. Run the test.py script to start gesture classification:
    python test.py
  3. Wave your hand in front of the camera to see real-time predictions displayed on the screen.
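
The classification loop in test.py likely follows the same pattern, with cvzone's Classifier wrapping the Keras model. The sketch below is an assumption about the script's structure; the plain cv2.resize and the labels.txt parser assume the Teachable-Machine style "0 A" label format.

```python
def labels_from_file(text):
    """Parse a labels.txt body ('0 A' per line, assumed format) into names."""
    return [line.split(maxsplit=1)[-1]
            for line in text.splitlines() if line.strip()]

def classify_live(model_path="Model/keras_model.h5",
                  labels_path="Model/labels.txt", size=300, offset=20):
    """Minimal real-time classification loop (assumed structure; needs a webcam)."""
    import cv2
    from cvzone.HandTrackingModule import HandDetector
    from cvzone.ClassificationModule import Classifier

    cap = cv2.VideoCapture(0)
    detector = HandDetector(maxHands=1)
    classifier = Classifier(model_path, labels_path)
    while True:
        ok, img = cap.read()
        if not ok:
            break
        hands, img = detector.findHands(img)
        if hands:
            x, y, w, h = hands[0]['bbox']
            crop = img[max(0, y - offset):y + h + offset,
                       max(0, x - offset):x + w + offset]
            if crop.size:
                # Simple resize to the model's input size; the original may
                # pad onto a white canvas instead of stretching.
                square = cv2.resize(crop, (size, size))
                prediction, index = classifier.getPrediction(square, draw=False)
                cv2.putText(img, str(index), (x, y - 20),
                            cv2.FONT_HERSHEY_SIMPLEX, 1.5, (255, 0, 255), 2)
        cv2.imshow("Image", img)
        if cv2.waitKey(1) == ord('q'):
            break
    cap.release()
```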
