
🧤 Sign-Language-Glove


Welcome to the Sign-Language-Glove project! This innovative hardware-based solution translates American Sign Language (ASL) gestures into text or speech in real-time. Leveraging the power of Arduino Uno, flex sensors, and ADXL335 accelerometers, the Sign-Language Glove aims to bridge the communication gap between the deaf and hearing communities.

*(Image: gesture glove)*


🚀 Features

  • Real-Time Translation: Converts ASL gestures into text and speech instantly.
  • Wireless Connectivity: Uses Bluetooth for seamless connection to mobile devices and computers.
  • User-Friendly Interface: Mobile app and desktop software for easy interaction and customization.
  • Extensive Vocabulary: Supports a wide range of ASL gestures with the ability to add custom signs.
  • Data Logging: Records gesture data for analysis and improvement.
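The data-logging feature could, for example, append each sensor frame to a CSV file for later analysis. The sketch below is illustrative only: the field layout, file name, and `log_frame` helper are assumptions, not the project's actual recording format.

```python
import csv
import time

# Assumed field layout: timestamp, five flex-sensor readings, three accelerometer axes.
FIELDS = ["t", "f1", "f2", "f3", "f4", "f5", "ax", "ay", "az"]

def log_frame(path, flex, accel):
    """Append one sensor frame (5 flex values, 3 accel values) as a CSV row."""
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        # Timestamp rounded to milliseconds keeps rows compact but orderable.
        writer.writerow([round(time.time(), 3), *flex, *accel])
```

Appending one row per frame keeps the logger stateless, so a crash mid-session loses at most the frame being written.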

🛠️ Technology Stack

Hardware

  • Arduino Uno: Central microcontroller for data processing.
  • Flex Sensors: Detect finger movements.
  • ADXL335 Accelerometer: Tracks hand orientation and motion.
  • Bluetooth Module: For wireless communication with mobile devices and computers.
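As a rough illustration of how the flex-sensor readings might be interpreted, a 10-bit ADC value from each sensor can be mapped linearly onto a bend angle. The calibration endpoints below (`flat`, `bent`) are assumed values that would need per-glove, per-finger tuning; the repo does not document its actual calibration.

```python
def adc_to_angle(raw, flat=512, bent=880, max_angle=90.0):
    """Linearly map a 10-bit ADC reading onto a 0-90 degree bend angle.

    `flat` is the reading with the finger straight and `bent` the reading
    fully flexed (both assumed calibration constants). The result is
    clamped to [0, max_angle] so sensor noise cannot produce impossible angles.
    """
    angle = (raw - flat) / (bent - flat) * max_angle
    return max(0.0, min(max_angle, angle))
```

A linear map is usually good enough for classification, since the classifier only needs readings that are monotonic in bend, not physically exact angles.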

Software

  • Programming Languages: C/C++ for Arduino, Python for data processing.
  • Machine Learning: TensorFlow or PyTorch for gesture recognition.
  • Mobile App: React Native for cross-platform compatibility.
  • Desktop App: Electron.js for a unified experience across OS.
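To make the gesture-recognition idea concrete, here is a nearest-centroid baseline in plain Python. The project names TensorFlow/PyTorch above, so treat this as an illustrative stand-in rather than the actual model; the gesture templates (ordered thumb to pinky, in degrees) are made up for the example.

```python
import math

# Illustrative templates: one average flex-angle vector per gesture,
# ordered [thumb, index, middle, ring, pinky]. Values are assumptions.
TEMPLATES = {
    "A": [10.0, 80.0, 85.0, 82.0, 84.0],  # fist with thumb alongside
    "B": [70.0, 5.0, 4.0, 6.0, 5.0],      # fingers straight, thumb folded
    "L": [8.0, 5.0, 82.0, 84.0, 83.0],    # thumb and index extended
}

def classify(frame):
    """Return the template label whose centroid is nearest (Euclidean) to `frame`."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(TEMPLATES, key=lambda label: dist(frame, TEMPLATES[label]))
```

A learned model earns its keep once gestures involve motion (via the accelerometer) or per-user variation; for static finger poses, a centroid baseline like this is a useful sanity check.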

📦 Installation

Hardware Setup

  1. Attach the flex sensors to each finger of the glove.
  2. Connect the ADXL335 accelerometer to the Arduino Uno.
  3. Pair the Arduino Uno with a Bluetooth module.
  4. Upload the firmware to the Arduino Uno using the Arduino IDE.

Software Setup

  1. Clone this repository:

    git clone https://github.com/Slygriyrsk/Sign-Language-Glove.git
  2. Navigate to the project directory:

    cd Sign-Language-Glove
  3. Install the required dependencies:

    pip install -r requirements.txt
  4. Run the gesture recognition script:

    python gesture_recognition.py
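The recognition script has to decode frames arriving over the Bluetooth serial link. Assuming the firmware sends one comma-separated line per frame (`flex1,...,flex5,ax,ay,az` — a hypothetical format, not documented in this repo), parsing might look like:

```python
def parse_frame(line):
    """Parse one serial line 'f1,f2,f3,f4,f5,ax,ay,az' into (flex, accel).

    Returns None for malformed lines so the read loop can simply skip them
    (partial lines are common right after opening a serial port).
    """
    parts = line.strip().split(",")
    if len(parts) != 8:
        return None
    try:
        values = [float(p) for p in parts]
    except ValueError:
        return None
    return values[:5], values[5:]
```

In practice this would sit inside a loop reading line-by-line from a `pyserial` port bound to the glove's Bluetooth module, feeding each decoded frame to the classifier.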

📱 Bluetooth App Design

The mobile application provides an interface for users to interact with the glove, including features for converting sign-language gestures into audio or visual output.

🔗 Connect Bluetooth

Establish a connection between your phone and the glove by selecting the appropriate Bluetooth address.

*(Screenshot: mobile app)*

📊 Bluetooth App Block Diagram

The app was built using MIT App Inventor.

*(Image: MIT App Inventor block diagram)*


📈 Readings

Here are some basic readings from our records:

*(Image: sample sensor readings)*


🎉 Usage

  1. Wear the glove and ensure all sensors are securely attached.
  2. Turn on the Arduino Uno and pair it with your mobile device or computer via Bluetooth.
  3. Open the mobile or desktop app.
  4. Perform ASL gestures; the corresponding text and speech output will be displayed and played in real-time.

🤝 Contribution

Contributions are always welcome! Please see contributing.md for ways to get started.

Please adhere to this project's code of conduct.

  • Report Bugs: Use the Issues section to report any bugs or feature requests.
  • Fix Issues: Fork the repository, make your changes, and submit a pull request.
  • Improve Documentation: Help us improve the project's documentation by adding tutorials, examples, and clarifications.


👨‍💻 Authors

  • @Slygriyrsk
  • @maheshkatyayan

🙌 Acknowledgements

This project would not have been possible without the dedicated efforts and contributions of our amazing team members:

  • Saharsh Kumar (Team Leader): developed the Bluetooth communication application, ensuring seamless connectivity between the glove and mobile devices.
  • Hitesh Kumar (Hardware Engineer): focused on hardware integration and wrote the core code for the Arduino Uno, flex sensors, and ADXL335 accelerometer.
  • Mahesh K Katyayan (Machine Learning Engineer): worked on the signal processing and machine learning aspects, laying the groundwork for future innovations in gesture recognition.

Thank you for your hard work and commitment to making this project a success! 🎉

