Arabic Sign Language gesture detection using MediaPipe and OpenCV
This system extends Kazuhito00's Hand Gesture Recognition by adding support for:
- The complete 28-letter Arabic alphabet
- 3 functional gestures (Space, Delete, Clear)
- Real-time text output with Arabic script rendering
- 28 Arabic Letters: custom-trained gesture models for all Arabic characters
- Utility Gestures:
  - Space: insert a space between words
  - Delete: remove the last character
  - Clear: reset the entire text
- Arabic Text Rendering: proper right-to-left (RTL) display with glyph shaping
- Adjustable Sensitivity: control detection speed via a frame threshold
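A minimal sketch of how the frame-threshold sensitivity and the three utility gestures could fit together: a detection is accepted only after the same label has been the top prediction for N consecutive frames, and the accepted label then either edits or extends the output text. The class name, threshold default, and label strings below are illustrative assumptions, not the repo's actual code.

```python
FRAME_THRESHOLD = 15  # assumed default; raise it for slower, steadier detection


class GestureDebouncer:
    """Accept a gesture only after it has been the top prediction
    for `threshold` consecutive frames, firing once per hold."""

    def __init__(self, threshold=FRAME_THRESHOLD):
        self.threshold = threshold
        self.current = None
        self.count = 0

    def update(self, label):
        if label == self.current:
            self.count += 1
        else:
            self.current = label
            self.count = 1
        if self.count == self.threshold:
            return label  # fires exactly once while the gesture is held
        return None


def apply_gesture(text, gesture):
    """Apply an accepted gesture to the output text.
    The label strings "Space"/"Delete"/"Clear" are hypothetical."""
    if gesture == "Space":
        return text + " "
    if gesture == "Delete":
        return text[:-1]
    if gesture == "Clear":
        return ""
    return text + gesture  # any other label is a letter
```

Debouncing this way trades latency for stability: a letter is only committed after being held, so jittery single-frame misclassifications never reach the text buffer.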
- MediaPipe: hand tracking
- OpenCV: camera processing and visualization
- NumPy: data handling
- Model: CNN for static gesture classification
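To illustrate how the pieces of this stack connect, here is a sketch of a typical landmark pre-processing step between MediaPipe's hand tracking and the classifier: the 21 (x, y) landmarks are made wrist-relative, scale-normalized, and flattened into a 42-dimensional feature vector. This is modeled on the approach in Kazuhito00's repo, but the function name and exact normalization here are assumptions, not copied code.

```python
import numpy as np


def preprocess_landmarks(landmarks):
    """Turn 21 (x, y) hand landmarks into a flat, translation- and
    scale-invariant feature vector for the gesture classifier."""
    pts = np.asarray(landmarks, dtype=np.float32)  # shape (21, 2)
    pts = pts - pts[0]                             # wrist-relative coordinates
    max_abs = np.abs(pts).max()
    if max_abs > 0:
        pts = pts / max_abs                        # scale into [-1, 1]
    return pts.flatten()                           # 42-dim feature vector
```

Normalizing this way lets one trained model work regardless of where the hand sits in the frame or how close it is to the camera.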
For the requirements, how to run the app, the training protocol, and other details of the original hand gesture detection model, please see the README of Kazuhito00's Hand Gesture Recognition repo: https://github.com/kinivi/hand-gesture-recognition-mediapipe
Additional requirements
- arabic_reshaper 3.0.0
- python-bidi 0.6.6
- pillow 11.2.1
For the letter with index 23 in "keypoint_classifier_label.csv":
- In the code, use Find (e.g. Shift+F in your editor) to locate the expression "number +"
- Add an offset of 20 next to the "+" (for indices 0 to 9 the offset is 0, for 10 to 19 it is 10, for 20 to 29 it is 20)
- Run the app
- Press "k" to enter training mode
- Press "2" (for 20 + 2 = 22; note: the CSV uses 0-based class IDs while the labels start from 1)
- Make the gesture 20+ times
- Close the app
- Open "keypoint_classification_EN" in Jupyter Notebook and run all cells
- Training done
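The key-to-class-ID mapping being edited in the steps above boils down to something like this (a sketch of the idea, not the repo's exact code):

```python
CLASS_OFFSET = 20  # 0 for classes 0-9, 10 for classes 10-19, 20 for classes 20-29


def key_to_class_id(key: int, offset: int = CLASS_OFFSET) -> int:
    """Map a pressed digit key (ASCII code, e.g. from cv2.waitKey)
    to a class ID for keypoint logging. Returns -1 for non-digit keys."""
    if 48 <= key <= 57:            # ASCII '0'..'9'
        return key - 48 + offset
    return -1
```

With the offset set to 20, pressing "2" logs keypoints under class ID 22, which is what the training steps above rely on.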
