A gesture-based virtual keyboard that lets you type using hand gestures captured via webcam. The application uses computer vision to detect finger movements and translates them into keyboard input.
- Touchless Typing: Type without physical contact using hand gestures
- Real-Time Hand Tracking: Accurate detection of 21 hand landmarks using MediaPipe
- Dynamic Text Display: Auto-expanding text area with word wrapping
- Visual Feedback: Clear indicators for key activation and shift status
- Full Keyboard Functionality: Support for letters, punctuation, and special keys
- Cross-Platform: Works on Windows, macOS, and Linux
- Customizable Layout: Easily modify keyboard layout and appearance
- Python 3.8 or higher
- Webcam
- pip (Python package manager)
1. Clone the repository:

   ```bash
   git clone https://github.com/R2-STAR/virtual-ai-keyboard.git
   cd virtual-ai-keyboard
   ```

2. Install the required dependencies:

   ```bash
   pip install -r requirements.txt
   ```

3. Run the application:

   ```bash
   python main.py
   ```
1. Position your hand in view of the webcam
2. Move your index finger over the desired key
3. Pinch your thumb and index finger together to "press" the key
4. Use the special keys:
   - Shift: Toggle uppercase characters
   - Space: Insert space
   - Ent: Enter key
   - Bac: Backspace
   - Clear: Clear all text
5. Press 'q' to quit the application
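The pinch "press" in step 3 comes down to a fingertip-distance check. Here is a minimal, self-contained sketch of that logic; the `is_pinch` helper is a hypothetical name, and the 45 px threshold mirrors the default activation distance mentioned in the configuration section:

```python
import math

# Hypothetical helper illustrating the pinch check: a key "press" fires
# when the thumb tip and index fingertip are closer than a pixel threshold
# (45 px matches the default activation distance described in main.py).
PINCH_THRESHOLD = 45

def is_pinch(thumb_tip, index_tip, threshold=PINCH_THRESHOLD):
    """Return True when the two fingertip (x, y) points count as a pinch."""
    dx = index_tip[0] - thumb_tip[0]
    dy = index_tip[1] - thumb_tip[1]
    return math.hypot(dx, dy) < threshold

print(is_pinch((100, 100), (120, 110)))  # fingertips ~22 px apart
print(is_pinch((100, 100), (200, 100)))  # fingertips 100 px apart
```

In the real application the two fingertip coordinates come from the hand landmarks detected each frame; the same comparison decides whether the hovered key is pressed.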
The virtual keyboard follows this layout:

```
Q     W  E  R  T  Y  U  I  O  P
A     S  D  F  G  H  J  K  L  ;
Shift Z  X  C  V  B  N  M  ,  .
Space Ent Bac Clear
```
You can easily customize the keyboard by modifying the `KEYBOARD_LAYOUT` array in `main.py`:

```python
KEYBOARD_LAYOUT = [
    ["Q", "W", "E", "R", "T", "Y", "U", "I", "O", "P"],
    ["A", "S", "D", "F", "G", "H", "J", "K", "L", ";"],
    ["Shift", "Z", "X", "C", "V", "B", "N", "M", ",", "."],
    ["Space", "Ent", "Bac", "Clear"]
]
```
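To see how a fingertip position maps onto this array, here is a hedged sketch of a hit-test over a uniform key grid. The `KEY_SIZE`, `ORIGIN`, and `key_at` names are illustrative assumptions, not the actual identifiers in `main.py`, and the real keys may vary in width (e.g. Space):

```python
KEYBOARD_LAYOUT = [
    ["Q", "W", "E", "R", "T", "Y", "U", "I", "O", "P"],
    ["A", "S", "D", "F", "G", "H", "J", "K", "L", ";"],
    ["Shift", "Z", "X", "C", "V", "B", "N", "M", ",", "."],
    ["Space", "Ent", "Bac", "Clear"],
]

KEY_SIZE = 80       # assumed key width/height in pixels
ORIGIN = (50, 100)  # assumed top-left corner of the keyboard on screen

def key_at(x, y):
    """Return the key label under pixel (x, y), or None if outside the grid."""
    col = (x - ORIGIN[0]) // KEY_SIZE
    row = (y - ORIGIN[1]) // KEY_SIZE
    if 0 <= row < len(KEYBOARD_LAYOUT) and 0 <= col < len(KEYBOARD_LAYOUT[row]):
        return KEYBOARD_LAYOUT[row][col]
    return None

print(key_at(60, 110))  # inside the first cell
print(key_at(0, 0))     # off the keyboard
```

Because the layout is just a list of lists, adding, removing, or reordering keys automatically changes what the hit-test returns.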
Adjust detection sensitivity by changing these parameters:

```python
self.detector = HandDetector(detectionCon=0.7, maxHands=1)  # Lower = more sensitive
length < 45  # Increase for easier activation
```
Change colors by modifying the `COLORS` dictionary:

```python
self.COLORS = {
    "primary": (255, 0, 0),      # Blue instead of pink
    "highlight": (0, 255, 255),  # Cyan instead of green
    # ... other color values
}
```
```
virtual-ai-keyboard/
│
├── main.py              # Main application file
├── requirements.txt     # Project dependencies
├── README.md            # Project documentation
├── LICENSE              # MIT License
├── assets/              # Resources folder
│   ├── demo.gif         # Application demo
│   └── screenshot.png   # Application screenshot
└── examples/            # Usage examples
    └── basic_usage.py   # Basic implementation example
```
- Hand Detection: Uses CVZone's HandDetector module based on MediaPipe to detect hands and 21 landmarks
- Gesture Recognition: Identifies pinch gestures between index finger and thumb for key presses
- Virtual Keyboard Rendering: Draws a customizable keyboard layout on the video feed
- Input Simulation: Uses pynput to simulate actual keyboard input
- Text Display: Implements dynamic text area with word wrapping for long content
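As a concrete illustration of the word-wrapping step, here is a minimal greedy wrapper; `wrap_text` is a hypothetical name for sketching the idea, and the actual implementation in `main.py` may differ:

```python
def wrap_text(text, max_chars):
    """Greedy word wrap: split text into lines of at most max_chars characters."""
    lines, current = [], ""
    for word in text.split():
        candidate = f"{current} {word}".strip()
        if len(candidate) <= max_chars:
            current = candidate          # word still fits on the current line
        else:
            if current:
                lines.append(current)    # flush the full line
            current = word               # start a new line with this word
    if current:
        lines.append(current)
    return lines

for line in wrap_text("typing with hand gestures in mid air", 12):
    print(line)
```

The text area then grows to fit however many lines the wrapper produces, which is what makes the display "auto-expanding".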
- OpenCV: Computer vision library for image processing
- NumPy: Numerical computing library
- CVZone: Simplified computer vision utilities built on MediaPipe
- pynput: Cross-platform input control library
- HandDetector: Detects and tracks hand landmarks in real-time
- Keyboard Controller: Simulates physical keyboard input
- Dynamic UI: Adjusts based on content length and user interaction
- Gesture Processing: Calculates distances between fingers to detect interactions
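To show how the keyboard controller piece might translate on-screen labels into input events, here is a hedged sketch of the label-resolution step. The `SPECIAL_KEYS` table and `resolve_key` function are illustrative names rather than the actual identifiers in `main.py`; the resolved names correspond to `pynput.keyboard.Key` attributes:

```python
# Illustrative mapping from on-screen labels to pynput Key attribute names.
SPECIAL_KEYS = {"Ent": "enter", "Bac": "backspace", "Space": "space"}

def resolve_key(label, shift_on=False):
    """Translate an on-screen key label into the key to send via pynput."""
    if label in SPECIAL_KEYS:
        return SPECIAL_KEYS[label]  # e.g. pynput.keyboard.Key.enter
    return label.upper() if shift_on else label.lower()

print(resolve_key("Q"))                 # letter key, shift off
print(resolve_key("Q", shift_on=True))  # letter key, shift on
print(resolve_key("Bac"))               # special key
```

In the application, a letter result would be passed to `Controller().type()` while a special-key name would be looked up on `pynput.keyboard.Key` and pressed directly.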
- Ensure good lighting conditions for accurate hand tracking
- Use a plain background for better detection
- Position yourself approximately 1-2 feet from the camera
- Keep your hand within the camera's field of view
- For best results, use a 720p or higher resolution webcam
Check out the examples/ folder for different implementation examples:
- basic_usage.py: Simple implementation of the virtual keyboard
Run the example:

```bash
python examples/basic_usage.py
```
We welcome contributions! Please feel free to submit issues, feature requests, or pull requests.
- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
Please read our Contributing Guidelines for more details.
- Performance may decrease in low-light conditions
- Rapid typing might not be captured accurately
- Complex backgrounds can reduce tracking accuracy
- Very fast hand movements may not be detected
- Word prediction and auto-complete features
- Multi-language support
- Customizable themes and layouts
- Gesture shortcuts for common actions
- Mobile device compatibility
- Voice command integration
- Training mode for new users
- Save/load custom layouts
This project is licensed under the MIT License - see the LICENSE file for details.
- CVZone for the excellent hand tracking module
- OpenCV for computer vision capabilities
- MediaPipe for the hand tracking models
- pynput for system-level input control
- v1.0.0 (2023-10-15)
  - Initial release
  - Basic virtual keyboard functionality
  - Hand gesture detection
  - Real-time typing feedback