Official repository for the Intelligent Prosthetic Arm project: EEG-based control and computer vision, developed for the Bocconi University association BAINSA.


Intelligent Prosthetic Arm Project

Overview

The Intelligent Prosthetic Arm project develops a prosthetic hand controlled through EEG brain signals, combined with computer vision and onboard sensors for safe, real-time object interaction. The system brings together machine learning, real-time object detection, and sensor feedback so the prosthetic arm can mimic natural hand movements and interact with everyday objects such as water bottles.

Table of Contents

  • Features
  • Project Structure
  • Installation
  • Usage
  • Roadmap
  • Contributors
  • License

Features

  • EEG-Based Control: The prosthetic arm is operated using brain signals (EEG), allowing intuitive control and natural movement.
  • Computer Vision for Object Detection and Grasping: YOLOv11 performs real-time object detection, and the detected bounding box is used to position the prosthetic hand for a successful grasp.
  • Real-Time Grasp Detection: Integrated computer vision for determining the best grasp configuration for various objects using bounding boxes, segmentation, and a Vision Language Model (VLM).
  • Sensors for Safe Interaction: Equipped with force and temperature sensors to ensure safe and adaptive interaction with objects.
  • ROS2 Integration: ROS2 is used for seamless communication between the arm, computer vision, EEG module, and sensor feedback.
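To make the bounding-box-to-grasp idea concrete, here is a minimal sketch of how a detected box might be turned into a grasp target for a parallel gripper. The function name, calibration constant, and aperture limit are illustrative assumptions, not the project's actual code.

```python
def grasp_from_bbox(x1, y1, x2, y2, px_per_mm=4.0, max_aperture_mm=80.0):
    """Return (center_x, center_y, aperture_mm) for a parallel gripper.

    (x1, y1, x2, y2) is a YOLO-style pixel bounding box; px_per_mm is an
    assumed camera calibration constant, and the aperture is clamped to
    an assumed mechanical limit of the gripper.
    """
    cx = (x1 + x2) / 2.0                 # grasp point: bbox center
    cy = (y1 + y2) / 2.0
    width_mm = (x2 - x1) / px_per_mm     # object width in millimetres
    aperture = min(width_mm * 1.1, max_aperture_mm)  # 10% clearance
    return cx, cy, aperture

# Example: a water-bottle-sized box of 240x480 px
cx, cy, ap = grasp_from_bbox(400, 100, 640, 580)
```

In a real pipeline the pixel-to-millimetre scale would come from depth or calibration data rather than a fixed constant.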

Project Structure

prosthetic_arm/
├── computer_vision/            # Vision processing modules
│   ├── frames/
│   ├── models/
│   ├── scripts/
│   ├── requirements.txt
│   ├── requirements_no_dep.txt
│   └── README.md
│
├── robotics/                   # Robotic control modules
│   └── arm_control.py
│
├── neuroscience/               # EEG data processing modules
│   └── eeg_processing.py
│
├── docker/                     # Container configuration
│   └── Dockerfile
│
├── docs/                       # Documentation
│   └── project_overview.md
│
├── requirements.txt            # Project dependencies
└── README.md                   # Main documentation

Installation

Local Setup

  1. Clone the Repository
     git clone https://github.com/yourusername/prosthetic_arm.git
     cd prosthetic_arm
  2. Create a Python Virtual Environment
     python3 -m venv prosthetic_env
     source prosthetic_env/bin/activate
  3. Install Dependencies
     pip install -r requirements.txt

Docker Setup

  1. Build the Docker Image
     docker build -t prosthetic_arm_image .
  2. Run the Docker Container
     docker run -it --rm prosthetic_arm_image

Usage

  1. Object Detection and Grasp Validation: Run the ObjectDetector and GraspValidator classes to detect objects and validate grasps in real time using your system camera:
python3 computer_vision/scripts/main.py
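As a rough illustration of the GraspValidator idea, the sketch below accepts a detection only if the object is large enough and close enough to the frame centre to attempt a grasp. The class interface and thresholds are assumptions for illustration, not the repository's actual implementation.

```python
class GraspValidator:
    """Toy validator: is a detected bounding box worth grasping?"""

    def __init__(self, frame_w=640, frame_h=480,
                 min_box_px=40, center_tol=0.25):
        self.frame_w = frame_w
        self.frame_h = frame_h
        self.min_box_px = min_box_px    # reject tiny / far-away objects
        self.center_tol = center_tol    # allowed offset, fraction of frame

    def is_graspable(self, box):
        """box = (x1, y1, x2, y2) in pixels; True if a grasp is plausible."""
        x1, y1, x2, y2 = box
        if (x2 - x1) < self.min_box_px or (y2 - y1) < self.min_box_px:
            return False
        cx = (x1 + x2) / 2.0
        cy = (y1 + y2) / 2.0
        off_x = abs(cx - self.frame_w / 2) / self.frame_w
        off_y = abs(cy - self.frame_h / 2) / self.frame_h
        return off_x <= self.center_tol and off_y <= self.center_tol

v = GraspValidator()
v.is_graspable((280, 180, 360, 300))   # centred, bottle-sized box -> True
v.is_graspable((0, 0, 20, 20))         # tiny, off-centre box -> False
```

A real validator would also use segmentation masks or VLM output, as described in the Features section, rather than the box geometry alone.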

Roadmap

  • Set up object detection with YOLOv11
  • Implement grasp detection using bounding boxes
  • Integrate EEG signal processing for real-time control
  • Finalize force and temperature sensor integration
  • Optimize communication with the Raspberry Pi for real-time operation
  • Expand detection to object types beyond water bottles
  • Test on real prosthetic hardware
  • Add safety features to prevent excessive grip pressure
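The planned pressure-safety behaviour could be as simple as a threshold check on the force and temperature sensors before each gripper step. The limits below are illustrative assumptions, not calibrated values.

```python
FORCE_LIMIT_N = 15.0    # assumed maximum safe grip force, newtons
TEMP_LIMIT_C = 55.0     # assumed maximum safe surface temperature, Celsius

def grip_is_safe(force_n, temp_c):
    """Return True if the gripper may keep applying pressure."""
    return force_n < FORCE_LIMIT_N and temp_c < TEMP_LIMIT_C

grip_is_safe(8.0, 22.0)    # normal bottle grip -> True
grip_is_safe(18.0, 22.0)   # excessive force -> False
```

In practice the limits would be tuned per object class, and the check would run inside the ROS2 control loop so the gripper halts within one control cycle.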

Contributors

License

This project is licensed under the Apache License 2.0. See the LICENSE file for details.

