
Vision-System-For-Robot-Arm-Control

Controls a 6-DOF robot arm via a camera-based vision system: tracks surface movements and adjusts the robot's position accordingly.


The camera system consists of two external cameras (ideally there would be more): one placed above the surface (1) and one below it, to one side (2). Together they provide 5-DOF movement estimates of the surface.
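As an illustration of how the two views could be combined into a single 5-DOF estimate, here is a minimal sketch. The split of axes between the cameras (top view contributing x, y, yaw; side view contributing z, pitch) is an assumption for illustration, not taken from the repository; the real partition depends on the calibration in /config/camera.py.

```python
def fuse_views(top_xy_yaw, side_z_pitch):
    """Merge per-camera estimates into one 5-DOF surface pose.

    top_xy_yaw   -- (x, y, yaw) from the top camera (assumed split)
    side_z_pitch -- (z, pitch) from the side camera (assumed split)
    Returns (x, y, z, yaw, pitch).
    """
    x, y, yaw = top_xy_yaw
    z, pitch = side_z_pitch
    return (x, y, z, yaw, pitch)

pose = fuse_views((1.5, -0.3, 2.0), (0.8, -1.1))
print(pose)  # (1.5, -0.3, 0.8, 2.0, -1.1)
```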

Tracking algorithm: a segmentation model (here, wound segmentation) crops the ROI, SIFT extracts features from the ROI, and FLANN matches them between frames.

Robot used: SEED S6H4D robot arm

Requirements: runs on CPU (tested on an AMD Ryzen 7 5825U with 16 GB RAM)


Demos: https://drive.google.com/drive/folders/1Gb3p0PHmpOtv9YcD3E61mrooz2xuyRO4?usp=sharing


To run the program: `python3 main.py`


To reconfigure the system, update the variables in /config/*:

  • camera.py: camera IDs, frame properties, calibration coefficients
  • model.py: model selection, threshold, post-processing
  • robot.py: robot connection, home position, movement speed
  • tracker.py: feature extraction and matching algorithms, matcher threshold, approximation function
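As a hypothetical example of what such a config module might contain (all variable names and values below are illustrative assumptions, not the repository's actual contents):

```python
# config/tracker.py -- illustrative sketch only; real names may differ
FEATURE_EXTRACTOR = "SIFT"    # algorithm applied to the cropped ROI
MATCHER = "FLANN"             # feature matcher
RATIO_THRESHOLD = 0.7         # Lowe's ratio-test threshold for good matches
APPROX_FUNCTION = "affine"    # function approximating the surface motion
```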

A communication protocol specific to the SEED robot arm is implemented in /helpers/communication.py.

The wound segmentation model is located in /model/.

If you have more or fewer cameras, change the number of threads in main.py accordingly.
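The one-thread-per-camera pattern can be sketched as follows. In the real system each thread would wrap a `cv2.VideoCapture(cam_id)` read loop; here a fake producer stands in so the sketch runs anywhere, and the camera IDs are placeholders.

```python
import threading
import queue

def capture_loop(cam_id, frames, n_frames=3):
    """Read frames from one camera and push them to a shared queue.
    Stand-in producer: emits fake frames instead of cv2.VideoCapture reads."""
    for i in range(n_frames):
        frames.put((cam_id, f"frame-{i}"))

frames = queue.Queue()
camera_ids = [0, 1]  # one thread per camera; extend this list for more cameras

threads = [threading.Thread(target=capture_loop, args=(cid, frames))
           for cid in camera_ids]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(frames.qsize())  # → 6 (2 cameras x 3 frames)
```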
