33 changes: 33 additions & 0 deletions docs/development/manipulation/overview.md
@@ -0,0 +1,33 @@
# Area Overview
RoBorregos @Home's Object Manipulation area handles the robot's physical interaction with the environment, including objects and people. This area is crucial for tasks such as picking and placing objects, handling specific items (such as a water bottle or a bag), and even interacting with people (receiving or giving objects). As such, this area leads three core technical areas of the robot:

- **2D Vision**: Collaborating with the Vision area, it is used to identify and extract information from 2D images, such as object detection and segmentation.
- **3D Perception**: Revolves around extracting environment and object information from 3D data.
- **Motion Planning**: Integrates and develops motion planning algorithms to ensure the robot can move safely and efficiently in its environment.

Given its requirements, this area also develops general capabilities useful to other areas: it provides methods for moving the arm, keeps the URDF and motion-planning configuration up to date, and collaborates on developing simulation environments.

# Node Overview

## General Nodes

### Manipulation Server
- Manages requests for pick, place, and other object interactions performed with the robot arm
- Connects to every manipulation node: object detectors, 3D perception, motion planning, and more
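
In the real system these requests arrive as ROS 2 actions/services; the sketch below only illustrates the dispatch idea with plain Python. All names (`ManipulationServer`, `register`, `handle`) are illustrative, not the node's actual API.

```python
# Minimal sketch of a manipulation server routing task requests to
# registered handlers (object detection, perception, planning, ...).
class ManipulationServer:
    def __init__(self):
        self._handlers = {}

    def register(self, task, handler):
        """Map a task name (e.g. 'pick') to the callback that serves it."""
        self._handlers[task] = handler

    def handle(self, task, **kwargs):
        """Dispatch a request; report failure for unknown tasks."""
        if task not in self._handlers:
            return {"success": False, "error": f"unknown task '{task}'"}
        return self._handlers[task](**kwargs)

server = ManipulationServer()
server.register("pick", lambda target: {"success": True, "picked": target})
result = server.handle("pick", target="water_bottle")
```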

### Object Detector 2D
- Uses computer vision models to detect and segment objects in 2D images
- Returns both 2D (image-frame) and 3D object locations
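
Going from a 2D detection to a 3D location typically means back-projecting the pixel through the camera's pinhole model using a depth measurement. A minimal sketch, with illustrative intrinsics (the real values come from the camera's calibration):

```python
def pixel_to_3d(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with depth Z (metres) to a camera-frame
    point using the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# Detection at the principal point lies on the optical axis:
point = pixel_to_3d(320, 240, 1.5, fx=600.0, fy=600.0, cx=320.0, cy=240.0)
# point -> (0.0, 0.0, 1.5)
```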

### Perception 3D
Several nodes providing capabilities such as:
- Extracting surfaces and objects
- Detecting and extracting point clouds and reconstructed meshes of objects of interest
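
Separating a point cloud into per-object clusters is commonly done by Euclidean clustering (e.g. PCL's `EuclideanClusterExtraction`). The sketch below is a simplified, O(n²) stand-in for that idea, not the nodes' actual implementation:

```python
from collections import deque

def euclidean_clusters(points, tol=0.05):
    """Flood-fill clustering: points within `tol` metres of each other
    end up in the same cluster. Returns clusters as sorted index lists."""
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        queue, cluster = deque([seed]), [seed]
        while queue:
            i = queue.popleft()
            near = [j for j in unvisited
                    if sum((a - b) ** 2 for a, b in zip(points[i], points[j])) <= tol ** 2]
            for j in near:
                unvisited.remove(j)
                queue.append(j)
                cluster.append(j)
        clusters.append(sorted(cluster))
    return clusters

pts = [(0, 0, 0), (0.01, 0, 0), (1, 1, 1), (1.02, 1, 1)]
clusters = euclidean_clusters(pts, tol=0.05)  # two clusters of two points each
```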

### Place Server
- Incorporates algorithms to determine the best position for placing objects
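
One simple heuristic for choosing a placement is to score candidate positions on the support surface by their clearance from already-placed objects. This is an illustrative sketch under that assumption, not the server's actual algorithm:

```python
def best_place_position(candidates, obstacles, min_clearance=0.05):
    """Return the candidate (x, y) farthest from the nearest obstacle,
    rejecting spots with less than `min_clearance` metres of clearance."""
    def clearance(c):
        if not obstacles:
            return float("inf")
        return min(((c[0] - o[0]) ** 2 + (c[1] - o[1]) ** 2) ** 0.5 for o in obstacles)
    viable = [c for c in candidates if clearance(c) >= min_clearance]
    return max(viable, key=clearance) if viable else None

spot = best_place_position(
    candidates=[(0.0, 0.0), (0.2, 0.0), (0.4, 0.0)],
    obstacles=[(0.0, 0.05)],
)
# spot -> (0.4, 0.0), the most open candidate
```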

### Pick and Place Server
- Calls and manages motion planning algorithms to pick and place objects
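
A pick or place request decomposes into an ordered sequence of motion stages, each of which would call the motion-planning backend (e.g. MoveIt 2) in the real node. The stage names and pipeline runner below are illustrative:

```python
# Hypothetical stage sequences for pick and place requests.
PICK_STAGES = ["approach", "open_gripper", "grasp", "close_gripper", "retreat"]
PLACE_STAGES = ["approach", "lower", "open_gripper", "retreat"]

def run_pipeline(stages, execute):
    """Run stages in order; abort and report the failing stage on error.
    `execute` stands in for a call into the motion-planning backend."""
    for stage in stages:
        if not execute(stage):
            return {"success": False, "failed_stage": stage}
    return {"success": True, "failed_stage": None}

# Simulated executor whose grasp fails, e.g. because no plan was found:
result = run_pipeline(PICK_STAGES, lambda stage: stage != "grasp")
# result -> {"success": False, "failed_stage": "grasp"}
```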

---
32 changes: 32 additions & 0 deletions docs/development/manipulation/spotlights.md
@@ -0,0 +1,32 @@
# Weekly Spotlights

This page is a collection of weekly spotlights that highlight the progress of the Object Manipulation team. Each spotlight is a summary of the work done by the team in a week.

## 2025-02-27

### News
- Welcoming new team member: Ricardo Guerrero
- New team for the February-May period:
* Iván Romero Wells
* José Luis Dominguez
* David Vázquez
* Alexis Chapa
* Alejandro González
* Ricardo Guerrero
* Gerardo Fregoso
* Yair Reyes
* Emiliano Flores

This is the largest team we have had so far, with 9 members.

### Done
- Table/Surface extraction migrated
- MoveIt 2 Python interface integrated within the subtask manager

### In Progress
- Object 2D detection and extraction
- Object 3D extraction -> Clustering and mesh reconstruction
- Pick and Place server for motion planning

### Notes
- A pick and place demo has been scheduled for March 15th, which will mark the start of the next phase of the project, involving new motion planning methods and accelerated 3D perception.