Home
Welcome to the shared_autonomy_perception wiki!
This repo provides a framework designed for testing shared autonomy for perception in the context of manipulation. Thus, in addition to an implementation of Grabcut3D and human-machine interfaces, it includes a state machine that performs a simple table-clearing task (using MoveIt! for the manipulation parts). It also has the tools required for comparing fully autonomous ORK cluster-based segmentation with grabcut's human-assisted result.
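The table-clearing loop the state machine implements can be sketched as a simple segment-pick-place cycle. This is an illustrative sketch in plain Python: the function and parameter names are hypothetical, and the real implementation is built on ROS actions and MoveIt!, not callbacks like these.

```python
def clear_table(objects, segment, pick, place, max_attempts=3):
    """Illustrative segment -> pick -> place loop (not the repo's actual API).

    segment(objects): returns the next object to grasp, or None when the
                      table is clear (e.g. a grabcut or ORK segmentation step)
    pick/place:       return True on success (MoveIt! would do this work)
    """
    cleared = []
    attempts = 0
    while attempts < max_attempts:
        target = segment(objects)
        if target is None:          # nothing left to segment: table is clear
            break
        if pick(target) and place(target):
            objects.remove(target)
            cleared.append(target)  # object successfully moved off the table
            attempts = 0
        else:
            attempts += 1           # retry by re-segmenting, up to max_attempts
    return cleared
```

On failure the loop re-segments rather than retrying the same grasp, which is one plausible retry policy; the actual state machine's transitions may differ.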
The following diagram provides an overview of how the different pieces fit together.
The following setup instructions assume a PR2, but should be easily adaptable to any robot controlled by MoveIt! with a kinect for perception.
- Install MoveIt!: `sudo apt-get install ros-hydro-moveit*`
- Install PR2-specific packages: `sudo apt-get install ros-hydro-pr2*`
- The rest assumes that your hydro workspace is `~/catkin_hydro`
- Download the shared_autonomy_perception repo (master branch): `cd ~/catkin_hydro/src; git clone https://github.com/SharedAutonomyToolkit/shared_autonomy_perception.git`
- Download the cluster_grasp_planner (extracted from arm_navigation, released as a single package), hydro-devel branch: `cd ~/catkin_hydro/src; git clone https://github.com/bosch-ros-pkg/cluster_grasp_planner.git; cd cluster_grasp_planner; git checkout hydro-devel`
- Install ORK, following the directions here
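The original instructions do not spell out the build step, but after cloning, a catkin workspace typically needs to be built and sourced. A sketch of the usual workflow, assuming the `~/catkin_hydro` workspace from above and a standard ROS Hydro install under `/opt/ros/hydro`:

```shell
# Build the hydro workspace after cloning the packages above.
source /opt/ros/hydro/setup.bash
cd ~/catkin_hydro
catkin_make               # builds shared_autonomy_perception and cluster_grasp_planner
source devel/setup.bash   # make the newly built packages visible to rosrun/roslaunch
```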
- On the robot:
  - `roslaunch shared_autonomy_launch robot.launch` (a slightly modified version of the `/etc/ros/robot.launch` file that starts the head-mounted kinect, but not any of the other cameras or the EKF localization)
  - `roslaunch pr2_moveit_config move_group.launch`
  - `roslaunch pr2_moveit_config moveit_rviz.launch`
- `roslaunch shared_autonomy_launch clear_table_unified.launch use_grabcut:=true use_im:=false`
  This launches the rest of the nodes required for clearing the table. Use the command-line arguments to switch between shared autonomy using grabcut (`use_grabcut:=true`) and fully autonomous operation using ORK (`use_grabcut:=false`). You can also switch between the Interactive Marker interface to grabcut (`use_im:=false`) and the JavaScript one (`use_im:=true`). If using Interactive Markers, be sure to add them to your rviz display. If using JavaScript, open this file in your browser: `shared_autonomy_perception/js_hmi/pages/interactive_segmentation_interfaces.html`
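Comparing ORK's autonomous segmentation against grabcut's human-assisted result comes down to comparing two binary masks, and intersection-over-union is a standard overlap metric for that. A minimal sketch in plain Python, illustrative only and not the repo's actual evaluation code:

```python
def mask_iou(mask_a, mask_b):
    """Intersection-over-union of two binary masks (2D lists of 0/1).

    1.0 means the two segmentations agree exactly; 0.0 means no overlap.
    """
    intersection = 0
    union = 0
    for row_a, row_b in zip(mask_a, mask_b):
        for a, b in zip(row_a, row_b):
            intersection += a & b
            union += a | b
    return intersection / union if union else 1.0  # two empty masks agree trivially
```

For example, an ORK mask and a grabcut mask that differ in a single pixel out of four foreground pixels would score 0.75.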
Several of the packages are obsolete:
- augmented_object_selection
- bosch_object_segmentation
- image_segmentation_demo
Others just contain example code:
- random_snippets
- webtools
