This repository contains simulation models and corresponding motion planning and control demos for the xArm6 robotic manipulator from UFACTORY. There are five modules:
- aruco_calibration
- perception_etflab
- real_bringup
- sim_bringup
- RPMPLv2 (https://github.com/roboticsETF/RPMPLv2.git)
The development and test environment is Ubuntu 22.04 + ROS2 Humble.
The paper describing the implementation details of the proposed real-time DRGBT algorithm is available at link1-to-paper or link2-to-paper.
Choose a location for a target workspace, e.g.:
cd ~/
and then clone the repository:
git clone https://github.com/roboticsETF/xarm6-etf-lab.git
2.2 Install required packages and dependencies, update "xarm6-etf-lab" repository and its submodules, and then build the full container
cd ~/xarm6-etf-lab # Not required if you are cloning through IDE (e.g., VS Code)
make full_build_container
Please upload the file xarm_user_params.yaml to the folder xarm6-etf-lab/src/external_modules/xarm_ros2/xarm_api/config.
Optionally, if you want to change the robot's initial configuration (all joint angles are set to zero by default), open the file ~/xarm6-etf-lab/src/external_modules/gazebo_ros2_control/gazebo_ros2_control/src/gazebo_system.cpp and locate the function GazeboSystem::registerJoints. Then, find the following code line:
initial_position = get_initial_value(joint_info.state_interfaces[i]);
Replace that line with the following code:
if (j == 3)          // e.g., set the fourth joint (index 3) to M_PI [rad]
    initial_position = M_PI;
else if (j == 4)     // e.g., set the fifth joint (index 4) to M_PI_2 [rad]
    initial_position = M_PI_2;
else
    initial_position = get_initial_value(joint_info.state_interfaces[i]);
Similarly, you can set any desired angle for the j-th joint.
After this, rebuild the "xarm6-etf-lab" repository:
make build
source source_dirs.bash
First, type:
cd ~/xarm6-etf-lab
Then, launch the robot using one of the following four options:
ros2 launch sim_bringup static_xarm6_etflab.launch.py
# or just type:
make sim_static
Obstacles are static and are defined in the world/etflab.world file from the sim_bringup library (e.g., see the example in world/etflab1.world). After the point clouds from the two cameras (left and right) are combined, the combined point cloud is read from the pointcloud_combined topic. Then, after segmentation, the obstacles are published at a rate of approximately 5 [Hz] to the objects_cloud topic. All configuration settings can be set within the perception node in the yaml file /perception_etflab/data/real_perception_etflab_config.yaml from the perception_etflab library. Use this option if you want to simulate camera readings together with their uncertainties.
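For illustration, a planner node can consume the segmented obstacles directly from the objects_cloud topic. The following minimal rclcpp listener is only a sketch (it is not part of the repository; the QoS profile and the message handling in the actual planner nodes may differ):

#include <rclcpp/rclcpp.hpp>
#include <sensor_msgs/msg/point_cloud2.hpp>

// Minimal illustrative listener for the segmented obstacles published by the perception node.
int main(int argc, char **argv)
{
    rclcpp::init(argc, argv);
    auto node = rclcpp::Node::make_shared("obstacles_listener");
    auto sub = node->create_subscription<sensor_msgs::msg::PointCloud2>(
        "objects_cloud", rclcpp::SensorDataQoS(),
        [&node](const sensor_msgs::msg::PointCloud2::SharedPtr msg)
        {
            // width * height is the number of points representing the obstacles
            RCLCPP_INFO(node->get_logger(), "Received %u obstacle points", msg->width * msg->height);
        });
    rclcpp::spin(node);
    rclcpp::shutdown();
    return 0;
}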
ros2 launch sim_bringup dynamic_xarm6_etflab.launch.py
# or just type:
make sim_dynamic
Obstacles are randomly generated and dynamic (they move within the robot's workspace). The law of their motion is defined in the perception_etflab/src/environments/Obstacles.cpp file. The point cloud representing the obstacles is generated within the code, and the corresponding obstacles are then published at a rate of 1/period [Hz] to the objects_cloud topic. All configuration settings (including period) can be set within the perception and random_obstacles nodes in the yaml file /perception_etflab/data/sim_perception_etflab_config.yaml from the perception_etflab library.
Note that if static obstacles are defined in the world/etflab.world file from the sim_bringup library, they will be shown, but NOT considered during planning! This option does not simulate camera readings; instead, it assumes that the information about all obstacles is already available.
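As an illustration of this publishing pattern (this is not the actual implementation from perception_etflab/src/environments/Obstacles.cpp), the sketch below moves a single point obstacle along the y-axis and publishes it to the objects_cloud topic at a rate of 1/period [Hz]; the frame name, period value and motion law used here are arbitrary example values:

#include <chrono>
#include <cmath>
#include <functional>
#include <rclcpp/rclcpp.hpp>
#include <sensor_msgs/msg/point_cloud2.hpp>
#include <sensor_msgs/point_cloud2_iterator.hpp>

// Illustrative publisher of a single moving point obstacle (example only).
class RandomObstaclesSketch : public rclcpp::Node
{
public:
    RandomObstaclesSketch() : Node("random_obstacles_sketch")
    {
        pub_ = create_publisher<sensor_msgs::msg::PointCloud2>("objects_cloud", 10);
        timer_ = create_wall_timer(std::chrono::duration<double>(period_),
                                   std::bind(&RandomObstaclesSketch::publishObstacle, this));
    }

private:
    void publishObstacle()
    {
        t_ += period_;
        sensor_msgs::msg::PointCloud2 cloud;
        cloud.header.frame_id = "world";      // assumed fixed frame
        cloud.header.stamp = now();
        sensor_msgs::PointCloud2Modifier modifier(cloud);
        modifier.setPointCloud2FieldsByString(1, "xyz");
        modifier.resize(1);                   // a single obstacle point, for simplicity
        sensor_msgs::PointCloud2Iterator<float> x(cloud, "x"), y(cloud, "y"), z(cloud, "z");
        *x = 0.5f;
        *y = static_cast<float>(0.3 * std::sin(t_));   // example motion law: oscillation along y
        *z = 0.2f;
        pub_->publish(cloud);
    }

    rclcpp::Publisher<sensor_msgs::msg::PointCloud2>::SharedPtr pub_;
    rclcpp::TimerBase::SharedPtr timer_;
    double period_ = 0.1;   // [s] -> publishing rate of 1/period = 10 [Hz] (example value)
    double t_ = 0.0;
};

int main(int argc, char **argv)
{
    rclcpp::init(argc, argv);
    rclcpp::spin(std::make_shared<RandomObstaclesSketch>());
    rclcpp::shutdown();
    return 0;
}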
ros2 launch sim_bringup xarm6_moveit_gazebo.launch.py
ros2 launch xarm_moveit_config xarm6_moveit_realmove.launch.py robot_ip:=192.168.1.236 [add_gripper:=true]
Note: For each test file from the apps folder, there is a corresponding yaml file within the data folder, where all necessary configurations can be set.
# In a new tab, type:
cd ~/xarm6-etf-lab
# In C++
ros2 run sim_bringup test_move_xarm6
# In Python
ros2 run sim_bringup test_move_xarm6.py
# In a new tab, type:
cd ~/xarm6-etf-lab
ros2 run sim_bringup test_planners
Sim_demo2.mp4
# In a new tab, type:
cd ~/xarm6-etf-lab
ros2 run sim_bringup test_task_planning
# In a new tab, type:
cd ~/xarm6-etf-lab
# DRGBT planner:
ros2 run sim_bringup test_dynamic_planning
# RRTx planner:
ros2 run sim_bringup test_dynamic_planning2
Sim_demo4.mp4
Sim_demo4_v2.mp4
# In a new tab, type:
cd ~/xarm6-etf-lab
ros2 run sim_bringup test_dynamic_task_planning
For more details about these scenarios and for a comparison of DRGBT with the RRTx and MARS methods, see the video.
First, launch the cameras:
cd ~/xarm6-etf-lab
ros2 launch real_bringup realsense_etflab.launch.py
# or just type:
make cameras
Two cameras scan the environment, and after the point clouds from the cameras (left and right) are combined, the combined point cloud is read from the pointcloud_combined topic. Then, after segmentation, the obstacles are published at a rate of approximately 25 [Hz] to the objects_cloud topic. These topics are created in the perception_etflab library.
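As a quick sanity check, you can verify that the perception pipeline is running at the expected rates using the standard ROS2 CLI (assuming the topics live in the root namespace):
ros2 topic hz /pointcloud_combined
ros2 topic hz /objects_cloud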
Note: For each of the following test files (located in the apps folder), there is a corresponding yaml file within the data folder, where all necessary configurations can be set.
First, launch the robot:
cd ~/xarm6-etf-lab
ros2 launch real_bringup real_xarm6_etflab.launch.py
# or just type:
make real
# In a new tab, type:
cd ~/xarm6-etf-lab
# In C++
ros2 run real_bringup test_move_xarm6
# In Python
ros2 run real_bringup test_move_xarm6.py
Within the function real_bringup::MoveXArm6Node::moveXArm6Callback(), you can test different robot modes as explained here using xarm_api from here.
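For illustration only, switching between robot modes usually amounts to calling the corresponding xarm_api services. The sketch below assumes the service names /xarm/set_mode and /xarm/set_state of type xarm_msgs::srv::SetInt16, as well as example mode/state values; check the xarm_api documentation for the exact names and values valid for your setup:

#include <memory>
#include <rclcpp/rclcpp.hpp>
#include <xarm_msgs/srv/set_int16.hpp>

// Illustrative sketch: set an example robot mode and state via xarm_api services.
// Service names and the meaning of the numeric values are assumptions -- verify them
// against the xarm_api documentation before use.
int main(int argc, char **argv)
{
    rclcpp::init(argc, argv);
    auto node = rclcpp::Node::make_shared("set_mode_sketch");
    auto set_mode = node->create_client<xarm_msgs::srv::SetInt16>("/xarm/set_mode");
    auto set_state = node->create_client<xarm_msgs::srv::SetInt16>("/xarm/set_state");

    auto mode_request = std::make_shared<xarm_msgs::srv::SetInt16::Request>();
    mode_request->data = 1;                          // example mode value
    set_mode->wait_for_service();
    auto mode_result = set_mode->async_send_request(mode_request);
    rclcpp::spin_until_future_complete(node, mode_result);

    auto state_request = std::make_shared<xarm_msgs::srv::SetInt16::Request>();
    state_request->data = 0;                         // example state value
    set_state->wait_for_service();
    auto state_result = set_state->async_send_request(state_request);
    rclcpp::spin_until_future_complete(node, state_result);

    rclcpp::shutdown();
    return 0;
}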
Real_demo1.mp4
# In a new tab, type:
cd ~/xarm6-etf-lab
ros2 run real_bringup test_demo1
Real_demo2.mp4
# In a new tab, type:
cd ~/xarm6-etf-lab
ros2 run real_bringup test_demo2
Real_demo3.mp4
# In a new tab, type:
cd ~/xarm6-etf-lab
ros2 run real_bringup test_demo3
Real_demo4.mp4
# In a new tab, type:
cd ~/xarm6-etf-lab
ros2 run real_bringup test_planners
Real_demo5.mp4
# In a new tab, type:
cd ~/xarm6-etf-lab
ros2 run real_bringup test_task_planning
Real_demo6.mp4
First, open the file xarm6-etf-lab/src/external_modules/xarm_ros2/xarm_controller/config/xarm6_controllers.yaml. It is recommended to set controller_manager/ros__parameters/update_rate to 500 [Hz] and xarm6_traj_controller/ros__parameters/state_publish_rate to 250 [Hz].
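With these recommendations, the relevant part of xarm6_controllers.yaml would look roughly as follows (only the affected entries are shown; the rest of the file stays unchanged):
controller_manager:
  ros__parameters:
    update_rate: 500              # [Hz]

xarm6_traj_controller:
  ros__parameters:
    state_publish_rate: 250.0     # [Hz]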
Note: For each of the following test files (located in the apps folder), there is a corresponding yaml file within the data folder, where all necessary configurations can be set.
Now, launch the robot:
cd ~/xarm6-etf-lab
ros2 launch real_bringup real_xarm6_etflab.launch.py
# or just type:
make real
# In a new tab, type:
cd ~/xarm6-etf-lab
ros2 run real_bringup test_dynamic_planning
Scenario1.mp4
Scenario2.mp4
# In a new tab, type:
cd ~/xarm6-etf-lab
ros2 run real_bringup test_dynamic_task_planning
Scenario3.mp4
Scenario4.mp4
Scenario5.mp4
For more details about these scenarios, see the video.