
3. Software Setup

Raf edited this page May 28, 2019 · 7 revisions

Software

Software is by far the most tedious part of this whole setup. Everything involved in setting up the software and getting it to run is written here; later sections of the wiki explain how to modify the software and tune everything correctly. If you would like, you can skip ahead and do the tuning now, or come back to it afterwards.

Installing The Workspace

The first thing you'll want to do is cd into your SSD (in my case, cd /xavier_ssd/) and clone the repo there. To do this, run git clone https://github.com/Rafcin/Echo-Rover.git and wait for it to download.
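The two steps above look like this in a terminal (the /xavier_ssd/ mount point is specific to my setup; substitute the path of your own SSD):

```shell
# Move onto the SSD so the workspace lives on fast storage
cd /xavier_ssd/

# Clone the project; this creates /xavier_ssd/Echo-Rover
git clone https://github.com/Rafcin/Echo-Rover.git
```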

Next you need to build the whole workspace, so run catkin build, which builds every package in the workspace. Keep in mind that instead of catkin_make, which most people use, we use catkin build because it works much better and tells you more about what you are compiling. If you want to know more about catkin build, refer to the catkin_tools documentation and read the cheat sheet.

Once the project has built and compiled correctly, you can launch the system. Any time you open a new terminal, run source devel/setup.bash from the project home directory, and you're good to go to launch your files.
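Putting the build and source steps together, a fresh terminal session looks like this (the workspace path is assumed from the clone step above):

```shell
# Go to the workspace root
cd /xavier_ssd/Echo-Rover

# Build all packages; catkin build prints per-package status as it goes
catkin build

# Overlay the built workspace onto this shell so roslaunch can find the packages
source devel/setup.bash
```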

To start the robot, run roslaunch edgebot_bringup main.launch; this file launches the whole system. It includes all the other launch files that activate the appropriate nodes, and if you open it you will find it is laid out neatly so you can simply enable and disable certain nodes.

  <!-- ON|OFF Rtabmap -->
  <arg name="rtab"         default="true"  />
  <!-- ON|OFF Motors -->
  <arg name="motors"       default="true"  />
  <!-- ON|OFF Teleop -->
  <arg name="tele"         default="true"  />
  <!-- ON|OFF Move -->
  <arg name="move"         default="true"  />
  <!-- ON|OFF Exploration -->
  <arg name="explore"      default="false"  />
  <!-- ON|OFF Zed -->
  <arg name="zed"          default="false"  />
  <!-- Find Objects -->
  <arg name="findobjects"  default="true" />
  <!-- Explore Lite -->
  <arg name="exlite"       default="true" />
  <!-- RPLIDAR ON|OFF -->
  <arg name="lidar"        default="true" />
  <!-- OBJD ON|OFF -->
  <arg name="objd"         default="false" />
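Because these toggles are ordinary roslaunch args, you can also flip any of them from the command line with roslaunch's arg:=value syntax instead of editing the file:

```shell
# Launch the full system with the defaults from the launch file
roslaunch edgebot_bringup main.launch

# Same launch, but with exploration disabled and the ZED node enabled
roslaunch edgebot_bringup main.launch explore:=false zed:=true
```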

RTABMAP and ZED

In this whole project your best friend is the Stereolabs ZED camera: it provides the odometry info you need to move and creates the map as well. In this segment we will be talking about using the ZED with RTABMAP, a program written by Matlabbe which can be found at the RTABMAP GitHub page.

RTABMAP (Real-Time Appearance-Based Mapping) is a SLAM system that uses a bag-of-words approach to figure out whether the latest image comes from a new location or one it has seen before. In this project we use RTABMAP with the ZED camera because its loop closure is much better than other systems like Hector, Gmapping, or even Cartographer (with some exceptions for Cartographer), and it rarely loses tracking in large environments. The configuration we use for RTAB is as follows.

 <group ns="rtabmap">
   
    <node name="rtabmap" pkg="rtabmap_ros" type="rtabmap" output="screen" args="--delete_db_on_start">
          <param name="frame_id" type="string" value="base_link"/>

          <param name="subscribe_depth" type="bool" value="true"/>
          <param name="subscribe_rgbd" type="bool" value="false"/>
          <param name="subscribe_scan" type="bool" value="false"/>
          <remap from="odom" to="/$(arg zed_namespace)/$(arg zed_node_name)/odom"/>
          <remap from="scan" to="/scan"/>
          <remap from="rgb/image"       to="/$(arg zed_namespace)/$(arg zed_node_name)/rgb/image_rect_color"/>
          <remap from="depth/image"     to="/$(arg zed_namespace)/$(arg zed_node_name)/depth/depth_registered"/>
          <remap from="rgb/camera_info" to="/$(arg zed_namespace)/$(arg zed_node_name)/rgb/camera_info"/>

          <param name="approx_sync"       value="true"/> 

          <param name="queue_size" type="int" value="5"/>

          <param name="use_action_for_goal" type="bool" value="true"/>
          <param name="map_negative_poses_ignored" type="bool" value="true"/>

          <param name="RGBD/ProximityBySpace"          type="string" value="true"/>  <!-- Local loop closure detection (using estimated position) with locations in WM -->
          <param name="RGBD/OptimizeFromGraphEnd"      type="string" value="true"/>  <!-- Set to false to generate map correction between /map and /odom -->
          <param name="Kp/MaxDepth"                    type="string" value="12.0"/>
          <param name="Reg/Strategy"                   type="string" value="1"/>     <!-- Loop closure transformation: 0=Visual, 1=ICP, 2=Visual+ICP -->
          <param name="Icp/CorrespondenceRatio"        type="string" value="0.3"/>
          <param name="Vis/MinInliers"                 type="string" value="15"/>    <!-- 3D visual words minimum inliers to accept loop closure -->
          <param name="Vis/InlierDistance"             type="string" value="0.1"/>   <!-- 3D visual words correspondence distance -->
          <param name="RGBD/AngularUpdate"             type="string" value="0.1"/>   <!-- Update map only if the robot is moving -->
          <param name="RGBD/LinearUpdate"              type="string" value="0.1"/>   <!-- Update map only if the robot is moving -->
          <param name="RGBD/ProximityPathMaxNeighbors" type="string" value="0"/>
          <param name="Rtabmap/TimeThr"                type="string" value="0"/>
          <param name="Mem/RehearsalSimilarity"        type="string" value="0.30"/>
          <param name="Reg/Force3DoF"                  type="string" value="true"/>
          <param name="GridGlobal/MinSize"             type="string" value="20"/>
    </node>
  </group>
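After a mapping run you can inspect what RTABMAP recorded with the databaseViewer tool that ships with it. Note that the node above is started with --delete_db_on_start, so the database is wiped on every launch; the ~/.ros/rtabmap.db path below is RTABMAP's default and may differ if your launch file overrides it:

```shell
# Open the last mapping session's database to replay images, loop closures, and the map
rtabmap-databaseViewer ~/.ros/rtabmap.db
```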

EDGE TPU & ZED DETECTION

In this project you have the option to use Google's Edge TPU. This device was created to run TensorFlow Lite models efficiently at high speed; in this case we use it to speed up our detections with the ZED camera.
