Now you should be able to place a *2D Goal Pose* somewhere on the map in Rviz. The robot should plan, visualize, and drive a route to this goal pose. If you have a prebuilt map and run without SLAM, **AMCL (Adaptive Monte Carlo Localization)** is used to localize the robot on the known map.
:::{note}
Don't forget to reset the E-STOP before the Panther can drive:
```bash
ros2 service call /panther/hardware/e_stop_reset std_srvs/srv/Trigger {}
```
:::
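
For the map-based (AMCL) setup mentioned above, localization can be started roughly as shown below. This is an illustrative sketch using the standard `nav2_bringup` launch file; the map path is a placeholder and the actual RCDT launch setup may differ:

```bash
# Illustrative only: start Nav2 localization (map server + AMCL) against an existing map.
# The map path and launch file are assumptions; the RCDT launch setup may differ.
ros2 launch nav2_bringup localization_launch.py map:=/path/to/map.yaml

# Then give AMCL an initial pose estimate, e.g. with the "2D Pose Estimate" tool in Rviz.
```
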
## Nav2 Choices and Configuration
The Panther uses the [Nav2](https://navigation.ros.org/) stack for navigation. Most parameters remain at their defaults, but a few key choices were made based on experiments with the real robot (see the configuration sketch after this list):
* **Controller:** `vector_pursuit` – provided the most reliable path following in practice.
* **Planner:** `SmacPlannerHybrid` – produced the most robust global plans during testing.
* **Recovery:** uses Nav2’s default behaviors (*clear costmaps, spin, wait, backup*), with an extended **progress timeout of 40 s** so the robot won’t abort immediately when blocked by dynamic obstacles (e.g. vehicles).
* **Costmaps:** in the global costmap, both `publish_frequency` and `update_frequency` are set to **30 Hz** for smoother obstacle updates on the real robot.
* **Waypoint follower:** a custom `waypoint_follower_controller` node was developed to follow waypoints provided by, for example, a UI.
* **Collision Monitor:** filters velocity commands based on the robot’s current motion. It slows down when obstacles enter a rectangular slow-down zone and uses velocity-dependent stop polygons (forward, backward, rotation, idle) to enforce a full stop if an obstacle blocks its path.
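
The choices above translate roughly to a Nav2 parameter file along these lines. This is a minimal sketch, not the repository's actual configuration: the plugin names follow the upstream `vector_pursuit_controller` and `nav2_smac_planner` documentation, and mapping the 40 s progress timeout onto the progress checker's `movement_time_allowance` is an assumption.

```yaml
# Illustrative sketch only -- not the repository's actual parameter file.
controller_server:
  ros__parameters:
    controller_plugins: ["FollowPath"]
    progress_checker_plugin: "progress_checker"
    progress_checker:
      plugin: "nav2_controller::SimpleProgressChecker"
      # Extended progress timeout so the robot does not abort immediately
      # when blocked by dynamic obstacles (assumed mapping of the 40 s value).
      movement_time_allowance: 40.0
    FollowPath:
      # Vector Pursuit path-following controller (upstream plugin name).
      plugin: "vector_pursuit_controller::VectorPursuitController"

planner_server:
  ros__parameters:
    planner_plugins: ["GridBased"]
    GridBased:
      # Hybrid-A*-based global planner from nav2_smac_planner.
      plugin: "nav2_smac_planner::SmacPlannerHybrid"

global_costmap:
  global_costmap:
    ros__parameters:
      # Higher frequencies for smoother obstacle updates on the real robot.
      update_frequency: 30.0
      publish_frequency: 30.0
```
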
During experimentation it became clear that at least Ethernet-speed connections are required to handle pointcloud data without dropping messages. The current high-level architecture when using a LiDAR is as follows:

A major advantage of this architecture is that it allows for fast and easy development and straightforward inspection of any ROS topics using Rviz and/or ROSBoard. However, it makes it impossible to operate the robot over Wi-Fi, making a cable mandatory. This imposes obvious limitations for use cases where further autonomy is required.
One proposed alternative setup would be to move the RCDT docker container to the onboard ThinkStation. This would eliminate the need to broadcast pointcloud data over the network, instead keeping it all within the robot itself.
A downside to this approach, however, is that live inspection of data generated by the robot becomes more complex. Lower-bandwidth data such as a generated map or camera images could still easily be consumed by a simpler "monitor" docker container that only needs to run Rviz and/or ROSBoard.
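
As an illustration of that split, such a "monitor" container could look roughly like the compose service below. The image name is purely hypothetical and the X11 setup assumes a Linux host; treat this as a sketch of the idea rather than a tested configuration:

```yaml
# Hypothetical docker-compose service for a lightweight "monitor" container.
# Image name and setup are assumptions; only low-bandwidth topics are consumed here.
services:
  monitor:
    image: rcdt/monitor:latest        # hypothetical image containing Rviz and/or ROSBoard
    network_mode: host                # share DDS traffic with the robot's network
    environment:
      - DISPLAY=${DISPLAY}            # forward the host display for the Rviz GUI
      - ROS_DOMAIN_ID=${ROS_DOMAIN_ID:-0}
    volumes:
      - /tmp/.X11-unix:/tmp/.X11-unix # X11 socket for the Rviz GUI
    command: rviz2
```
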
Additionally, the RCDT docker could be extended to run Foxglove. One of the promising features of Foxglove is that it supports 3D data visualizations in the browser and [supports compression of topics](https://docs.foxglove.dev/docs/connecting-to-data/ros-foxglove-bridge). This compression appears to be lossless, but it has not been explored in detail yet. [A simple intro to Foxglove with some basic examples is provided in this blogpost](https://foxglove.dev/blog/installing-ros2-on-macos-with-docker). It is also worth noting that Foxglove requires certain ports to be exposed in the docker container, [similar to what is discussed here](https://www.reddit.com/r/docker/comments/1aq8v9b/ports_in_docker_and_docker_compose/). Finally, Foxglove also provides a level of [integration with ROSBoard](https://discourse.ros.org/t/introducing-foxglove-integration-with-rosboard-for-real-time-visualizations/38376), allowing for some reuse of the ROSBoard features we have already set up.
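
As a rough sketch of what this would involve: 8765 is the default WebSocket port of `foxglove_bridge`, while the image name below is purely hypothetical.

```bash
# Expose the foxglove_bridge WebSocket port when starting the container
# (8765 is the foxglove_bridge default; the image name is hypothetical).
docker run -p 8765:8765 rcdt/panther:latest

# Inside the container, start the bridge so Foxglove can connect via ws://<robot-ip>:8765
ros2 launch foxglove_bridge foxglove_bridge_launch.xml port:=8765
```
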
Another approach could be to [create a (lossy) compressed topic](https://discourse.ros.org/t/compressed-pointcloud2/10616) for pointcloud data that would only serve inspection purposes, not SLAM/Nav2 purposes. This would require further investigation into the current availability of such topics, though, since the linked thread is a few years old.
It also bears mentioning that [Husarion provides a web UI](https://husarion.com/manuals/panther/software/ros2/robot-management/#webui), but it appears to be poorly maintained. Additionally, it requires snap packages, which are tricky to install in a dockerized environment.