Description
Hello @Pandoro,
I am trying to use the DR-SPAAM ROS code and I am having some issues with the visualization of the detections.
Namely, they look like this:

The purple circles representing detections seem to be misaligned.
The visualizations that are created during eval and training seem to work fine.

What drew my attention is this piece of code in the ROS visualization method, i.e. detections_to_rviz_marker:
```python
# circle
r = 0.4
ang = np.linspace(0, 2 * np.pi, 20)
xy_offsets = r * np.stack((np.cos(ang), np.sin(ang)), axis=1)
```
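To make the comparison concrete, this is how I read that snippet: the offsets trace a circle of radius 0.4 sampled at 20 angles, and I assume they are then shifted by each detection's (x, y) when the marker points are built. A minimal sketch with my own names, not the repo's exact code:

```python
import numpy as np

def detection_circle_points(det_xy, r=0.4, num_pts=20):
    # Outline of a circle of radius r centered on one detection (x, y).
    # num_pts controls how many points are used to draw the circle outline.
    # Function name and signature are mine, for illustration only.
    ang = np.linspace(0, 2 * np.pi, num_pts)
    xy_offsets = r * np.stack((np.cos(ang), np.sin(ang)), axis=1)
    return np.asarray(det_xy) + xy_offsets  # shape (num_pts, 2)
```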
This is different from the way the x, y coordinates are obtained in the train and eval code, i.e.:
```python
def rphi_to_xy(r, phi):
    return r * np.cos(phi), r * np.sin(phi)
```
with phi defined as:
```python
def get_laser_phi(angle_inc=np.radians(0.5), num_pts=552):
    laser_fov = (num_pts - 1) * angle_inc
    return np.linspace(-laser_fov * 0.5, laser_fov * 0.5, num_pts)
```
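For comparison, this is how I understand the train/eval conversion being used end to end, calling the two helpers above (the variable names and the fake scan are mine, just for illustration):

```python
import numpy as np

# Fake ranges for illustration; a real scan would come from the sensor.
scan_r = np.random.uniform(0.5, 10.0, size=552)
phi = get_laser_phi()             # 552 beam angles spanning the full FOV
xs, ys = rphi_to_xy(scan_r, phi)  # per-beam (x, y) in the laser frame
```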
From what I can see, the ROS viz code uses a lower resolution of only 20 scan points and a hardcoded r value.
Do you think this could be the cause of the misaligned visualization?
Thank you