Hi, guys! I have found that there are two ways in the current repository (e.g., in bench2drive_drivetransformer_dataset.py) to compute the world2lidar transformation matrix, but they are not equal.
1st way:
lidar2ego = cur_frame['sensors']['LIDAR_TOP']['lidar2ego']
ego2world = np.eye(4)
ego2world[0:2,3] = cur_frame['ego_translation'][0:2]
ego2world[0:3,0:3] = Quaternion(axis=[0, 0, 1], radians=cur_frame['ego_yaw']).rotation_matrix
lidar2world = ego2world @ lidar2ego
world2lidar_lidar_cur = self.invert_pose(lidar2world)
2nd way:
world2lidar_lidar_cur = cur_frame['sensors']['LIDAR_TOP']['world2lidar']
However, there is some error between the two, which I think is fairly large (around 0.6 m). Below is an example of the difference I computed for one frame:
Can you explain more about this?
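For reference, here is a minimal, self-contained sketch of the comparison I am doing. It assumes cur_frame has the fields shown above, that invert_pose is a plain rigid-transform inverse, and that compare_world2lidar is just a hypothetical helper name:

import numpy as np
from pyquaternion import Quaternion

def invert_pose(T):
    """Invert a 4x4 rigid-body transform (rotation + translation only assumed)."""
    R = T[:3, :3]
    t = T[:3, 3]
    T_inv = np.eye(4)
    T_inv[:3, :3] = R.T
    T_inv[:3, 3] = -R.T @ t
    return T_inv

def compare_world2lidar(cur_frame):
    """Compute world2lidar both ways and report the translation gap."""
    lidar2ego = np.asarray(cur_frame['sensors']['LIDAR_TOP']['lidar2ego'])

    # 1st way: rebuild ego2world from the stored yaw and (x, y) translation,
    # then chain with lidar2ego and invert.
    ego2world = np.eye(4)
    ego2world[0:2, 3] = cur_frame['ego_translation'][0:2]
    ego2world[0:3, 0:3] = Quaternion(axis=[0, 0, 1],
                                     radians=cur_frame['ego_yaw']).rotation_matrix
    world2lidar_a = invert_pose(ego2world @ lidar2ego)

    # 2nd way: take the precomputed matrix stored with the frame.
    world2lidar_b = np.asarray(cur_frame['sensors']['LIDAR_TOP']['world2lidar'])

    diff = world2lidar_a - world2lidar_b
    print('translation gap [m]:', np.linalg.norm(diff[:3, 3]))
    return diff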