-
I am using Open3D to create meshes for Mitsuba, which I then render for some scientific visualization. In doing so, I want to transfer the camera properties from Open3D to the Mitsuba camera. I have tried using the view matrix from Open3D to perform the conversion. In some cases, by setting certain components negative, I can center the object; however, after rotating the view it is usually destroyed. The behaviour also seems to change for different meshes depending on where they are centered, as if the camera is not being placed where I asked it to be. I have opened an issue on the Open3D GitHub as well, but I thought I would also ask here in case someone has an idea of a good way to go between these two camera descriptions, or just knows what kind of view matrix I can apply to the Mitsuba sensor.
-
When you set up the scene, i.e. when you import the mesh created by Open3D, you can try taking the Open3D to-world matrix and setting it as the Mitsuba sensor's to_world. If that does not work, the coordinate systems may not match; in that case you can simply matmul the sensor's to_world with a matrix M that starts as the identity and has M[2][2] and M[3][3] set to -1.
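A minimal NumPy sketch of the suggestion above (the index base in the reply is ambiguous, so the axes to flip are left as a parameter for experimenting; 0-based indices here):

```python
import numpy as np

def flipped_to_world(to_world: np.ndarray, flip_axes=(2, 3)) -> np.ndarray:
    """Right-multiply ``to_world`` by an identity matrix with -1 placed
    on the listed diagonal entries (0-based), e.g. M[2][2] and M[3][3]."""
    m = np.eye(4)
    for ax in flip_axes:
        m[ax, ax] = -1.0
    return to_world @ m

# Experiment with different axis combinations, e.g. flipping X and Z:
candidate = flipped_to_world(np.eye(4), flip_axes=(0, 2))
```

Note that flipping entry [3][3] negates the homogeneous scale rather than a spatial axis, so flipping two of the first three diagonal entries may be what is actually needed.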
-
I'm unfamiliar with Open3D. Let me move this over to the discussions; it seems more appropriate for this topic.
-
I managed to get ChatGPT to solve the conversion for me. The solution is below:

```python
# Requires: import numpy as np, import mitsuba as mi
def _update_camera(self, view_matrix: np.ndarray) -> None:
    """
    Update the camera to look at the mesh center.

    Parameters
    ----------
    view_matrix : np.ndarray
        View matrix for the camera from open3d.

    Notes
    -----
    This function updates the camera in the scene_dict.
    It should be called before rendering.
    """
    # Step 1: Invert the view (world-to-camera) matrix to get camera-to-world
    to_world_matrix = np.linalg.inv(view_matrix)
    # Step 2: Adjust the coordinate system by flipping the X- and Z-axes
    z_flip_matrix = np.array(
        [[-1, 0, 0, 0], [0, 1, 0, 0], [0, 0, -1, 0], [0, 0, 0, 1]]
    )
    adjusted_to_world_matrix = np.dot(to_world_matrix, z_flip_matrix)
    self.scene_dict["sensor"]["to_world"] = mi.ScalarTransform4f(
        adjusted_to_world_matrix
    )
```
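For anyone who wants to sanity-check the math outside of a scene class, here is a standalone version of the same conversion. The Open3D extraction is shown only in a comment since it needs a live visualizer; everything else is plain NumPy:

```python
import numpy as np

# The world-to-camera matrix can be read from a live Open3D visualizer, e.g.:
#   params = vis.get_view_control().convert_to_pinhole_camera_parameters()
#   view_matrix = params.extrinsic

def open3d_view_to_mitsuba_to_world(view_matrix: np.ndarray) -> np.ndarray:
    """Invert the Open3D view matrix, then flip the X and Z axes."""
    to_world = np.linalg.inv(view_matrix)
    flip = np.diag([-1.0, 1.0, -1.0, 1.0])
    return to_world @ flip

# Sanity check: an identity view matrix should map to a pure axis flip
result = open3d_view_to_mitsuba_to_world(np.eye(4))
```

The returned matrix can then be wrapped in `mi.ScalarTransform4f` and assigned to the sensor's `to_world`, as in the method above.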