
Conversation


@TarzanZhao TarzanZhao commented Sep 18, 2025

This PR fixes issue #805.

Summary

  1. This PR enables the point transformer v3 battery to process multiple point clouds in a single batch.
  2. It adds multi-order support, including the z, z-trans, hilbert, and hilbert-trans serialization orders.
  3. It supports batch normalization in addition to layer normalization.
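The batching in item 1 typically means flattening variable-size clouds into one array plus bookkeeping indices, so every cloud shares a single forward pass. The sketch below illustrates that idea; `batch_point_clouds` and its offset/batch-index layout are hypothetical, not the PR's actual interface.

```python
import numpy as np

def batch_point_clouds(clouds):
    """Flatten a list of (N_i, C) point clouds into one (sum N_i, C)
    array plus bookkeeping tensors, so a point transformer can process
    several clouds in one forward pass.

    Hypothetical sketch: the PR's real batching interface may differ.
    """
    feats = np.concatenate(clouds, axis=0)  # (sum N_i, C) stacked features
    # Per-point batch index: point j belongs to cloud batch[j].
    batch = np.concatenate(
        [np.full(len(c), i, dtype=np.int64) for i, c in enumerate(clouds)])
    # Cumulative offsets: cloud i occupies feats[offset[i-1]:offset[i]].
    offset = np.cumsum([len(c) for c in clouds])
    return feats, batch, offset
```

The flat layout avoids padding entirely, which matters because point clouds in a batch can differ in size by orders of magnitude.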
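The z order in item 2 is conventionally computed with Morton codes, i.e. interleaving the bits of quantized grid coordinates; the "-trans" variants are commonly obtained by permuting axis roles before interleaving. The following is a minimal sketch of that idea under those assumptions; the PR's exact encoding and permutation are not shown here.

```python
def part1by2(x: int) -> int:
    # Spread the low 10 bits of x so each lands at every third bit position.
    x &= 0x3FF
    x = (x | (x << 16)) & 0x030000FF
    x = (x | (x << 8)) & 0x0300F00F
    x = (x | (x << 4)) & 0x030C30C3
    x = (x | (x << 2)) & 0x09249249
    return x

def z_order(x: int, y: int, z: int) -> int:
    # Morton code: interleave coordinate bits as ... z1 y1 x1 z0 y0 x0.
    return part1by2(x) | (part1by2(y) << 1) | (part1by2(z) << 2)

def z_trans_order(x: int, y: int, z: int) -> int:
    # A "transposed" variant: swap which axis fills which bit lane.
    # (Illustrative; the PR's exact permutation is an assumption.)
    return z_order(y, x, z)
```

Sorting points by such a code gives a locality-preserving 1-D ordering, which is what lets serialized attention see spatially coherent windows.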


linux-foundation-easycla bot commented Sep 18, 2025

CLA Signed

The committers listed above are authorized under a signed CLA.

@TarzanZhao TarzanZhao changed the title [WIP] Implement batched version of point transformer v3 battery Implement batched version of point transformer v3 battery Sep 26, 2025
@TarzanZhao TarzanZhao changed the title Implement batched version of point transformer v3 battery [WIP] Implement batched version of point transformer v3 battery Sep 26, 2025
Hexu Zhao added 2 commits November 7, 2025 19:43
…r different point cloud inputs. (2) add attention softmax scaling

Signed-off-by: Hexu Zhao <[email protected]>
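The second change in the commit message above mentions attention softmax scaling; the standard form divides the query-key logits by sqrt(d) before the softmax so the weights stay well-conditioned as the head dimension grows. A minimal NumPy sketch, with an illustrative function name:

```python
import numpy as np

def scaled_softmax_attention(q, k, v):
    """Dot-product attention with 1/sqrt(d) logit scaling.

    q: (n, d) queries, k: (m, d) keys, v: (m, dv) values.
    Illustrative sketch only; not the PR's actual implementation.
    """
    d = q.shape[-1]
    logits = q @ k.T / np.sqrt(d)                 # scale before softmax
    logits -= logits.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(logits)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v
```

Without the 1/sqrt(d) factor, logit variance grows with d and the softmax saturates toward one-hot weights, killing gradients.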
Contributor

@blackencino blackencino left a comment


LGTM! Let's focus on moving this forward to reproducibility.

@TarzanZhao TarzanZhao merged commit 9e33633 into main Nov 9, 2025
7 checks passed
