
[Transform] [Utils] Support precision, add torch dtype validation #414


Merged
merged 5 commits into main from kylesayrs/transform-precision on Aug 11, 2025

Conversation

kylesayrs (Contributor)

Purpose

  • Support configuring the precision at which transforms are constructed and applied, which has some minor effects on results
  • Add a TorchDtype type annotation for using torch dtypes in model definitions

Changes

  • Added a precision argument to TransformScheme
    • Transform weights are constructed at this precision
    • Transform weights are applied at this precision
      • This precision is used both for fusing operations and for online transforms
  • Added a TorchDtype type annotation in src/utils/type.py
    • Supports parsing from "torch.xxx" or "xxx" strings as well as torch.dtype instances (see the sketch after this list)
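A minimal sketch of how these two pieces might fit together, assuming pydantic v2. The exact code in src/utils/type.py may differ, and the TransformScheme shown here is stripped down to just the new precision field:

```python
from typing import Annotated, Any

import torch
from pydantic import BaseModel, BeforeValidator, ConfigDict, PlainSerializer


def _parse_dtype(value: Any) -> torch.dtype:
    # Accept torch.dtype instances as-is; parse "float32" or "torch.float32" strings
    if isinstance(value, torch.dtype):
        return value
    if isinstance(value, str):
        attr = getattr(torch, value.removeprefix("torch."), None)
        if isinstance(attr, torch.dtype):
            return attr
    raise ValueError(f"cannot interpret {value!r} as a torch.dtype")


# Annotated alias usable as a pydantic field type; round-trips back to a
# string such as "torch.float32" on serialization
TorchDtype = Annotated[
    torch.dtype,
    BeforeValidator(_parse_dtype),
    PlainSerializer(str),
]


class TransformScheme(BaseModel):
    # arbitrary_types_allowed lets pydantic carry raw torch.dtype values
    model_config = ConfigDict(arbitrary_types_allowed=True)

    # precision at which transform weights are constructed and applied
    # (used both when fusing into weights and for online transforms)
    precision: TorchDtype = torch.float32
```

Using a BeforeValidator keeps the coercion logic in one place: any field annotated with TorchDtype accepts either spelling of the string form or a dtype object, and downstream code only ever sees a torch.dtype.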

Testing

  • Added tests for the TorchDtype type annotation (a sketch of such tests follows this list)
  • Tested with different precisions and found torch.float32 to be acceptable
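A hedged sketch of what the TorchDtype tests could look like, reusing the hypothetical TransformScheme from the sketch above; the PR's actual tests may differ:

```python
import pytest
import torch


@pytest.mark.parametrize("value", [torch.float32, "float32", "torch.float32"])
def test_torch_dtype_accepts_equivalent_forms(value):
    # all three spellings should validate to the same torch.dtype singleton
    scheme = TransformScheme(precision=value)
    assert scheme.precision is torch.float32


def test_torch_dtype_rejects_invalid_strings():
    # pydantic's ValidationError subclasses ValueError
    with pytest.raises(ValueError):
        TransformScheme(precision="not_a_dtype")
```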


@dsikka (Collaborator) left a comment:

just a few questions - lgtm otherwise

@brian-dellabetta (Contributor) left a comment:

Thanks for the updates to merged precision!

kylesayrs merged commit 131673e into main on Aug 11, 2025
1 check passed
kylesayrs deleted the kylesayrs/transform-precision branch on August 11, 2025 at 15:18
3 participants