
Conversation

kinjalpatel27
Contributor

What does this PR do?

Type of change: ?

Overview: ?

Usage

# Add a code snippet demonstrating how to use this

Testing

Before your PR is "Ready for review"

  • Make sure you read and follow Contributor guidelines and your commits are signed.
  • Is this change backward compatible?: Yes
  • Did you write any new necessary tests?: Yes
  • Did you add or update any necessary documentation?: -
  • Did you update Changelog?: Not yet

Additional Information


copy-pr-bot bot commented Oct 7, 2025

Auto-sync is disabled for draft pull requests in this repository. Workflows must be run manually.

Contributors can view more details about this message here.


coderabbitai bot commented Oct 7, 2025

Important

Review skipped

Draft detected.

Please check the settings in the CodeRabbit UI or the .coderabbit.yaml file in this repository. To trigger a single review, invoke the @coderabbitai review command.

You can disable this status message by setting reviews.review_status to false in the CodeRabbit configuration file.



codecov bot commented Oct 7, 2025

Codecov Report

❌ Patch coverage is 75.00000% with 4 lines in your changes missing coverage. Please review.
✅ Project coverage is 73.77%. Comparing base (cb44c55) to head (22bfe0e).
⚠️ Report is 6 commits behind head on main.

Files with missing lines                       Patch %   Lines
modelopt/torch/quantization/model_calib.py     75.00%    3 Missing ⚠️
modelopt/torch/utils/distributed.py            75.00%    1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #403      +/-   ##
==========================================
- Coverage   73.79%   73.77%   -0.03%     
==========================================
  Files         171      171              
  Lines       17591    17603      +12     
==========================================
+ Hits        12982    12986       +4     
- Misses       4609     4617       +8     

☔ View full report in Codecov by Sentry.

Comment on lines +93 to +95
if parallel_state.expert_tensor_parallel_group is not None:
    quantizer.sync_amax_across_distributed_group(
        parallel_state.expert_tensor_parallel_group
    )
Contributor

Tensor parallel sync is not handled correctly here across the various cases. See the comments before sync_quantizer_amax_across_tp for more details.
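
For context, a minimal sketch of what amax synchronization across a distributed group typically looks like in PyTorch: an element-wise all-reduce with MAX, so every rank in the group ends up quantizing with the same (largest observed) scale. The helper name, the quantizer.amax attribute, and the group handling below are illustrative assumptions, not modelopt's actual implementation.

# Hypothetical sketch -- not modelopt's actual implementation.
import torch
import torch.distributed as dist

def sync_amax_across_group(quantizer, group=None):
    """All-reduce quantizer.amax with MAX over `group` (assumed attribute/API)."""
    if group is None or quantizer.amax is None:
        return  # nothing to sync
    amax = quantizer.amax.detach().clone()
    dist.all_reduce(amax, op=dist.ReduceOp.MAX, group=group)
    quantizer.amax = amax

Taking MAX rather than an average is the conservative choice: no rank ends up with a scale smaller than an activation value some other rank actually observed, so nothing clips after the sync. The subtlety the review points at is choosing which process group(s) to reduce over for the different expert/tensor parallel cases.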

kinjalpatel27 force-pushed the kinjal/grouped_linear branch from 963657d to 22bfe0e on October 7, 2025 at 22:50