Allow per-group quantizers in QuantOptimizer, fix state_dict #2743

Merged: 7 commits into main on Aug 13, 2025

Conversation

@lisjin (Contributor) commented on Aug 12, 2025

Changes to make torchao's version of PARQ compatible with the current QAT pipeline:

  1. Bug fix: when using FSDP2, the QuantOptimizer state cannot be loaded correctly from a checkpoint because it contains an extra "qat_state" key. Fix this by removing qat_state entirely and having the user set optimizer.num_steps manually after loading a checkpoint (see the first sketch after this list).
  2. New feature: allow param_groups to be quantized using different bit-widths and/or quantization methods. If the user passes a "quant_cls" key and an optional "quant_kwargs" key in a param_group, QuantOptimizer overrides the default self.quantizer object with one initialized from those values (see the second sketch after this list).
  3. New feature: make PARQ compatible with the new QAT API by relaxing the constraints in QATConfig. Previously it required that base_config or weight_config be provided during the prepare step; now it also accepts a config where only activation_config is provided (see the third sketch after this list). This lets us migrate away from the deprecated IntXQuantizationAwareTrainingConfig classes.
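
A minimal sketch of the fixed checkpoint flow, assuming the prototype import paths torchao.prototype.parq.optim and torchao.prototype.parq.quant; the model, param_groups, and checkpoint layout are illustrative:

```python
import torch

from torchao.prototype.parq.optim import ProxHardQuant, QuantOptimizer
from torchao.prototype.parq.quant import UnifQuantizer

model = torch.nn.Linear(16, 16)
param_groups = [{"params": model.parameters(), "quant_bits": 2}]
base_optimizer = torch.optim.SGD(param_groups, lr=0.1)
optimizer = QuantOptimizer(base_optimizer, UnifQuantizer(), ProxHardQuant())

ckpt = torch.load("checkpoint.pt")  # hypothetical checkpoint layout
optimizer.load_state_dict(ckpt["optimizer"])  # no extra "qat_state" key anymore

# qat_state is gone, so the step counter must be restored by hand
optimizer.num_steps = ckpt["num_steps"]  # counter saved by the training loop
```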
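A sketch of per-group quantizer overrides via the new "quant_cls" and "quant_kwargs" keys. Passing the class object under "quant_cls" and the specific StretchedUnifTorchaoQuantizer kwargs are assumptions made for illustration:

```python
import torch

from torchao.prototype.parq.optim import ProxHardQuant, QuantOptimizer
from torchao.prototype.parq.quant import StretchedUnifTorchaoQuantizer, UnifQuantizer

linear1, linear2 = torch.nn.Linear(16, 16), torch.nn.Linear(16, 16)
param_groups = [
    # no "quant_cls": this group uses the default quantizer given to QuantOptimizer
    {"params": linear1.parameters(), "quant_bits": 4},
    # per-group override: QuantOptimizer builds quant_cls(**quant_kwargs) instead
    {
        "params": linear2.parameters(),
        "quant_bits": 2,
        "quant_cls": StretchedUnifTorchaoQuantizer,  # assumed to accept the class itself
        "quant_kwargs": {"b": 2},  # hypothetical kwargs
    },
]
base_optimizer = torch.optim.SGD(param_groups, lr=0.1)
optimizer = QuantOptimizer(base_optimizer, UnifQuantizer(), ProxHardQuant())
```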
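And a sketch of the relaxed QATConfig, assuming the current torchao.quantization.qat API; the IntxFakeQuantizeConfig settings are illustrative:

```python
import torch

from torchao.quantization import quantize_
from torchao.quantization.qat import IntxFakeQuantizeConfig, QATConfig

model = torch.nn.Sequential(torch.nn.Linear(16, 16))

# Activation-only fake quantization: QATConfig previously rejected this unless
# base_config or weight_config was also provided
activation_config = IntxFakeQuantizeConfig(torch.int8, "per_token", is_symmetric=False)
qat_config = QATConfig(activation_config=activation_config, step="prepare")
quantize_(model, qat_config)
```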

@lisjin lisjin requested review from andrewor14 and metascroy August 12, 2025 14:13
@lisjin lisjin added the "topic: improvement" label on Aug 12, 2025
pytorch-bot bot commented Aug 12, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/ao/2743

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit bed54ac with merge base 46ba24c:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@meta-cla meta-cla bot added the "CLA Signed" label on Aug 12, 2025
@lisjin lisjin force-pushed the lvj branch 2 times, most recently from 29281fd to 57e29ee, on August 13, 2025 at 15:06
@andrewor14 (Contributor) left a comment

QAT changes look good to me. Thanks!

@andrewor14 (Contributor) commented

Please also update the PR description to reflect the latest design.

@lisjin lisjin merged commit d86ae25 into main Aug 13, 2025
18 checks passed
@lisjin lisjin deleted the lvj branch August 13, 2025 22:44