Update SpinQuant quantization options to be general-purpose pre-quantization #5797
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/5797
Note: Links to docs will display an error until the docs builds have been completed.
✅ No failures as of commit 3dc62c6 with merge base 6923ae5.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
This pull request was exported from Phabricator. Differential Revision: D63708762
…tization (pytorch#5797)

Summary: Pull Request resolved: pytorch#5797. We decided to use the same quantization scheme and checkpoint format for QAT + LoRA. This PR updates the related quantization CLI options to be general-purpose for pre-quantized checkpoints.

Differential Revision: D63708762
Force-pushed from 513ea46 to 3d9e887
Force-pushed from 3d9e887 to 0ef6ec2
Force-pushed from 0ef6ec2 to 3dc62c6
This pull request has been merged in c48d867.
…tization (#5797)

Summary: Pull Request resolved: #5797. We decided to use the same quantization scheme and checkpoint format for QAT + LoRA. This PR updates the related quantization CLI options to be general-purpose for pre-quantized checkpoints.

Reviewed By: mergennachin

Differential Revision: D63708762

fbshipit-source-id: 08d862d3331868616ff8dae5c5d86a2d36f26cd7
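The change described in the commit message — replacing a SpinQuant-specific CLI flag with a general pre-quantization flag shared by SpinQuant and QAT + LoRA checkpoints — can be sketched with `argparse`. This is an illustrative sketch only: the option names (`--preq_mode`, `--preq_group_size`) and their choices are assumptions for illustration, not necessarily the exact flags in the repository.

```python
# Hypothetical sketch of generalizing a scheme-specific CLI flag into a
# general pre-quantization flag; all names here are illustrative assumptions.
import argparse


def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(
        description="Export a model from a pre-quantized checkpoint"
    )
    # Before: a scheme-specific flag (e.g. something like --use_spin_quant).
    # After: one general flag covering any pre-quantized checkpoint, since
    # SpinQuant and QAT + LoRA share the same scheme and checkpoint format.
    parser.add_argument(
        "--preq_mode",
        type=str,
        default=None,
        choices=["8da4w", "8da4w_output_8da8w"],
        help="Quantization mode of the pre-quantized checkpoint "
        "(omit if the checkpoint is not pre-quantized).",
    )
    parser.add_argument(
        "--preq_group_size",
        type=int,
        default=32,
        help="Group size used when the checkpoint was pre-quantized.",
    )
    return parser


if __name__ == "__main__":
    # Parse an explicit argument list so the sketch is self-contained.
    args = build_parser().parse_args(["--preq_mode", "8da4w", "--preq_group_size", "64"])
    print(args.preq_mode, args.preq_group_size)
```

Keeping one mode flag plus scheme-agnostic parameters means new pre-quantization recipes that emit the same checkpoint format need no new CLI surface.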