Commit 5e4d8dc
Fix INT32 bias overflow in QOperator INT8 symmetric quantization by adjusting weight scale and requantizing (microsoft#25278)
### Overview
This PR introduces a critical fix for **QOperator INT8 symmetric
quantization** in ONNX Runtime. It addresses a situation where the
computed **bias scale** (`input_scale * weight_scale`) becomes too
small, leading to **int32 overflow** or **precision clipping** during
bias quantization.
### Problem
In symmetric quantization (i.e., zero_point = 0), the bias tensor is
quantized using a fixed-point scale:
**bias_scale = input_scale * weight_scale**
When this scale is too small, the quantized bias values can exceed the
`int32` range, causing saturation or significant quantization error.
This was observed to cause **>51% accuracy loss** in some models.
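To make the failure mode concrete, here is a small illustrative
computation (the magnitudes are hypothetical, not taken from the PR):

```python
import numpy as np

# Hypothetical magnitudes chosen to illustrate the overflow; not from the PR.
input_scale = 1e-5
weight_scale = 1e-5
bias_scale = input_scale * weight_scale    # 1e-10

bias_float = 0.5
q_bias = bias_float / bias_scale           # 5e9, but int32 max is 2147483647

print(q_bias > np.iinfo(np.int32).max)     # True -> the value saturates on cast
```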
### Solution
This PR adds two new functions to mitigate this:
---
#### 🔧 `_adjust_weight_scale_for_int32_bias(...)`
Located in `onnx_quantizer.py`, this function:
- **Inspects the float bias range** to compute the smallest valid bias
scale (based on int32 dynamic range)
- **Compares** this threshold against `input_scale * weight_scale`
- If it is too small, **scales up the weight scale** accordingly to prevent
overflow
- Supports both per-tensor and per-channel weight quantization cases
This logic is **only triggered when**:
- The weight's zero point is exactly zero (i.e., symmetric quantization)
- The weight data type is `INT8` or `INT16`
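Here is a minimal per-tensor sketch of the adjustment idea. The name,
signature, and threshold math are assumptions for illustration; the real
function also handles per-channel scales:

```python
import numpy as np

INT32_MAX = np.iinfo(np.int32).max  # 2147483647

def adjust_weight_scale_for_int32_bias(input_scale, weight_scale, bias_float):
    """Sketch: widen weight_scale so that bias / (input_scale * weight_scale)
    stays within the int32 dynamic range."""
    max_abs_bias = float(np.max(np.abs(bias_float)))
    if max_abs_bias == 0.0:
        return weight_scale  # a zero bias cannot overflow

    # Smallest bias scale for which the largest bias value still fits in int32.
    min_valid_bias_scale = max_abs_bias / INT32_MAX
    candidate_bias_scale = input_scale * weight_scale

    if candidate_bias_scale < min_valid_bias_scale:
        # Scale the weight scale up by exactly the missing ratio.
        weight_scale *= min_valid_bias_scale / candidate_bias_scale
    return weight_scale
```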
---
#### 🔄 `_requantize_weight(...)`
After weight scale adjustment, this function:
- **Finds the original quantized weight** (`q_weight`), scale, and zero
point from the initializer list
- **Removes** the outdated quantized weight and scale
- **Re-quantizes** the original float weights using the new scale and
the same zero point
- **Re-inserts** them into the model to maintain consistency
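A corresponding sketch of the re-quantization step, again with a
hypothetical per-tensor symmetric INT8 signature:

```python
import numpy as np

def requantize_weight(weight_float, new_scale, zero_point=0, qmin=-127, qmax=127):
    """Sketch: re-quantize the original float weights with the adjusted scale,
    keeping the same (symmetric) zero point."""
    q = np.round(weight_float / new_scale) + zero_point
    return np.clip(q, qmin, qmax).astype(np.int8)
```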
---
### Summary of Benefits
- ✅ Prevents int32 overflow or saturation during symmetric bias
quantization
- ✅ Ensures weight and bias quantization remain consistent
- ✅ Reduces quantization error from >51.4% to ~3% in test models
- ✅ Fix is limited in scope to QOperator + symmetric INT8/INT16 flow
(safe for other modes)
- ✅ Improves robustness of static quantization for hardware that
performs integer-only inference
---
### Code Location
- `onnxruntime/python/tools/quantization/onnx_quantizer.py`
- `def _adjust_weight_scale_for_int32_bias(...)`
- `def _requantize_weight(...)`
- Integrated in `quantize_bias_static(...)`
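
For orientation, a hedged sketch of how the two helpers could slot into the
bias-quantization flow, reusing the sketch functions above (simplified; not
the actual `quantize_bias_static` code):

```python
import numpy as np

def quantize_bias_static_sketch(bias_float, weight_float, input_scale, weight_scale):
    # 1) Widen the weight scale if the bias would otherwise overflow int32.
    new_scale = adjust_weight_scale_for_int32_bias(input_scale, weight_scale, bias_float)

    # 2) If the scale changed, re-quantize the weights so they stay consistent.
    if new_scale != weight_scale:
        q_weight = requantize_weight(weight_float, new_scale)
        # (q_weight would replace the stale initializer in the model)
        weight_scale = new_scale

    # 3) Quantize the bias with the (possibly adjusted) scale.
    bias_scale = input_scale * weight_scale
    q_bias = np.round(bias_float / bias_scale)
    info = np.iinfo(np.int32)
    return np.clip(q_bias, info.min, info.max).astype(np.int32)
```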
---
Please let me know if you'd like additional test coverage or integration
points. Thanks!
---
### Files changed
2 files changed: +260 additions, −0 deletions
- `onnxruntime/python/tools/quantization/onnx_quantizer.py`
- `onnxruntime/test/python/quantization/` (new test file, +105 lines)