
Commit ed7e20b

Merge remote-tracking branch 'origin' into kylesayrs/transform-attention-head

2 parents: df74532 + b163bd9

File tree: 1 file changed (+1, -1 lines)

src/compressed_tensors/quantization/lifecycle/initialize.py (1 addition, 1 deletion)

@@ -189,7 +189,7 @@ def _initialize_scale_zero_point(
     else:
         # TODO: consider erroring out in the future as if the dtype if not one of these,
         # there is likely bug
-        if scale_dtype not in [torch.float16, torch.bfloat16, torch.float32]:
+        if scale_dtype not in [torch.float16, torch.bfloat16, torch.float32, torch.float64]:
             scale_dtype = torch.float16
         zp_dtype = quantization_args.pytorch_dtype()
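The effect of the one-line change above can be sketched in isolation: before the patch, a `torch.float64` scale dtype fell through the guard and was silently downcast to `torch.float16`; after the patch it is preserved. The helper `coerce_scale_dtype` below is hypothetical (not part of the `compressed_tensors` API) and mirrors only the patched check, assuming the surrounding function behaves as the diff context suggests.

```python
import torch

# Supported scale dtypes after the patch; torch.float64 is the new entry.
SUPPORTED_SCALE_DTYPES = [
    torch.float16,
    torch.bfloat16,
    torch.float32,
    torch.float64,
]


def coerce_scale_dtype(scale_dtype: torch.dtype) -> torch.dtype:
    """Hypothetical standalone version of the guard in _initialize_scale_zero_point.

    Unsupported scale dtypes fall back to float16, matching the diff context.
    """
    if scale_dtype not in SUPPORTED_SCALE_DTYPES:
        scale_dtype = torch.float16
    return scale_dtype
```

With this sketch, `coerce_scale_dtype(torch.float64)` now returns `torch.float64`, whereas the pre-patch list would have coerced it to `torch.float16`; an unrelated dtype such as `torch.int8` still falls back to `torch.float16`.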

0 commit comments