Commit 9b3214a

Update comment
1 parent 9fdecf5 commit 9b3214a

File tree

1 file changed: +1 −1 lines changed


vllm/platforms/rocm.py

Lines changed: 1 addition & 1 deletion
@@ -200,7 +200,7 @@ class RocmPlatform(Platform):
         "petit_nvfp4",
         "torchao",
     ]
-    # bitsandbytes is not supported on GPUs with warp size 64 (gfx9)
+    # bitsandbytes quantization not supported on Instinct (warp size 64 limitation)
     if not on_gfx9():
         supported_quantization += ["bitsandbytes"]
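
For context, below is a minimal, self-contained sketch of the gating logic the updated comment documents. The names on_gfx9 and supported_quantization come from the diff; the body of on_gfx9 (probing gcnArchName via torch) is an illustrative assumption, not necessarily how vllm/platforms/rocm.py implements it.

# Sketch under assumptions: exclude bitsandbytes on gfx9-class (warp size 64) GPUs.
import torch


def on_gfx9() -> bool:
    # Assumption: on a ROCm build of PyTorch, device properties expose the GCN
    # arch string (e.g. "gfx90a", "gfx942"); gfx9-class Instinct parts use a
    # 64-wide wavefront, which is what bitsandbytes does not support.
    arch = torch.cuda.get_device_properties(0).gcnArchName
    return arch.split(":")[0].startswith("gfx9")


supported_quantization = [
    # ... other methods elided ...
    "petit_nvfp4",
    "torchao",
]

# bitsandbytes quantization not supported on Instinct (warp size 64 limitation)
if not on_gfx9():
    supported_quantization += ["bitsandbytes"]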
