
Commit 7136cc3

Remove duplicate triton dependency
1 parent d0187f3 commit 7136cc3

File tree

1 file changed: +0 −1 lines changed


requirements-cuda.txt

Lines changed: 0 additions & 1 deletion
@@ -13,4 +13,3 @@ bitsandbytes==0.46.0 # bitsandbytes for 8-bit optimizers and weight quantization
 https://github.com/zzlol63/flash-attention-prebuild-wheels/releases/download/v0.2/flash_attn-2.8.2+cu128torch2.8-cp310-cp310-win_amd64.whl; sys_platform == "win32" and python_version == "3.10"
 https://github.com/zzlol63/flash-attention-prebuild-wheels/releases/download/v0.2/flash_attn-2.8.2+cu128torch2.8-cp311-cp311-win_amd64.whl; sys_platform == "win32" and python_version == "3.11"
 https://github.com/zzlol63/flash-attention-prebuild-wheels/releases/download/v0.2/flash_attn-2.8.2+cu128torch2.8-cp312-cp312-win_amd64.whl; sys_platform == "win32" and python_version == "3.12"
-triton-windows==3.4.0.post20; sys_platform == "win32"
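
For context, the removed line was a second occurrence of the triton-windows pin in requirements-cuda.txt; the file otherwise mixes plain "name==version" pins and direct wheel URLs with PEP 508 environment markers after the semicolon. Below is a minimal, hypothetical Python sketch (not part of this repository) that scans a requirements file and reports package names listed more than once, the kind of duplication this commit removes. The function names and the simple string parsing are assumptions; a more robust version would parse each line with packaging.requirements.Requirement.

# Hypothetical helper, not part of this repo: report duplicate requirement names.
from collections import Counter
from pathlib import Path

def requirement_names(path: str) -> list[str]:
    names = []
    for raw in Path(path).read_text().splitlines():
        line = raw.split("#", 1)[0].strip()       # drop comments and blank lines
        if not line:
            continue
        spec = line.split(";", 1)[0].strip()      # drop the environment marker, if any
        if spec.startswith(("http://", "https://")):
            continue                              # direct URL requirements: skip name extraction
        for sep in ("==", ">=", "<=", "~=", "!=", ">", "<"):
            if sep in spec:                       # keep only the text before the version comparator
                spec = spec.split(sep, 1)[0]
                break
        names.append(spec.strip().lower())
    return names

def find_duplicates(path: str) -> list[str]:
    counts = Counter(requirement_names(path))
    return [name for name, n in counts.items() if n > 1]

if __name__ == "__main__":
    # On the pre-commit file this would print ['triton-windows']; after the commit it prints [].
    print(find_duplicates("requirements-cuda.txt"))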
