1 parent 4b4fc96 commit 696936c
README.md
@@ -76,10 +76,9 @@ The numbers reported here were gathered using:
 
 To install deps:
 ```
-pip uninstall diffusers -y && pip install git+https://github.com/huggingface/diffusers@b272807bc898a314cde536c1d7d1e43592af1fce
+pip install -U diffusers
 pip install --pre torch==2.8.0.dev20250605+cu126 --index-url https://download.pytorch.org/whl/nightly/cu126
 pip install --pre torchao==0.12.0.dev20250609+cu126 --index-url https://download.pytorch.org/whl/nightly/cu126
-pip install diffusers==0.33.1
 ```
 
 To install flash attention v3, follow the instructions in https://github.com/Dao-AILab/flash-attention#flashattention-3-beta-release.
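After running the updated install commands, a quick sanity check like the following can confirm the environment (a minimal sketch, not part of the commit; the exact version strings depend on which nightly wheels were installed):

```
# Illustrative sanity check for the install steps above: print installed
# versions and confirm a CUDA device is visible.
from importlib.metadata import version

import torch

for pkg in ("torch", "torchao", "diffusers"):
    # importlib.metadata works for any pip-installed distribution.
    print(f"{pkg}: {version(pkg)}")

# The torch/torchao nightlies above are cu126 builds, so a GPU is expected.
print("CUDA available:", torch.cuda.is_available())
```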