README.MD: 5 additions & 7 deletions
@@ -1,4 +1,3 @@
-```markdown
 # Flash Attention Windows Wheels (Python 3.10)

 Pre-built Windows wheels for [Flash-Attention 2](https://github.com/Dao-AILab/flash-attention) - The state-of-the-art efficient attention implementation for NVIDIA GPUs.
@@ -44,7 +43,7 @@ Note: These wheels are community-maintained and are not officially supported by
 ## Quick Installation

-```bash
+```sh
 # Simply download the wheel file and install with: