
Commit 8e3709a

build(deps): bump flash-attn from 2.6.3 to 2.8.2
Bumps [flash-attn](https://github.com/Dao-AILab/flash-attention) from 2.6.3 to 2.8.2.
- [Release notes](https://github.com/Dao-AILab/flash-attention/releases)
- [Commits](Dao-AILab/flash-attention@v2.6.3...v2.8.2)

---
updated-dependencies:
- dependency-name: flash-attn
  dependency-version: 2.8.2
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
1 parent 516f857

1 file changed (+1, -1)

pyproject.toml

Lines changed: 1 addition & 1 deletion
@@ -63,7 +63,7 @@ dev = [
 flash_attn = [
     # it's easier to install flash-attn from wheel rather than like this as extra
     # "https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.6/flash_attn-2.5.6+cu118torch2.0cxx11abiFALSE-cp311-cp311-linux_x86_64.whl",
-    "flash-attn==2.6.3",
+    "flash-attn==2.8.2",
     "packaging", # FIXME: temporary, until https://github.com/Dao-AILab/flash-attention/pull/937 is released
     "ninja"
 ]
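
The comment in the hunk notes that installing flash-attn from a prebuilt wheel is often easier than building it from source as an extra (the source build is presumably why "packaging" and "ninja" appear in this group). A minimal sketch of what pinning a wheel directly could look like, reusing the v2.5.6 wheel URL already present in the comment; this is not part of the change, and a wheel for 2.8.2 would need to match the target CUDA, torch, Python, and ABI tags in the same naming pattern:

flash_attn = [
    # PEP 508 direct reference ("name @ url") rather than a bare URL string;
    # pip resolves flash-attn to this exact wheel and skips compilation
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.6/flash_attn-2.5.6+cu118torch2.0cxx11abiFALSE-cp311-cp311-linux_x86_64.whl",
]

The trade-off is that a direct wheel reference hard-codes one platform and CUDA/torch combination, whereas the "flash-attn==2.8.2" pin lets pip pick a compatible distribution at install time.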
