
Commit 605bf88

fix ci; specify flash-attn version
1 parent 8be7ea8 commit 605bf88

File tree

1 file changed (+4, -4)


.github/workflows/run_chatgpt_examples.yml

Lines changed: 4 additions & 4 deletions
@@ -33,6 +33,10 @@ jobs:
       run: |
         pip install --no-cache-dir -v -e .

+    - name: Install flash-attention
+      run: |
+        pip install https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.5cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
+        # https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.5cxx11abiFALSE-cp310-cp310-linux_x86_64.whl

     - name: Install ChatGPT
       env:
         CFLAGS: "-O1"
@@ -43,10 +47,6 @@ jobs:
         pip install --no-cache-dir -v .
         pip install --no-cache-dir -r examples/requirements.txt

-      # - name: Install flash-attention
-      #   run: |
-      #     pip install flash-attn==2.7.4.post1 --no-build-isolation
-
       # - name: Install Transformers
       #   run: |
       #     pip install --no-cache-dir transformers==4.36.2
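The pinned wheel filename encodes the build matrix the runner must match: `cu12` (CUDA 12), `torch2.5`, `cxx11abiFALSE`, `cp310` (CPython 3.10), and `linux_x86_64`. Below is a minimal sketch of how such a URL could be assembled from those components, so the pin is easy to update when the CI image changes. The local-version tag layout is taken from the asset names on the v2.7.4.post1 release; other flash-attention releases may name their wheels differently, and the helper function here is hypothetical, not part of any library.

```python
def flash_attn_wheel_url(version="2.7.4.post1", cuda_major=12,
                         torch_version="2.5", cxx11_abi=False,
                         python_tag="cp310", platform="linux_x86_64"):
    """Build a flash-attention release-wheel URL from build-matrix parts.

    Assumes the asset naming scheme used by the v2.7.4.post1 release:
    flash_attn-<ver>+cu<cuda>torch<torch>cxx11abi<TRUE|FALSE>-<py>-<py>-<plat>.whl
    """
    abi = "TRUE" if cxx11_abi else "FALSE"
    wheel = (f"flash_attn-{version}+cu{cuda_major}torch{torch_version}"
             f"cxx11abi{abi}-{python_tag}-{python_tag}-{platform}.whl")
    base = "https://github.com/Dao-AILab/flash-attention/releases/download"
    return f"{base}/v{version}/{wheel}"

# Reproduces the URL pinned in the workflow above.
print(flash_attn_wheel_url())
```

When the CI image bumps torch or Python, changing one argument regenerates the matching wheel URL instead of hand-editing the long string in the workflow.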

0 commit comments
