Hi, I followed the installation instructions by running `pip3 install --no-build-isolation axolotl[flash-attn,deepspeed]`.
However, I got the following errors:
```
creating build/temp.linux-x86_64-cpython-311/csrc/flash_attn/src
g++ -pthread -B /apps/all/Anaconda3/2024.02-1/compiler_compat -DNDEBUG -fwrapv -O2 -Wall -fPIC -O2 -isystem /apps/all/Anaconda3/2024.02-1/include -fPIC -O2 -isystem /apps/all/Anaconda3/2024.02-1/include -fPIC -I/tmp/pip-install-9yb5i8_p/flash-attn_8dbb7850761f4b1bb3651d19640b6940/csrc/flash_attn -I/tmp/pip-install-9yb5i8_p/flash-attn_8dbb7850761f4b1bb3651d19640b6940/csrc/flash_attn/src -I/tmp/pip-install-9yb5i8_p/flash-attn_8dbb7850761f4b1bb3651d19640b6940/csrc/cutlass/include -I/home/it4i-chang505/.local/lib/python3.11/site-packages/torch/include -I/home/it4i-chang505/.local/lib/python3.11/site-packages/torch/include/torch/csrc/api/include -I/usr/local/cuda/include -I/apps/all/Anaconda3/2024.02-1/include/python3.11 -c csrc/flash_attn/flash_api.cpp -o build/temp.linux-x86_64-cpython-311/csrc/flash_attn/flash_api.o -O3 -std=c++17 -DTORCH_API_INCLUDE_EXTENSION_H -DPYBIND11_COMPILER_TYPE="_gcc" -DPYBIND11_STDLIB="_libstdcpp" -DPYBIND11_BUILD_ABI="_cxxabi1016" -DTORCH_EXTENSION_NAME=flash_attn_2_cuda -D_GLIBCXX_USE_CXX11_ABI=1
In file included from /home/it4i-chang505/.local/lib/python3.11/site-packages/torch/include/ATen/core/TensorBase.h:14,
                 from /home/it4i-chang505/.local/lib/python3.11/site-packages/torch/include/ATen/core/TensorBody.h:38,
                 from /home/it4i-chang505/.local/lib/python3.11/site-packages/torch/include/ATen/core/Tensor.h:3,
                 from /home/it4i-chang505/.local/lib/python3.11/site-packages/torch/include/torch/csrc/utils/variadic.h:3,
                 from /home/it4i-chang505/.local/lib/python3.11/site-packages/torch/include/torch/csrc/api/include/torch/detail/static.h:3,
                 from /home/it4i-chang505/.local/lib/python3.11/site-packages/torch/include/torch/csrc/api/include/torch/python.h:3,
                 from csrc/flash_attn/flash_api.cpp:6:
/home/it4i-chang505/.local/lib/python3.11/site-packages/torch/include/c10/util/C++17.h:13:2: error: #error "You're trying to build PyTorch with a too old version of GCC. We need GCC 9 or later."
 #error
 ^~~~~
error: command '/usr/bin/g++' failed with exit code 1
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for flash-attn
Running setup.py clean for flash-attn
Building wheel for axolotl (pyproject.toml) ... done
Created wheel for axolotl: filename=axolotl-0.9.1-py3-none-any.whl size=411751 sha256=43b0a585e9da6304d8857c4c9bc23ef3d1983c688de6256a6e9de767bd287f69
Stored in directory: /home/it4i-chang505/.cache/pip/wheels/74/91/25/05a35baf6dfc358b6ac11eb88a7c9b9a0177ae8f3e59980a77
Successfully built axolotl
Failed to build flash-attn
ERROR: Could not build wheels for flash-attn, which is required to install pyproject.toml-based projects
```
Can someone help, please? I am using 2× NVIDIA A100 GPUs (40 GB VRAM each), CUDA 12.8, on Ubuntu Linux.