1 parent e7d66ed commit a27bd7d
llm-complete-guide/README.md
@@ -43,7 +43,8 @@ environment and install the dependencies using the following command:
 pip install -r requirements.txt
 ```
 
-blah blah if it fails FLASH_ATTENTION_SKIP_CUDA_BUILD=TRUE pip install flash-attn --no-build-isolation
+Depending on your setup, you may run into issues when running the pip install command with the
+`flash_attn` package. In that case, running `FLASH_ATTENTION_SKIP_CUDA_BUILD=TRUE pip install flash-attn --no-build-isolation` may help.
 
 In order to use the default LLM for this query, you'll need an account and an
 API key from OpenAI specified as another environment variable:
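The API key mentioned in the diff context above is supplied via an environment variable. A minimal sketch of setting it, assuming the conventional `OPENAI_API_KEY` name (the exact variable name the guide expects is not shown in this diff):

```shell
# Hypothetical sketch: OPENAI_API_KEY is the conventional name used by
# OpenAI client libraries; the guide's exact variable name is not shown here.
export OPENAI_API_KEY="sk-your-key-here"

# Confirm the variable is set before running the pipeline.
echo "${OPENAI_API_KEY:+OPENAI_API_KEY is set}"
```

Exporting (rather than plain assignment) makes the variable visible to child processes such as the Python interpreter running the pipeline.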