
Commit cf1f9fb

docs: add flash debug steps to docs (foundation-model-stack#510)
Signed-off-by: Mehant Kammakomati <[email protected]>
Co-authored-by: Dushyant Behl <[email protected]>
1 parent e3f0cc1 commit cf1f9fb

File tree

1 file changed (+8 −0 lines)


README.md

Lines changed: 8 additions & 0 deletions
@@ -46,6 +46,14 @@ pip install fms-hf-tuning[flash-attn]
 ```
 [FlashAttention](https://github.com/Dao-AILab/flash-attention) requires the [CUDA Toolkit](https://developer.nvidia.com/cuda-toolkit) to be pre-installed.

+*Debug recommendation:* If you encounter flash-attn errors such as `undefined symbol` while training, follow the steps below for a clean installation of the flash binaries. This can happen when multiple environments share the pip cache directory, or after the torch version is updated.
+
+```
+pip uninstall flash-attn
+pip cache purge
+pip install fms-hf-tuning[flash-attn]
+```
+
 ### Using FMS-Acceleration

 If you wish to use [fms-acceleration](https://github.com/foundation-model-stack/fms-acceleration), you need to install it.
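As a quick sanity check after the clean reinstall above (a hedged sketch, not part of this commit; it assumes `nvcc` is on your PATH and that the `flash_attn` package exposes a `__version__` attribute):

```
# Confirm the CUDA Toolkit that flash-attn builds against is installed.
nvcc --version

# Import the reinstalled package; a clean import with no `undefined symbol`
# error indicates the rebuilt binaries match the current torch install.
python -c "import flash_attn; print(flash_attn.__version__)"
```

If the import still fails, the torch and flash-attn builds are likely still mismatched, and repeating the purge-and-reinstall inside the affected environment is the next step per the commit's note.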
