1 parent 55f2376 commit ac4219e
README.md
@@ -28,10 +28,9 @@ Install the latest version of fla
 pip uninstall flash-linear-attention && pip install -U --no-use-pep517 git+https://github.com/fla-org/flash-linear-attention
 ```
 
-`flame` manages minimal dependencies, only including `lm-evaluation-harness` and `torchtitan` as submodules.
-After installation, initialize and update the submodules:
-```sh
-git submodule update --init --recursive
+[Important] Install a specific version of `torchtitan`:
+```
+pip install git+https://github.com/pytorch/torchtitan.git@5e2033c
 
 
 torchtitan