Commit ebb9866
3 files changed (+2, -2 lines)
Submodule flash-linear-attention updated: 27 files
- README.md (+34, -22)
- benchmarks/ops/benchmark_retention.py (+9, -29)
- examples/training.md (+34, -23)
- fla/layers/abc.py (+11, -3)
- fla/layers/attn.py (+4, -3)
- fla/layers/delta_net.py (+2, -2)
- fla/layers/gated_deltanet.py (+23, -12)
- fla/layers/gla.py (+11, -3)
- fla/layers/hgrn.py (+6, -2)
- fla/layers/lightnet.py (+12, -6)
- fla/layers/multiscale_retention.py (+11, -3)
- fla/layers/simple_gla.py (+11, -3)
- fla/models/nsa/modeling_nsa.py (+1, -2)
- fla/modules/__init__.py (+4, -1)
- fla/modules/fused_norm_gate.py (+379, -201)
- fla/ops/__init__.py (+33, -9)
- fla/ops/abc/chunk.py (+117, -208)
- fla/ops/based/fused_chunk.py (+28, -44)
- fla/ops/based/parallel.py (+78, -77)
- fla/ops/delta_rule/parallel.py (+53, -61)
- fla/ops/linear_attn/fused_chunk.py (+15, -33)
- fla/ops/rebased/naive.py (+15, -37)
- fla/ops/simple_gla/parallel.py (+5, -2)
- fla/ops/utils/logcumsumexp.py (+2, -5)
- pyproject.toml (+1, -1)
- setup.py (+1, -1)
- tests/modules/test_layernorm_gated.py (+81)
File renamed without changes.
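This commit is a submodule-pointer bump: the superproject records a new commit of flash-linear-attention, which is likely why the superproject itself changes only 3 files (+2, -2) while the submodule update spans 27 files. A minimal sketch of how such a bump is typically produced, using throwaway local repositories as stand-ins (all paths, branch states, and messages below are illustrative, not taken from this commit):

```shell
# Sketch: producing a submodule-pointer bump like the commit above.
# Repo names and paths are illustrative stand-ins, not from the commit.
set -eu
tmp=$(mktemp -d)
export GIT_AUTHOR_NAME=demo GIT_AUTHOR_EMAIL=demo@example.com
export GIT_COMMITTER_NAME=demo GIT_COMMITTER_EMAIL=demo@example.com

# Stand-in for the upstream flash-linear-attention repository.
git init -q "$tmp/fla"
git -C "$tmp/fla" commit -q --allow-empty -m "fla: initial state"

# Superproject that embeds it as a submodule.
git init -q "$tmp/app"
git -C "$tmp/app" commit -q --allow-empty -m "init superproject"
git -C "$tmp/app" -c protocol.file.allow=always \
    submodule add -q "$tmp/fla" flash-linear-attention
git -C "$tmp/app" commit -q -m "add flash-linear-attention submodule"

# Upstream advances (the real commit pulled in 27 files of changes).
git -C "$tmp/fla" commit -q --allow-empty -m "fla: new release"

# Bump: move the submodule checkout forward, then commit the new gitlink.
git -C "$tmp/app/flash-linear-attention" -c protocol.file.allow=always \
    fetch -q origin
git -C "$tmp/app/flash-linear-attention" checkout -q FETCH_HEAD
git -C "$tmp/app" add flash-linear-attention
git -C "$tmp/app" commit -q -m "bump flash-linear-attention"

# The superproject commit touches only the recorded submodule commit.
git -C "$tmp/app" show --stat --oneline HEAD
```

The `protocol.file.allow=always` override is needed on Git 2.38+ because file-protocol submodule clones are blocked by default; in a real project the submodule URL would be a remote, making the override unnecessary.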
(Diff table residue: a hunk covering lines 102–108, in which line 105 was replaced by a new line 105; the changed text itself was not captured.)