Pull requests: fla-org/flash-linear-attention
[CP] fuse fwd/bwd kernels and fix IMA in long context
#733 opened Jan 30, 2026 by zhiyuan1i (an illustrative indexing sketch follows this list)
[Fix][GDN] convert b_q dtype in bwd_dhu kernel when USE_G is enabled
Label: help wanted (extra attention is needed)
#715 opened Jan 18, 2026 by slowlyC (a dtype-cast sketch follows this list)
[Deltaformer] kernel improvement; if-else optimization; change w to fp32; add 1e-9 to avoid NaN
#603 opened Sep 30, 2025 by foreverpiano (an epsilon/fp32 sketch follows this list)
[WIP] Support chunk-parallel via combined chunk and fused_chunk
#505 opened Jul 2, 2025 by yzhangcs (a chunkwise-attention sketch follows this list)
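
PR #733 mentions fixing an illegal memory access (IMA) at long context. One common way such IMAs arise in long-context Triton kernels is 32-bit overflow of flattened offsets once batch * T * D exceeds 2^31; whether that is the specific bug #733 addresses is an assumption. The sketch below (hypothetical kernel and names, not the PR's diff) shows the generic guard: promote the base offset to int64 and mask the tail block.

```python
import torch
import triton
import triton.language as tl


@triton.jit
def copy_kernel(x, y, T, D: tl.constexpr, BT: tl.constexpr):
    # one program per (batch, time-block); names are illustrative only
    i_b, i_t = tl.program_id(0), tl.program_id(1)
    # promote to int64 before flattening, otherwise i_b * T * D can wrap
    # around int32 for long contexts and trigger an illegal memory access
    base = (i_b.to(tl.int64) * T + i_t * BT) * D
    offs = base + tl.arange(0, BT)[:, None] * D + tl.arange(0, D)[None, :]
    # mask the tail block so the last rows never touch memory past the end
    mask = (i_t * BT + tl.arange(0, BT) < T)[:, None]
    b_x = tl.load(x + offs, mask=mask, other=0.0)
    tl.store(y + offs, b_x, mask=mask)


if __name__ == "__main__" and torch.cuda.is_available():
    B, T, D, BT = 2, 4096, 64, 128
    x = torch.randn(B, T, D, device="cuda", dtype=torch.bfloat16)
    y = torch.empty_like(x)
    copy_kernel[(B, triton.cdiv(T, BT))](x, y, T, D=D, BT=BT)
    assert torch.equal(x, y)
```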
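PR #715's title describes converting the dtype of `b_q` inside the `bwd_dhu` kernel when `USE_G` is enabled. The actual kernel is not reproduced here; the snippet below is a minimal, hypothetical Triton kernel (`scale_with_gate_kernel` is an invented name) showing the general pattern: when a constexpr gate flag is on, cast the half-precision block to fp32 before combining it with the fp32 gate.

```python
import torch
import triton
import triton.language as tl


@triton.jit
def scale_with_gate_kernel(q, g, o, T, BT: tl.constexpr, USE_G: tl.constexpr):
    i_t = tl.program_id(0)
    o_t = i_t * BT + tl.arange(0, BT)
    mask = o_t < T
    # b_q is loaded in its storage dtype (e.g. bf16/fp16)
    b_q = tl.load(q + o_t, mask=mask, other=0.0)
    if USE_G:
        b_g = tl.load(g + o_t, mask=mask, other=0.0)
        # cast to fp32 before mixing with the fp32 gate, mirroring the
        # kind of dtype conversion the PR title describes
        b_q = b_q.to(tl.float32) * tl.exp(b_g)
    tl.store(o + o_t, b_q.to(o.dtype.element_ty), mask=mask)


if __name__ == "__main__" and torch.cuda.is_available():
    T, BT = 1024, 128
    q = torch.randn(T, device="cuda", dtype=torch.bfloat16)
    g = torch.randn(T, device="cuda", dtype=torch.float32)
    o = torch.empty_like(q)
    scale_with_gate_kernel[(triton.cdiv(T, BT),)](q, g, o, T, BT=BT, USE_G=True)
```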
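PR #603's title names two common numerical fixes: computing `w` in fp32 and adding 1e-9 to avoid NaN. The pure-PyTorch sketch below illustrates the generic technique (upcast before a reduction, epsilon in the denominator); `normalize_fp32` is an invented name, not taken from the Deltaformer kernel.

```python
import torch


def normalize_fp32(w: torch.Tensor, eps: float = 1e-9) -> torch.Tensor:
    # upcast before the reduction so bf16/fp16 rounding does not distort the norm
    w32 = w.float()
    norm = w32.norm(dim=-1, keepdim=True)        # can be exactly zero
    # the small epsilon keeps 0 / 0 from producing NaN
    return (w32 / (norm + eps)).to(w.dtype)


if __name__ == "__main__":
    w = torch.zeros(2, 8, dtype=torch.bfloat16)  # worst case: all-zero rows
    print(normalize_fp32(w).isnan().any())       # tensor(False)
```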
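PR #505 refers to the chunk-parallel formulation behind the library's `chunk` and `fused_chunk` modes. As a reference for what "chunkwise" means, here is a minimal pure-PyTorch sketch of unnormalized, ungated causal linear attention computed chunk by chunk: a masked intra-chunk matmul plus a small recurrent state that carries the contribution of earlier chunks. It is illustrative only, not the PR's implementation or the fla API.

```python
import torch


def chunk_linear_attn(q, k, v, chunk_size=64):
    """Chunkwise (unnormalized) causal linear attention; illustrative only."""
    B, H, T, D = q.shape
    assert T % chunk_size == 0, "pad T to a multiple of chunk_size first"
    q = q.view(B, H, T // chunk_size, chunk_size, D)
    k = k.view(B, H, T // chunk_size, chunk_size, D)
    v = v.view(B, H, T // chunk_size, chunk_size, -1)
    causal = torch.tril(torch.ones(chunk_size, chunk_size, dtype=torch.bool, device=q.device))
    S = q.new_zeros(B, H, D, v.shape[-1])            # state carried across chunks
    o = torch.empty_like(v)
    for i in range(T // chunk_size):
        q_i, k_i, v_i = q[:, :, i], k[:, :, i], v[:, :, i]
        intra = (q_i @ k_i.transpose(-1, -2)).masked_fill(~causal, 0) @ v_i
        o[:, :, i] = intra + q_i @ S                 # intra-chunk + history
        S = S + k_i.transpose(-1, -2) @ v_i          # fold this chunk into the state
    return o.reshape(B, H, T, -1)


if __name__ == "__main__":
    q, k, v = (torch.randn(1, 2, 256, 32) for _ in range(3))
    print(chunk_linear_attn(q, k, v).shape)          # torch.Size([1, 2, 256, 32])
```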