Actions: HKUSTDial/flash-sparse-attention

Workflow: Auto assign reviewers and assignees

86 workflow runs

Run #39: Fix documentation and references for Flash Sparse Attention (PR #207 opened by LoserCheems, 12s)
Run #37: Refactor attention block smoothing for consistency (PR #205 opened by LoserCheems, 23s)
Run #36: Add selectable masking strategies for attention (PR #204 opened by LoserCheems, 1h 51m 25s)
Run #34: Add block-wise smoothing to attention mask (PR #201 opened by LoserCheems, 11s)
Run #32: Fix attention bias calculation and dbias handling (PR #199 opened by LoserCheems, 17s)
Run #31: Update documentation to use mask utility in examples (PR #198 opened by LoserCheems, 20s)
Run #30: [FEATURE SUPPORT] Centralize dynamic mask creation for FDMA (PR #197 opened by LoserCheems, 19s)
Run #28: Enhance bias gradient accumulation in backward pass (PR #193 opened by LoserCheems, 11s)
Run #22: Add issue/PR templates (PR #186 opened by LoserCheems, 11s)
Run #18: Fix INF issue in bf16 backward pass with safer value clamping (PR #181 opened by Copilot AI, 7s)
Run #16: Refactor attention mask and bias handling for efficiency (PR #177 opened by LoserCheems, 15h 18m 31s)
Run #15: Bump version to 1.1.8 (PR #176 opened by LoserCheems, 51s)
Run #14: Increase GitHub Actions build timeout to 6 hours (PR #175 opened by LoserCheems, 2m 37s)
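
Each run above was triggered by a pull request being opened against HKUSTDial/flash-sparse-attention. For illustration only, below is a minimal sketch of what a workflow named "Auto assign reviewers and assignees" could look like, assuming it is implemented with actions/github-script; the trigger types, permissions, and reviewer handles are assumptions, not the repository's actual configuration.

# Hypothetical sketch; the real workflow file in this repository may differ.
name: Auto assign reviewers and assignees

on:
  pull_request:
    types: [opened, ready_for_review]

permissions:
  issues: write          # needed to add assignees
  pull-requests: write   # needed to request reviewers

jobs:
  auto-assign:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/github-script@v7
        with:
          script: |
            const pr = context.payload.pull_request;
            // Assign the PR author to their own pull request.
            await github.rest.issues.addAssignees({
              owner: context.repo.owner,
              repo: context.repo.repo,
              issue_number: pr.number,
              assignees: [pr.user.login],
            });
            // Request reviews from a hypothetical maintainer list.
            await github.rest.pulls.requestReviewers({
              owner: context.repo.owner,
              repo: context.repo.repo,
              pull_number: pr.number,
              reviewers: ['example-maintainer'],
            });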