Releases: lucidrains/native-sparse-attention-pytorch

0.0.77

04 Mar 18:14

fixes for triton pathway

0.0.76

04 Mar 16:55

causal flag for the transformer and setting correct flag for flex attention
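
For context, causal masking in PyTorch's flex attention (torch >= 2.5) is expressed through a block mask. The sketch below shows only that underlying torch API; how this release threads the causal flag through the library's own transformer is not shown here, and the tensor shapes are illustrative.

```python
# Minimal sketch of a causal block mask for PyTorch flex attention
# (torch >= 2.5, CUDA). Illustrates the torch API the flag maps onto;
# the plumbing inside native-sparse-attention-pytorch may differ.
import torch
from torch.nn.attention.flex_attention import flex_attention, create_block_mask

def causal(b, h, q_idx, kv_idx):
    # each query may attend only to itself and earlier positions
    return q_idx >= kv_idx

q, k, v = (torch.randn(1, 8, 1024, 64, device = 'cuda') for _ in range(3))

# the block mask is built once per (Q_LEN, KV_LEN) and reused
block_mask = create_block_mask(causal, B = None, H = None, Q_LEN = 1024, KV_LEN = 1024)
out = flex_attention(q, k, v, block_mask = block_mask)
```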

0.0.75

04 Mar 16:33

some progress towards non-causal variant

0.0.73

03 Mar 16:53

fix

0.0.72

03 Mar 16:39

initial forward needs to return cache with rotated keys

0.0.71

03 Mar 16:31

update NSA inference so rotated queries and keys are cached
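
0.0.71 and 0.0.72 together make the cache hold keys that have already been rotary-embedded, with the initial prompt forward returning that rotated cache, so decode steps never re-rotate history. A generic sketch of the pattern follows; `apply_rotary` and the cache layout here are illustrative stand-ins, not this repo's API.

```python
# Generic sketch: cache keys *after* rotary embedding so each decode
# step only rotates the single new key. Names are illustrative.
import torch

def apply_rotary(x, positions):
    # standard rotary embedding over the last dimension
    half = x.shape[-1] // 2
    freqs = 1. / (10000 ** (torch.arange(half) / half))
    angles = positions[:, None] * freqs[None, :]            # (seq, half)
    cos, sin = angles.cos(), angles.sin()
    x1, x2 = x[..., :half], x[..., half:]
    return torch.cat((x1 * cos - x2 * sin, x1 * sin + x2 * cos), dim = -1)

def prompt_forward(k):
    # initial forward: rotate the whole prompt, return it as the cache
    k_rot = apply_rotary(k, torch.arange(k.shape[-2]))
    return k_rot, k_rot                                     # (keys, cache)

def decode_step(k_new, cache):
    # rotate only the new key at its absolute position, then append
    k_rot = apply_rotary(k_new, torch.tensor([cache.shape[-2]]))
    return k_rot, torch.cat((cache, k_rot), dim = -2)

_, k_cache = prompt_forward(torch.randn(1, 16, 64))
_, k_cache = decode_step(torch.randn(1, 1, 64), k_cache)
assert k_cache.shape == (1, 17, 64)
```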

0.0.70

01 Mar 17:53

fix some padding issues for gating with importance score
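
The usual fix here, sketched generically below, is to mask padded key positions out of the importance scores before any softmax or selection, so padding can never win the gate. Names and shapes are illustrative, not the library's internals.

```python
# Generic sketch: keep padded positions out of importance-score gating.
# Illustrative only; the actual NSA gating is more involved.
import torch

def masked_importance(scores, pad_mask):
    # scores:   (batch, q_len, kv_len)  raw importance scores
    # pad_mask: (batch, kv_len)         True where the key is padding
    scores = scores.masked_fill(pad_mask[:, None, :], torch.finfo(scores.dtype).min)
    return scores.softmax(dim = -1)

scores = torch.randn(2, 4, 6)
pad_mask = torch.tensor([[False] * 6, [False] * 4 + [True] * 2])
probs = masked_importance(scores, pad_mask)
assert probs[1, :, 4:].sum() == 0.     # padded keys get zero weight
```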

0.0.69

01 Mar 17:42

move the gating back onto the selected keys for improved differentiability
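
Multiplying the selected keys by a gate derived from the (differentiable) top-k scores keeps gradients flowing back into the selection. A minimal sketch of the idea, with illustrative shapes:

```python
# Minimal sketch: gate gathered keys with their top-k importance scores
# so the scores receive gradient. Illustrative, not the repo's code.
import torch

scores = torch.randn(2, 8, requires_grad = True)    # importance per key block
k = torch.randn(2, 8, 16)                           # (batch, blocks, dim)

values, indices = scores.topk(4, dim = -1)
selected = k.gather(1, indices[..., None].expand(-1, -1, 16))

# sigmoid of the selected scores acts as a differentiable gate
gated = selected * values.sigmoid()[..., None]

gated.sum().backward()
assert scores.grad is not None                      # gradient reaches the scores
```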

0.0.68

01 Mar 16:07

some intermittent issue with flex attention on sample, just disable a…

0.0.66

01 Mar 15:22

fix intermittent issue with triton nsa dk