Releases: dptech-corp/flash-attention
odd-len-support
Add odd length support (#5)
* Add odd length support
* Add masking in attn_mask & attn_bias
* Remove unused files
* Move the if check to the outer loop
* Remove leftover comments

Co-authored-by: xhj <jixh@dp.tech>
Attention mask & Attention bias support
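To illustrate what this release adds semantically, here is a minimal NumPy sketch of scaled dot-product attention with an additive bias (`attn_bias`) and a boolean mask (`attn_mask`). This is a reference illustration of the math only, not this fork's CUDA kernel or its actual API; the function and parameter names are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v, attn_mask=None, attn_bias=None):
    """Reference attention (names hypothetical, not this fork's API).

    q: (seq_q, d), k/v: (seq_k, d)
    attn_mask: (seq_q, seq_k) bool, True = position may be attended to
    attn_bias: (seq_q, seq_k) float, added to the scores before softmax
    """
    scores = q @ k.T / np.sqrt(q.shape[-1])
    if attn_bias is not None:
        scores = scores + attn_bias            # additive bias
    if attn_mask is not None:
        scores = np.where(attn_mask, scores, -np.inf)  # masked positions get zero weight
    return softmax(scores) @ v

rng = np.random.default_rng(0)
q = rng.standard_normal((4, 8))
k = rng.standard_normal((5, 8))
v = rng.standard_normal((5, 8))
mask = np.ones((4, 5), dtype=bool)
mask[:, -1] = False                            # forbid attending to the last key
out = attention(q, k, v, attn_mask=mask, attn_bias=np.zeros((4, 5)))
```

The output keeps the query shape `(4, 8)`; masked-out keys receive exactly zero attention weight after the softmax.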
refs/heads/workflow
add .