Flash and Linear Attention mechanisms added to the TabTransformer


@jrzaurin released this 06 Aug 10:50
2ef478c
  1. Added Flash Attention
  2. Added Linear Attention (see the usage sketch after this list)
  3. Revisited and polished the docs
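
For reference, a minimal sketch of how the new attention variants might be enabled on the `TabTransformer` component. This is an illustration, not code from the release: the flag names `use_flash_attention` / `use_linear_attention`, the toy column setup, and the random input tensor are assumptions; check the (freshly polished) docs for the exact signatures.

```python
import torch
from pytorch_widedeep.models import TabTransformer

# Toy tabular setup: 3 categorical columns, 5 categories each (made up for illustration)
colnames = ["a", "b", "c"]
cat_embed_input = [(col, 5) for col in colnames]   # (column name, number of categories)
column_idx = {col: i for i, col in enumerate(colnames)}
X_tab = torch.randint(1, 5, (32, 3))               # toy batch: 32 rows, 3 categorical cols

# TabTransformer with Flash Attention in its transformer blocks
flash_model = TabTransformer(
    column_idx=column_idx,
    cat_embed_input=cat_embed_input,
    use_flash_attention=True,    # assumed flag name
)

# ...or with Linear Attention instead
linear_model = TabTransformer(
    column_idx=column_idx,
    cat_embed_input=cat_embed_input,
    use_linear_attention=True,   # assumed flag name
)

out = flash_model(X_tab)
print(out.shape)
```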