
Conversation

@MPKonst commented May 15, 2024

This PR adds a new notebook with a guided puzzle to implement the forward pass of FlashAttention2.
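For context, the core of the puzzle is the online-softmax recurrence from the FlashAttention2 forward pass. Below is a minimal NumPy sketch of that recurrence, purely for reference; the function name, shapes, and block size are illustrative and not taken from the notebook:

```python
import numpy as np

def flash_attention2_forward(Q, K, V, block_size=2):
    """Tiled attention forward pass using the FlashAttention2
    online-softmax recurrence. Q, K, V have shape (seq_len, head_dim)."""
    seq_len, head_dim = Q.shape
    scale = 1.0 / np.sqrt(head_dim)
    O = np.zeros((seq_len, head_dim))
    for i in range(0, seq_len, block_size):          # loop over query tiles
        q = Q[i:i + block_size]
        m = np.full(q.shape[0], -np.inf)             # running row-wise max
        l = np.zeros(q.shape[0])                     # running softmax denominator
        acc = np.zeros((q.shape[0], head_dim))       # unnormalised output accumulator
        for j in range(0, seq_len, block_size):      # loop over key/value tiles
            s = (q @ K[j:j + block_size].T) * scale  # tile of attention scores
            m_new = np.maximum(m, s.max(axis=1))
            alpha = np.exp(m - m_new)                # rescales the previous state
            p = np.exp(s - m_new[:, None])           # unnormalised probabilities
            l = alpha * l + p.sum(axis=1)
            acc = alpha[:, None] * acc + p @ V[j:j + block_size]
            m = m_new
        O[i:i + block_size] = acc / l[:, None]       # normalise once per query tile
    return O
```

A quick sanity check is that the output matches naive attention, i.e. `softmax(Q @ K.T / np.sqrt(head_dim)) @ V`, up to floating-point error.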

It also makes some modifications to the visualiser to allow it to render the solution (an example solution for testing can be found here). The main modifications needed were:

  • adding support for more binary operations on Scalar and ScalarHistory (see the sketch after this list)
  • monkeypatching math.exp (also covered in the sketch)
  • allowing for more colours, so examples with a higher TPB (threads per block) can be plotted.
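A rough sketch of what the first two changes might look like; the class layout and names here are my own illustration, not the actual lib.py code:

```python
import math

class Scalar:
    """Illustrative stand-in for the visualiser's traced scalar type;
    the real Scalar/ScalarHistory in lib.py may differ in detail."""

    def __init__(self, value, history=()):
        self.value = value
        self.history = tuple(history)  # records of the reads that produced this value

    def _binop(self, other, fn):
        # Support both Scalar <op> Scalar and Scalar <op> plain number,
        # merging histories so the visualiser can still draw the data flow.
        if isinstance(other, Scalar):
            return Scalar(fn(self.value, other.value), self.history + other.history)
        return Scalar(fn(self.value, other), self.history)

    # Extra binary operations a FlashAttention kernel needs
    # (softmax uses subtraction and division on top of + and *).
    def __add__(self, other): return self._binop(other, lambda a, b: a + b)
    def __sub__(self, other): return self._binop(other, lambda a, b: a - b)
    def __mul__(self, other): return self._binop(other, lambda a, b: a * b)
    def __truediv__(self, other): return self._binop(other, lambda a, b: a / b)

_real_exp = math.exp

def _traced_exp(x):
    # Monkeypatched math.exp: unwrap a traced Scalar, exponentiate the
    # underlying float, and keep the history attached to the result.
    if isinstance(x, Scalar):
        return Scalar(_real_exp(x.value), x.history)
    return _real_exp(x)

math.exp = _traced_exp  # kernels can now call math.exp on traced values
```

With a patch along these lines, puzzle solutions can call math.exp directly on traced values instead of going through a separate helper exposed by the library.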

The other changes to lib.py are mostly cosmetic.

"outputs": [],
"source": [
"%pip install -qqq git+https://github.com/chalk-diagrams/planar git+https://github.com/danoneata/chalk@srush-patch-1\n",
"!wget -q https://github.com/srush/GPU-Puzzles/raw/main/robot.png https://github.com/MPKonst/GPU-Puzzles/raw/flash_attn_puzzle/flash_attn_forward_algo.png https://github.com/MPKonst/GPU-Puzzles/raw/flash_attn_puzzle/lib.py"
@MPKonst (Author):

Reminder to self: change URLs before merge.

"metadata": {},
"source": [
"\n",
"[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/MPKonst/GPU-Puzzles/blob/flash_attn_puzzle/Flash_attention_puzzle.ipynb)"
@MPKonst (Author):

Reminder to self: change URL before merge.

