Replies: 1 comment
-
https://einops.rocks/pytorch-examples.html, under "Transformer's attention needs more attention", looks like it shows the way, but I still don't get it.
-
Thanks to @C43H66N12O12S2 (#2967), xformers is working on my PC, but on a Colab T4 it stops working after 75 tokens, and I don't know why.
My question is: how do I convert this einops rearrange() call to reshape() and permute()?
q, k, v = map(lambda t: rearrange(t, 'b c h w -> b (h w) c'), (q, k, v))
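For what it's worth, here is a minimal sketch of an equivalent in plain PyTorch, assuming q, k, and v are 4-D tensors laid out as (b, c, h, w), which is what the pattern string 'b c h w -> b (h w) c' implies: merge the two spatial axes into one, then move the channel axis last. The shapes below are placeholders, not values from the original code.

import torch
from einops import rearrange

# Stand-in tensor; the (b, c, h, w) layout is assumed from the pattern string.
t = torch.randn(2, 64, 8, 8)
b, c, h, w = t.shape

# 'b c h w -> b (h w) c': first merge h and w into one axis, then swap it with c.
out = t.reshape(b, c, h * w).permute(0, 2, 1)

# Equivalent alternative: move channels last first, then flatten the spatial axes.
# out = t.permute(0, 2, 3, 1).reshape(b, h * w, c)

# Sanity check against einops (passes: both flatten h, w in the same row-major order).
assert torch.equal(out, rearrange(t, 'b c h w -> b (h w) c'))

# Drop-in replacement for the original line, shape-agnostic in h and w:
q = k = v = t  # placeholders for the real q, k, v
q, k, v = map(lambda x: x.reshape(x.shape[0], x.shape[1], -1).permute(0, 2, 1), (q, k, v))

One caveat: after a permute the tensor is usually non-contiguous, so if you flatten after permuting, use .reshape() (which copies when needed) rather than .view().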