Conversation

@kirklandsign (Contributor)
This PR was created by the merge bot to help merge the original PR into the main branch.
ghstack PR number: #6392
^ Please use this as the source of truth for the PR details, comments, and reviews
ghstack PR base: https://github.com/pytorch/executorch/tree/gh/SS-JIA/120/base
ghstack PR head: https://github.com/pytorch/executorch/tree/gh/SS-JIA/120/head
Merge bot PR base: https://github.com/pytorch/executorch/tree/main
Merge bot PR head: https://github.com/pytorch/executorch/tree/gh/SS-JIA/120/orig

pytorch-bot bot commented Oct 21, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/6423

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 64a19ab with merge base 0309854:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Oct 21, 2024
Pull Request resolved: #6392

## Context

As title; introduces a custom op to calculate rotary positional embeddings in LLMs. The custom op achieves the same result as the `apply_rotary_emb` Python function. Please see the documentation comments in the shader for more details.
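The actual implementation lives in the shader referenced above. Purely as an illustration of what the `apply_rotary_emb` reference function computes, here is a minimal NumPy sketch of interleaved-pair rotary positional embeddings (the function name, tensor layout, and default `theta` are assumptions for this sketch, not the PR's actual code):

```python
import numpy as np

def apply_rotary_emb_reference(x, pos, theta=10000.0):
    """Illustrative rotary embedding sketch (not the PR's shader).

    x:   (seq_len, n_heads, head_dim) with even head_dim,
         consecutive element pairs treated as (real, imag).
    pos: (seq_len,) integer token positions.
    """
    seq_len, n_heads, head_dim = x.shape
    half = head_dim // 2
    # Per-pair rotation frequency: 1 / theta^(2i / head_dim)
    freqs = 1.0 / (theta ** (np.arange(half) * 2.0 / head_dim))
    angles = pos[:, None] * freqs[None, :]      # (seq_len, half)
    cos = np.cos(angles)[:, None, :]            # broadcast over heads
    sin = np.sin(angles)[:, None, :]
    x_r = x[..., 0::2]                          # even indices: real parts
    x_i = x[..., 1::2]                          # odd indices: imaginary parts
    out = np.empty_like(x)
    # Rotate each (real, imag) pair by its position-dependent angle
    out[..., 0::2] = x_r * cos - x_i * sin
    out[..., 1::2] = x_r * sin + x_i * cos
    return out
```

Because each pair is a pure 2-D rotation, position 0 leaves the input unchanged and every position preserves the per-vector norm, which makes this easy to sanity-check against any other implementation.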
ghstack-source-id: 249175725
@exported-using-ghexport

Differential Revision: [D64697588](https://our.internmc.facebook.com/intern/diff/D64697588/)
@kirklandsign kirklandsign merged commit 10f51b9 into main Oct 21, 2024
6 checks passed
@kirklandsign kirklandsign deleted the gh/SS-JIA/120/orig branch October 21, 2024 23:54
