
Conversation

kirklandsign (Contributor) commented Oct 16, 2024

This PR was created by the merge bot to help merge the original PR into the main branch.
ghstack PR number: #6265
^ Please use this as the source of truth for the PR number to reference in comments
ghstack PR base: https://github.com/pytorch/executorch/tree/gh/kirklandsign/12/base
ghstack PR head: https://github.com/pytorch/executorch/tree/gh/kirklandsign/12/head
Merge bot PR base: https://github.com/pytorch/executorch/tree/main
Merge bot PR head: https://github.com/pytorch/executorch/tree/gh/kirklandsign/12/orig

Original PR body:

Per the ghstack [tutorial](https://github.com/ezyang/ghstack/blob/master/README.md#structure-of-submitted-pull-requests), a PR exported by ghstack has two diff sets:
  • gh/user/1/base <- gh/user/1/head, which is how the PR itself is created
  • main <- gh/user/1/orig

The purpose of this bot is, when the ghstack PR is merged, to automatically create another PR that performs the merge main <- gh/user/1/orig. We then merge the newly created PR so that main picks up the change.
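
For illustration only, here is a minimal sketch of how a bot could open such a merge PR through the GitHub REST API. This is not the actual propose_ghstack_orig_pr.py; the endpoint is the standard "create a pull request" call, but the branch names, title, and token handling shown here are assumptions.

```python
# Hypothetical sketch: open a PR that merges a gh/<user>/<N>/orig branch into main.
# Not the real propose_ghstack_orig_pr.py; values below are examples only.
import os

import requests


def propose_orig_pr(repo: str, orig_branch: str, title: str) -> str:
    token = os.environ["GITHUB_TOKEN"]  # e.g. a token provided by the pytorch bot
    resp = requests.post(
        f"https://api.github.com/repos/{repo}/pulls",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
        },
        json={
            "title": title,
            "head": orig_branch,  # e.g. "gh/kirklandsign/12/orig"
            "base": "main",       # the merge target
            "body": "Automatically proposed to land the ghstack /orig branch into main.",
        },
    )
    resp.raise_for_status()
    return resp.json()["html_url"]
```

For example, calling propose_orig_pr("pytorch/executorch", "gh/kirklandsign/12/orig", "Script to ghstack land") would return the URL of the newly opened PR.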

Missing piece: a token from the pytorch bot to create the PR.

If this goes well, we can either git-merge the commit from gh/user/1/orig into main directly, without going through the new PR, or auto-approve and merge the PR.
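
As a rough sketch of the first option, assuming the /orig branch always merges cleanly (see the note on conflicts below), the direct merge could look like the following; the branch name is illustrative and this is not something the bot currently does.

```python
# Hypothetical sketch: merge an /orig branch into main directly, without a new PR.
# The branch name is an example; this flow is an assumption, not the bot's behavior.
import subprocess


def merge_orig_into_main(orig_branch: str = "gh/kirklandsign/12/orig") -> None:
    for cmd in (
        ["git", "fetch", "origin", orig_branch, "main"],
        ["git", "checkout", "main"],
        ["git", "merge", "--no-edit", f"origin/{orig_branch}"],
        ["git", "push", "origin", "main"],
    ):
        subprocess.run(cmd, check=True)
```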

Test: you can test locally with `export GITHUB_TOKEN=ghpxyz; python .github/scripts/propose_ghstack_orig_pr.py --pr 6265 --repo pytorch/executorch`.
Note that between /orig merges, there is never a merge conflict.

The ghstack tool guarantees the invariant that gh/user/1/orig contains exactly the same change as gh/user/1/base <- gh/user/1/head.
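
For illustration, a rough local sanity check of that invariant could compare the two patches. This assumes each /orig tip is a single commit on top of its parent and that the two diffs are textually identical, which may not hold exactly when the underlying bases differ; the branch names are examples.

```python
# Hypothetical sanity check of the ghstack invariant; branch names are examples only.
import subprocess


def patch(rev_range: str) -> str:
    """Return the unified diff for a revision range such as 'A..B'."""
    return subprocess.run(
        ["git", "diff", rev_range],
        check=True, capture_output=True, text=True,
    ).stdout


base_to_head = patch("gh/kirklandsign/12/base..gh/kirklandsign/12/head")
orig_change = patch("gh/kirklandsign/12/orig~1..gh/kirklandsign/12/orig")
print("patches match:", base_to_head == orig_change)
```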

Stack from ghstack (oldest at bottom):

kirklandsign force-pushed the gh/kirklandsign/13/orig branch from c511b27 to b32d749 on October 16, 2024 04:24
kirklandsign force-pushed the gh/kirklandsign/12/orig branch from 3319497 to 5d7fb0b on October 16, 2024 04:24

pytorch-bot (bot) commented Oct 16, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/6271

Note: Links to docs will display an error until the docs builds have been completed.

❌ 1 New Failure

As of commit 647b1f2 with merge base 35aeaca:

NEW FAILURE - The following job has failed:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

facebook-github-bot added the CLA Signed label (authors need to sign the CLA before a PR can be reviewed) on Oct 16, 2024
ghstack-source-id: ee1a65b
Pull Request resolved: #6265
ghstack-source-id: ab10bf9
Pull Request resolved: #6266
kirklandsign force-pushed the gh/kirklandsign/12/orig branch from 5d7fb0b to e7f3b17 on October 16, 2024 04:24
kirklandsign force-pushed the gh/kirklandsign/13/orig branch from b32d749 to 647b1f2 on October 16, 2024 04:24
kirklandsign force-pushed the gh/kirklandsign/12/orig branch 2 times, most recently from 86ab0db to 6053a96 on October 16, 2024 05:27
kirklandsign changed the title from "edit readme.md" to "Script to ghstack land" on Oct 16, 2024
kirklandsign changed the base branch from gh/kirklandsign/12/orig to main on October 16, 2024 05:55
kirklandsign deleted the gh/kirklandsign/13/orig branch on October 17, 2024 23:56
