Re-enable optimized gelu test in CMake #8597
Conversation
Stack from ghstack (oldest at bottom):

I missed this line disabling the test. (Splitting out the re-enable of log_softmax because I think that one needs fixes.)

Differential Revision: [D69929122](https://our.internmc.facebook.com/intern/diff/D69929122/)
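For context, here is a minimal sketch of the kind of CMake change being described. The variable and file names below are hypothetical, not the actual executorch build files: the point is simply that the gelu test source is restored to the optimized-kernels test list, while the log_softmax test stays split out pending fixes.

```cmake
# Hypothetical sketch only: variable and file names are illustrative,
# not the actual contents of the executorch CMake files.
set(_optimized_op_test_sources
  op_add_test.cpp
  # Restored here: dropping this entry had silently disabled the
  # optimized gelu test.
  op_gelu_test.cpp
  # Re-enable split out separately because it still needs fixes (#8595).
  # op_log_softmax_test.cpp
)
add_executable(optimized_kernels_test ${_optimized_op_test_sources})
```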
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/8597
Note: Links to docs will display an error until the docs builds have been completed.
✅ No Failures as of commit 5168a78 with merge base 745be4e.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
This pull request was exported from Phabricator. Differential Revision: D69929122
Merged commit message (this PR landed together with the split-out log_softmax fix):

* Fix log_softmax along non-contiguous dim

  Pull Request resolved: #8595

  #8382 certainly didn't fix this problem (and added it on x86), but I don't think it was correct on ARM prior to that either. Added a regression test.

  ghstack-source-id: 268149462
  @exported-using-ghexport
  Differential Revision: [D69928884](https://our.internmc.facebook.com/intern/diff/D69928884/)

* Re-enable optimized gelu test in CMake

  Pull Request resolved: #8597

  I missed this line disabling the test. (Splitting out the re-enable of log_softmax because I think that one needs fixes.)

  ghstack-source-id: 268149463
  @exported-using-ghexport
  Differential Revision: [D69929122](https://our.internmc.facebook.com/intern/diff/D69929122/)

Co-authored-by: Scott Wolchok <[email protected]>