Support approximate gelu #11246
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/11246
Note: links to docs will display an error until the docs builds have been completed.
✅ No failures as of commit c9eb4e9 with merge base 1bc36c7.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
This pull request was exported from Phabricator. Differential Revision: D75454999
Summary: GELU accepts an `approximate` argument, which is either `none` (the default) or `tanh`. When the `approximate` kwarg is present, decompose the op. An existing test, test_aten_gelu_out, already verifies that the op is supported.

Reviewed By: zonglinpeng, hsharma35

Differential Revision: D75454999
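For reference, here is a minimal sketch of what the two variants compute, expressed in primitive ops. It is illustrative only, not the actual ExecuTorch decomposition pass; the helper name `gelu_decomposed` is hypothetical.

```python
import math

import torch


def gelu_decomposed(x: torch.Tensor, approximate: str = "none") -> torch.Tensor:
    """Hypothetical reference: GELU expressed via primitive ops."""
    if approximate == "tanh":
        # tanh approximation:
        # 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
        inner = math.sqrt(2.0 / math.pi) * (x + 0.044715 * x.pow(3))
        return 0.5 * x * (1.0 + torch.tanh(inner))
    # exact GELU: x * Phi(x), where Phi is the standard normal CDF
    return x * 0.5 * (1.0 + torch.erf(x / math.sqrt(2.0)))


# Both paths should match torch.nn.functional.gelu for the same `approximate` value:
x = torch.randn(8)
assert torch.allclose(gelu_decomposed(x, "tanh"),
                      torch.nn.functional.gelu(x, approximate="tanh"), atol=1e-6)
assert torch.allclose(gelu_decomposed(x),
                      torch.nn.functional.gelu(x), atol=1e-6)
```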