Fix ReLU fusion when conv/linear has > 1 user #6894
🧪 Artifacts and rendered test results: hud.pytorch.org/pr/pytorch/executorch/6894
✅ No failures as of commit 84cf62e with merge base 5b4d9bb (generated by Dr. CI, updated every 15 minutes).
This pull request was exported from Phabricator. Differential Revision: D65989599
Summary:
X-link: pytorch/pytorch#140846

There was a bug in the quantizer: Conv + ReLU was fused even when the preceding conv has more than one user. Conv and ReLU cannot be fused in that case, because the pre-activation result of the conv must also be used elsewhere.

The XNNPACK delegate naturally handles the unfused case by inserting a clamp node for the ReLU.

Test Plan: CI

Reviewed By: digantdesai

Differential Revision: D65989599
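
For reference, a minimal sketch of the graph pattern this fix targets (hypothetical module and helper names, not the actual test case or quantizer code from this diff): a conv whose output feeds both a ReLU and a second consumer, so the quantizer must not annotate conv + relu as a fused pair.

```python
import torch

class ConvWithExtraUser(torch.nn.Module):
    """The conv output has two users: the ReLU and a residual add.
    Fusing conv + relu here would be incorrect, because the
    pre-activation conv result is still needed by the add."""

    def __init__(self):
        super().__init__()
        self.conv = torch.nn.Conv2d(3, 3, kernel_size=3, padding=1)

    def forward(self, x):
        y = self.conv(x)          # user 1: relu, user 2: add
        return torch.relu(y) + y  # relu cannot be folded into conv


# A quantizer-side guard consistent with this fix (illustrative sketch,
# not the ExecuTorch/XNNPACK quantizer implementation): only treat
# conv + relu as fusable when the conv FX node has exactly one user.
def can_fuse_conv_relu(conv_node: torch.fx.Node) -> bool:
    return len(conv_node.users) == 1
```

With the fix, such a conv is left unfused and the standalone ReLU is lowered by the XNNPACK delegate as a clamp node.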