Conversation

Contributor

@mcr229 mcr229 commented Nov 15, 2024

Summary:
Fixes a bug in the quantizer where Conv + ReLU was fused even when the preceding Conv has more than one user. Conv and ReLU cannot be fused in this case because the result of the Conv must also be used elsewhere.

XNNPACK Delegate naturally handles this by inserting a clamp node for ReLU.
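The fusion constraint described above can be sketched as a user-count check on the Conv node. This is a hypothetical illustration, not the actual ExecuTorch quantizer code: the `Node` class and `can_fuse_conv_relu` helper below are simplified stand-ins for a torch.fx graph node and the quantizer's pattern-annotation check.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """Minimal stand-in for a torch.fx graph node (hypothetical)."""
    name: str
    target: str
    users: list = field(default_factory=list)

def can_fuse_conv_relu(conv: Node, relu: Node) -> bool:
    # Fusing Conv + ReLU is only valid when the ReLU is the sole
    # consumer of the Conv output; otherwise the pre-activation
    # Conv result must remain available to the other users.
    return len(conv.users) == 1 and conv.users[0] is relu

# Conv feeding only ReLU: fusable.
conv = Node("conv1", "aten.conv2d")
relu = Node("relu1", "aten.relu")
conv.users = [relu]
print(can_fuse_conv_relu(conv, relu))  # True

# Conv with a second consumer (e.g. a residual add): must not fuse.
add = Node("add1", "aten.add")
conv.users = [relu, add]
print(can_fuse_conv_relu(conv, relu))  # False
```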

Reviewed By: digantdesai

Differential Revision: D65989599


pytorch-bot bot commented Nov 15, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/6894

Note: Links to docs will display an error until the docs builds have been completed.

❗ 2 Active SEVs

There are 2 currently active SEVs that may affect this PR.

✅ No Failures

As of commit 84cf62e with merge base 5b4d9bb:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Nov 15, 2024
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D65989599

mcr229 added a commit to mcr229/executorch that referenced this pull request Nov 15, 2024
Summary:
X-link: pytorch/pytorch#140846


Fixes a bug in the quantizer where Conv + ReLU was fused even when the preceding Conv has more than one user. Conv and ReLU cannot be fused in this case because the result of the Conv must also be used elsewhere.

XNNPACK Delegate naturally handles this by inserting a clamp node for ReLU.
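The delegate-side behavior mentioned above rests on the identity that a standalone ReLU is equivalent to a clamp with a lower bound of 0 (and an unbounded upper limit), which is why the unfused ReLU can be lowered as a clamp node. A minimal sketch of that equivalence, using hypothetical helper names rather than XNNPACK's actual operator API:

```python
def relu(x: float) -> float:
    # ReLU: pass positive values through, zero out negatives.
    return max(x, 0.0)

def clamp(x: float, lo: float, hi: float) -> float:
    # Clamp x into the closed interval [lo, hi].
    return min(max(x, lo), hi)

# A clamp with lo=0 and an unbounded upper limit behaves exactly
# like ReLU for every input.
for x in (-2.0, -0.5, 0.0, 3.5):
    assert relu(x) == clamp(x, 0.0, float("inf"))
print("relu == clamp(0, inf)")  # prints "relu == clamp(0, inf)"
```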

Reviewed By: digantdesai

Differential Revision: D65989599
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D65989599

mcr229 added a commit to mcr229/executorch that referenced this pull request Nov 18, 2024
Summary:
X-link: pytorch/pytorch#140846


Fixes a bug in the quantizer where Conv + ReLU was fused even when the preceding Conv has more than one user. Conv and ReLU cannot be fused in this case because the result of the Conv must also be used elsewhere.

XNNPACK Delegate naturally handles this by inserting a clamp node for ReLU.

Reviewed By: digantdesai

Differential Revision: D65989599
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D65989599

1 similar comment

mcr229 added a commit to mcr229/pytorch that referenced this pull request Nov 18, 2024
…ytorch#140846)

Summary:

X-link: pytorch/executorch#6894

Fixes a bug in the quantizer where Conv + ReLU was fused even when the preceding Conv has more than one user. Conv and ReLU cannot be fused in this case because the result of the Conv must also be used elsewhere.

XNNPACK Delegate naturally handles this by inserting a clamp node for ReLU.

Test Plan: CI

Reviewed By: digantdesai

Differential Revision: D65989599
@facebook-github-bot facebook-github-bot merged commit 04f6fcd into pytorch:main Nov 19, 2024
38 of 41 checks passed
@mcr229 mcr229 deleted the export-D65989599 branch July 25, 2025 22:43

Labels

CLA Signed, fb-exported


3 participants