
Conversation

jerryzh168 (Contributor)

No description provided.


pytorch-bot bot commented Oct 15, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/ao/3183

Note: Links to docs will display an error until the docs builds have been completed.

❌ 1 New Failure

As of commit acfd12d with merge base ff16308:

NEW FAILURE - one job has failed.

This comment was automatically generated by Dr. CI and updates every 15 minutes.

meta-cla bot added the CLA Signed label on Oct 15, 2025
x = torch.randn(128, 32, device="cuda", dtype=torch.bfloat16)
y_ref = m(x)
y_mx = m_mx(x)
with torch.inference_mode():

is this what makes the test fail, without changes to the product code?
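
For context, a minimal sketch of the test pattern under discussion (the module shapes, the conversion step, and the tolerances are assumptions, not taken from the diff). The question presumably refers to the new torch.inference_mode() wrapping, which is presumably what routes the forward through aten.linear.default on the MX weight subclass, the op the product change below adds a handler for:

import copy
import torch
import torch.nn as nn

# hypothetical reproduction of the test shape, not the actual test code
m = nn.Linear(32, 64, bias=True, device="cuda", dtype=torch.bfloat16)
m_mx = copy.deepcopy(m)
# ... conversion of m_mx to MX weights would happen here (elided)

x = torch.randn(128, 32, device="cuda", dtype=torch.bfloat16)
with torch.inference_mode():
    y_ref = m(x)
    y_mx = m_mx(x)

# MX quantization is lossy, so a loose tolerance (placeholder values)
torch.testing.assert_close(y_ref, y_mx, atol=1e-1, rtol=1e-1)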

return _addmm_mx_dispatch(a, b, func, bias=bias)

@implements([aten.linear.default])
def mx_linear(func, types, args, kwargs):

I think the reshape should live here; the addmm op is supposed to have 2d inputs
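
A rough sketch of that suggestion (not code from this PR; whether reshape is valid on the activation here is an assumption): flatten the leading dims of the activation to 2d inside mx_linear before calling the addmm dispatch, and restore them on the output.

@implements([aten.linear.default])
def mx_linear(func, types, args, kwargs):
    a = args[0]
    b = args[1]
    bias = args[2] if len(args) > 2 else None
    # collapse leading dims so _addmm_mx_dispatch only ever sees 2d inputs
    orig_shape = a.shape
    a_2d = a.reshape(-1, orig_shape[-1])
    out = _addmm_mx_dispatch(a_2d, b.t(), func, bias=bias)
    # restore the original leading dims on the output
    return out.reshape(*orig_shape[:-1], out.shape[-1])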

a = args[0]
b = args[1]
bias = args[2] if len(args) > 2 else None
return _addmm_mx_dispatch(a, b.t(), func, bias=bias)

IMO either don't pass func into addmm_mx_dispatch, or we should rename it to something like addmm_or_linear_mx_dispatch
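
One way to read the first option (a sketch only, assuming _addmm_mx_dispatch uses func just to invoke the matching high-precision op, which is not shown in this hunk): have mx_linear pick the aten op itself rather than forwarding aten.linear.default.

@implements([aten.linear.default])
def mx_linear(func, types, args, kwargs):
    a = args[0]
    b = args[1]
    bias = args[2] if len(args) > 2 else None
    # choose mm/addmm here so the dispatch helper never receives aten.linear.default
    mm_func = aten.addmm.default if bias is not None else aten.mm.default
    return _addmm_mx_dispatch(a, b.t(), mm_func, bias=bias)

The other option, renaming the helper to something like _addmm_or_linear_mx_dispatch, keeps the call sites unchanged and just makes the broader contract explicit.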


Labels

CLA Signed: This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed.

2 participants