Conversation

@eclipse1605 (Contributor)

codecov bot commented Jan 22, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 90.89%. Comparing base (cadb97a) to head (49f0145).
⚠️ Report is 39 commits behind head on main.

Additional details and impacted files


@@            Coverage Diff             @@
##             main    #8067      +/-   ##
==========================================
+ Coverage   90.22%   90.89%   +0.66%     
==========================================
  Files         116      123       +7     
  Lines       18972    19489     +517     
==========================================
+ Hits        17117    17714     +597     
+ Misses       1855     1775      -80     
Files with missing lines      Coverage Δ
pymc/logprob/__init__.py      100.00% <100.00%> (ø)
pymc/logprob/arithmetic.py    100.00% <100.00%> (ø)

... and 39 files with indirect coverage changes


ricardoV94 (Member) left a comment

Looks good, left some minor comments

Comment on lines 137 to 138

    if not filter_measurable_variables(node.inputs):
        return None
ricardoV94 (Member):

Not needed, NormalRV is always measurable

Suggested change

    - if not filter_measurable_variables(node.inputs):
    -     return None



    @node_rewriter([Sum])
    def find_measurable_sum(fgraph: FunctionGraph, node: Apply) -> list[TensorVariable] | None:
ricardoV94 (Member):

Shouldn't be in this file. Perhaps arithmetic.py?

eclipse1605 (Contributor, author):

True, but won't tensor.py be better for this?

ricardoV94 (Member):

That's more meant for shape / concatenation operations, not mathematical in nature

eclipse1605 (Contributor, author):

makes sense
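
For readers following along, here is a minimal sketch of what the new rewrite enables, assuming it ends up registered in pymc/logprob/arithmetic.py as discussed in this thread (the parameter values are illustrative):

    import pymc as pm
    import pytensor.tensor as pt

    # A vector of independent Normals and their sum
    x = pm.Normal.dist(mu=[1.0, 2.0, 3.0], sigma=2.0)
    s = pt.sum(x)

    # With find_measurable_sum registered, the sum itself is measurable:
    # a sum of independent Normals is Normal(sum(mu), sqrt(sum(sigma**2)))
    pm.logp(s, 0.0).eval()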

Comment on lines 142 to 143

    if getattr(latent_op, "ndim_supp", None) != 0:
        return None
ricardoV94 (Member):

Not needed, always the case for NormalRV

Suggested change

    - if getattr(latent_op, "ndim_supp", None) != 0:
    -     return None

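For context, NormalRV's support dimensionality is always zero (it is a univariate distribution), which is why the guard above is redundant; a quick check:

    import pymc as pm

    x = pm.Normal.dist()
    assert x.owner.op.ndim_supp == 0  # NormalRV has scalar (univariate) support
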
            return None
        if getattr(latent_op, "ndim_supp", None) != 0:
            return None
        base_var = cast(TensorVariable, base_var)
ricardoV94 (Member):

don't add type casts

eclipse1605 (Contributor, author):

this was also failing mypy

Comment on lines 149 to 150

    if axis != tuple(range(base_var.ndim)):
        return None
ricardoV94 (Member):

we could support this, just means we do the mean/std aggregations along the summed axis
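
As a sanity check of the identity behind that axis-wise aggregation, a quick NumPy simulation (array values are illustrative, not from the PR):

    import numpy as np

    rng = np.random.default_rng(0)
    mu = np.array([[0.0, 1.0], [2.0, 3.0]])
    sigma = np.array([[1.0, 2.0], [3.0, 4.0]])

    # Summing independent Normals along axis 0 gives
    # Normal(mu.sum(axis=0), sqrt((sigma**2).sum(axis=0)))
    draws = rng.normal(mu, sigma, size=(100_000, 2, 2)).sum(axis=1)
    print(draws.mean(axis=0))  # ~ mu.sum(axis=0) = [2., 4.]
    print(draws.std(axis=0))   # ~ sqrt((sigma**2).sum(axis=0)) ~ [3.16, 4.47]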

Comment on lines 166 to 167

    mu_sum = pt.sum(mu_b)
    sigma_sum = pt.sqrt(pt.sum(pt.square(sigma_b)))
ricardoV94 (Member):

This will handle arbitrary axes?

Suggested change

    - mu_sum = pt.sum(mu_b)
    - sigma_sum = pt.sqrt(pt.sum(pt.square(sigma_b)))
    + mu_sum = pt.sum(mu_b, axis=axis)
    + sigma_sum = pt.sqrt(pt.sum(pt.square(sigma_b), axis=axis))

    sigma_sum = pt.sqrt(pt.sum(pt.square(sigma_b)))

    # Create a scalar NormalRV for the sum
    rng = base_var.owner.inputs[0]
ricardoV94 (Member):

if you are using all args anyway, just unpack them once:

    rng, size, mu, sigma = base_var.owner.inputs

@eclipse1605 (Contributor, author)

@ricardoV94 does this seem good?

@ricardoV94 (Member)

@eclipse1605 left some more comments. Also for future PRs, feel free to close addressed comments

@eclipse1605 (Contributor, author)

> @eclipse1605 left some more comments. Also for future PRs, feel free to close addressed comments

sure, you mean "resolve conversation" right?

@ricardoV94 (Member)

> > @eclipse1605 left some more comments. Also for future PRs, feel free to close addressed comments
>
> sure, you mean "resolve conversation" right?

Yup

ricardoV94 merged commit c868a84 into pymc-devs:main on Jan 26, 2026
42 checks passed
ricardoV94 (Member) commented Jan 26, 2026

We should follow up with pt.add(*Normals), and pt.sub(Normal, Normal). Different syntax, same math
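
The identities behind that follow-up, sketched for independent Normals (not implemented in this PR; a sketch of the math, with pm assumed to be pymc):

    import pymc as pm

    x = pm.Normal.dist(mu=1.0, sigma=2.0)
    y = pm.Normal.dist(mu=3.0, sigma=4.0)

    # x + y ~ Normal(1 + 3, sqrt(2**2 + 4**2))
    # x - y ~ Normal(1 - 3, sqrt(2**2 + 4**2))
    # so pm.logp(x + y, value) could be rewritten just like pt.sum above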



Successfully merging this pull request may close these issues:

ENH: pymc.math.sum could not be observed
