
Suppress 1e-5 discrepancy warnings for detection probability#286

Merged
chiarabellenghi merged 4 commits into master from fix/274-detprob-norm on Apr 1, 2026

Conversation

@chiarabellenghi (Member)

Fixes #274

Copilot AI review requested due to automatic review settings March 31, 2026 15:01
@Copilot (Contributor) left a comment


Pull request overview

This PR addresses issue #274 by adjusting how detection probabilities derived from the public-data effective area are normalized, aiming to avoid small numerical deviations from unity that were triggering warnings in the signal energy PDF construction.

Changes:

  • Update PDAeff.get_detection_prob_for_decnu to renormalize the per-bin integrals so that the detection probabilities sum to 1 by construction.
  • Remove the warning in PDSignalEnergyPDFSet initialization that logged when detection probabilities didn’t sum exactly to unity.

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 3 comments.

| File | Description |
| --- | --- |
| skyllh/analyses/i3/publicdata_ps/signalpdf.py | Removes the warning about det_prob not summing to 1 during signal energy PDF setup. |
| skyllh/analyses/i3/publicdata_ps/aeff.py | Changes the normalization approach for detection probabilities to enforce unity by construction. |


Comment on lines 286 to 287
self._logger.debug(f'det_prob = {det_prob}, sum = {np.sum(det_prob)}')


Copilot AI Mar 31, 2026


Now that the “sum not unity” warning is removed, it would be good to explicitly guard against degenerate det_prob results (e.g., all zeros / non-finite). Downstream true_e_prob = p / np.sum(p) will produce NaNs if np.sum(p) == 0, so consider raising a clear error (or at least logging) when np.sum(det_prob) <= 0 or ~np.isfinite(det_prob).all() after computing det_prob.

Suggested change

```diff
-self._logger.debug(f'det_prob = {det_prob}, sum = {np.sum(det_prob)}')
+# Guard against degenerate or non-finite detection probabilities.
+if not np.isfinite(det_prob).all():
+    msg = (
+        f'Non-finite detection probabilities encountered for src_dec={src_dec}. '
+        f'det_prob={det_prob}'
+    )
+    self._logger.error(msg)
+    raise ValueError(msg)
+total_det_prob = np.sum(det_prob)
+if total_det_prob <= 0:
+    msg = (
+        f'Non-positive total detection probability (sum={total_det_prob}) '
+        f'for src_dec={src_dec}. det_prob={det_prob}'
+    )
+    self._logger.error(msg)
+    raise ValueError(msg)
+self._logger.debug(f'det_prob = {det_prob}, sum = {total_det_prob}')
```
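The failure mode Copilot describes is easy to reproduce in isolation. The sketch below uses a made-up, degenerate det_prob array (not real effective-area output) to show how a zero-sum det_prob turns the downstream true_e_prob = p / np.sum(p) into NaNs:

```python
import numpy as np

# Hypothetical degenerate detection probabilities (all zero), e.g. for a
# declination outside the detector's sensitive range.
det_prob = np.zeros(4)

# The downstream normalization step true_e_prob = p / np.sum(p) silently
# produces NaNs when the sum is zero; only the runtime warning is suppressed.
with np.errstate(invalid='ignore'):
    true_e_prob = det_prob / np.sum(det_prob)

print(np.isnan(true_e_prob).all())  # → True
```

An explicit guard, as suggested above, turns this silent NaN propagation into an immediate, diagnosable error.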

@chiarabellenghi (Member, Author)

I realized that it makes more sense to compute the detection probability normalized per bin, since that is more general.

I am still not sure whether arbitrary limits for the normalization range make sense, as this range should probably be the minimum and maximum energy given in the tabulated effective area.

In any case, for the moment I will simply loosen the tolerance of np.isclose in the signal energy PDF class.
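The per-bin normalization described above could be sketched as follows. This is a minimal stand-in, not the SkyLLH implementation: aeff, e_bin_edges, and the trapezoidal integration are illustrative placeholders for the tabulated effective-area values and the real per-bin integrals.

```python
import numpy as np

def detection_prob_per_bin(aeff, e_bin_edges):
    """Sketch: integrate tabulated effective-area values over each energy
    bin and renormalize so the per-bin detection probabilities sum to 1
    by construction (no ~1e-5 residual from the numerical integration)."""
    # Per-bin integrals via the trapezoidal rule (stand-in for the real
    # integration over effective area times source flux).
    bin_widths = np.diff(e_bin_edges)
    bin_integrals = 0.5 * (aeff[:-1] + aeff[1:]) * bin_widths

    # Renormalize: the sum is unity up to floating-point rounding.
    return bin_integrals / np.sum(bin_integrals)

# Illustrative values only.
aeff = np.array([0.1, 0.4, 0.9, 0.5, 0.2])
e_edges = np.linspace(2.0, 6.0, 5)  # e.g. log10(E/GeV) bin edges
p = detection_prob_per_bin(aeff, e_edges)
print(np.isclose(np.sum(p), 1.0))  # → True
```

Because the normalization is built in, the downstream unity check can only fail through genuine upstream errors, not integration round-off.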

@chiarabellenghi chiarabellenghi changed the title Simplify detection probability normalization for public data Suppress 1e-5 discrepancy warnings for detection probability Mar 31, 2026
@chiarabellenghi chiarabellenghi requested a review from Copilot March 31, 2026 15:34
@Copilot (Contributor) left a comment


Pull request overview

Copilot reviewed 1 out of 1 changed files in this pull request and generated 2 comments.



```diff
 self._logger.debug(f'det_prob = {det_prob}, sum = {np.sum(det_prob)}')

-if not np.isclose(np.sum(det_prob), 1):
+if not np.isclose(np.sum(det_prob), 1, rtol=1e-3, atol=0):
```

Copilot AI Mar 31, 2026


The new tolerance is very loose: with rtol=1e-3 this check will no longer warn even for a 0.1% normalization error (e.g. sum=0.999). If the intent is only to suppress ~1e-5 rounding discrepancies, consider tightening this to something like rtol=1e-4 (as suggested in #274) so genuinely mis-normalized inputs still trigger the warning.

Suggested change

```diff
-if not np.isclose(np.sum(det_prob), 1, rtol=1e-3, atol=0):
+if not np.isclose(np.sum(det_prob), 1, rtol=1e-4, atol=0):
```
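The tolerance trade-off Copilot raises can be checked directly with numpy (the 0.9995 value below is an illustrative mis-normalized sum, not taken from the codebase):

```python
import numpy as np

# np.isclose(a, b, rtol, atol) tests |a - b| <= atol + rtol * |b|.
# With rtol=1e-3, a 0.05% normalization error passes silently ...
print(np.isclose(0.9995, 1, rtol=1e-3, atol=0))    # → True
# ... while rtol=1e-4 still flags it ...
print(np.isclose(0.9995, 1, rtol=1e-4, atol=0))    # → False
# ... yet tolerates the ~1e-5 rounding drift this PR aims to suppress.
print(np.isclose(1 - 1e-5, 1, rtol=1e-4, atol=0))  # → True
```

So the choice of rtol is effectively a choice of which normalization errors are considered physical versus numerical noise.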

@chiarabellenghi (Member, Author)


I think numerical drifts smaller than 0.1% are fine for the analysis.

Comment on lines 286 to 289

```diff
 self._logger.debug(f'det_prob = {det_prob}, sum = {np.sum(det_prob)}')

-if not np.isclose(np.sum(det_prob), 1):
+if not np.isclose(np.sum(det_prob), 1, rtol=1e-3, atol=0):
     self._logger.warning(f'The sum of the detection probabilities is not unity! It is {np.sum(det_prob)}.')
```

Copilot AI Mar 31, 2026


np.sum(det_prob) is computed multiple times (debug log, isclose check, and warning). Consider storing the sum in a local variable and reusing it to keep the code DRY and ensure the logged value is exactly the one being checked.
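The refactor Copilot suggests could look like the sketch below. It is a standalone illustration using print in place of the class logger, with the rtol value from the merged diff; the det_prob array is made up:

```python
import numpy as np

# Compute the sum once and reuse it, so the logged value is exactly
# the one that was checked (DRY, and no repeated reductions).
det_prob = np.array([0.2, 0.3, 0.5001])  # illustrative values
sum_det_prob = np.sum(det_prob)

print(f'det_prob = {det_prob}, sum = {sum_det_prob}')
if not np.isclose(sum_det_prob, 1, rtol=1e-3, atol=0):
    print(f'The sum of the detection probabilities is not unity! '
          f'It is {sum_det_prob}.')
```

Besides avoiding redundant work, caching the sum rules out the (unlikely but confusing) case where the logged sum and the checked sum differ.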

@chiarabellenghi (Member, Author)


Done

@tomaskontrimas (Collaborator) left a comment


🏁 🚴

@chiarabellenghi chiarabellenghi merged commit 277ef68 into master Apr 1, 2026
7 checks passed
@chiarabellenghi chiarabellenghi deleted the fix/274-detprob-norm branch April 1, 2026 07:17


Development

Successfully merging this pull request may close these issues.

Detection probability normalization in public data energy PDF not close enough to unity

3 participants