
fix: prevent division by zero in loss when no targets #662

Open
Mr-Neutr0n wants to merge 1 commit into WongKinYiu:main from Mr-Neutr0n:fix/loss-division-by-zero

Conversation

@Mr-Neutr0n

Bug

When a batch contains no positive targets, the loss computation in ComputeLoss_NEW divides by zero (via .mean() on an empty tensor), producing NaN loss values. This can happen in two cases:

  1. all_loss is empty when no labels exist, causing zip(*all_loss) to fail
  2. The validity mask ij has no True entries, causing cat_loss[0][ij].mean() to return NaN (reproduced in the snippet below)
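
Case 2 is easy to reproduce in isolation: indexing with an all-False boolean mask selects an empty tensor, and PyTorch defines the mean of zero elements as 0/0, i.e. NaN. The names and shapes below are illustrative stand-ins, not the actual tensors from ComputeLoss_NEW:

```python
import torch

cat_loss0 = torch.rand(8)              # stand-in for cat_loss[0]
ij = torch.zeros(8, dtype=torch.bool)  # validity mask with no True entries

print(cat_loss0[ij].shape)   # torch.Size([0]) -- empty selection
print(cat_loss0[ij].mean())  # tensor(nan)    -- 0/0 inside the reduction
```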

Fix

Added guards to ensure:

  • The loss assignment block only runs when all_loss is non-empty
  • .mean() is only called when ij has at least one True value
  • Objectness loss is still computed against all-zero targets when no positive targets exist, preserving correct gradient flow (see the sketch after this list)
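
A minimal sketch of the guard pattern described above, not the exact patch: masked_mean is a hypothetical helper, and all_loss, pred_obj, and tobj are illustrative stand-ins for the corresponding tensors in the loss code.

```python
import torch
import torch.nn.functional as F

def masked_mean(values: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
    # Guarded .mean(): reduce only when the mask selects at least one
    # element; otherwise return a differentiable zero so the total loss
    # stays finite and the graph stays connected.
    if mask.any():
        return values[mask].mean()
    return values.sum() * 0.0  # zero with the same dtype/device and a grad path

# One image's worth of per-candidate losses and (all-zero) match scores.
all_loss = [(torch.rand(4, requires_grad=True), torch.zeros(4))]

lbox = torch.tensor(0.0)
if len(all_loss):  # guard 1: never call zip(*all_loss) on an empty list
    cat_loss = [torch.stack(t) for t in zip(*all_loss)]
    ij = cat_loss[1] > 0                 # validity mask: no True entries here
    lbox = masked_mean(cat_loss[0], ij)  # guard 2: 0.0 instead of NaN

# Objectness is always trained, against all-zero targets when there are no
# positives, so the detector still learns to suppress background.
pred_obj = torch.randn(8)
tobj = torch.zeros(8)
lobj = F.binary_cross_entropy_with_logits(pred_obj, tobj)

total = lbox + lobj
print(total)  # finite scalar, never NaN
```

Returning `values.sum() * 0.0` rather than a bare constant keeps the zero on the right device with the right dtype and leaves a (zero-gradient) path through the graph, which is what lets the objectness term carry the useful gradient for empty batches.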

