Negative values of train dice #4469
Hi,
Replies: 1 comment
This happens, for example, if you use the binary formulation of Dice with multi-class inputs: the classes after the first two have label values greater than one, so the subtraction in the loss can work out to be negative. It could also be that your prediction isn't being activated, with sigmoid if binary or with softmax if multi-class; these values must be activated before going into the loss function. Questions about Dice along these lines keep coming up, so I wrote a notebook on the loss and metric that explains some of the subtle issues: https://github.com/Project-MONAI/tutorials/blob/main/modules/dice_loss_metric_notes.ipynb I hope that helps.
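To illustrate the first cause, here is a minimal numpy sketch (not the MONAI implementation) of the binary soft Dice loss fed a multi-class label map. The function name and toy arrays are made up for the example; the point is that a label value of 2 inflates the intersection term, pushing the Dice score above 1 and the loss below 0:

```python
import numpy as np

def soft_dice_loss(pred, target, eps=1e-6):
    # Binary soft Dice loss: 1 - 2*|P.G| / (|P| + |G|).
    # Assumes pred and target are both in [0, 1] -- which is exactly
    # what a multi-class label map violates.
    inter = (pred * target).sum()
    return 1.0 - (2.0 * inter + eps) / (pred.sum() + target.sum() + eps)

# Multi-class label map with values {0, 1, 2} passed to the *binary* formulation.
target = np.array([0.0, 1.0, 2.0, 2.0])
# A sigmoid-activated "perfect" foreground prediction, all values in [0, 1].
pred = np.array([0.0, 1.0, 1.0, 1.0])

loss = soft_dice_loss(pred, target)
# inter = 5, pred.sum() = 3, target.sum() = 5 -> dice = 10/8 = 1.25, loss = -0.25
print(loss)  # negative, because the label-2 voxels are counted with weight 2
```

Converting the label map to one-hot channels (or restricting the binary formulation to a single foreground class) keeps every target value in {0, 1}, and the loss stays in [0, 1] as expected.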