DiceMetric gives weird results #7427
-
Hi! I am training a MONAI model and using DiceMetric for validation, but it either gives an error, or returns 1, 0, or numbers above 1. I use the loss like this: In the tutorials, "AsDiscreted" was a common post-processing step. I do not want to use MONAI transforms, so this is what I tried:
I included the commented lines because this, and different combinations of it, is what I tried.
Could you please explain how I should properly use DiceMetric?
-
Hi @franciskasara, could you please paste the error message?
-
Okay, I found the problem.
In the threshold version the problem was that
voutputs_bin = torch.nn.functional.threshold(voutputs_bin, threshold=0.5, value=1)
does not binarize the tensor. It sets everything at or below 0.5 to 1 and leaves the rest untouched.
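A minimal sketch of a thresholding that actually produces a 0/1 mask (the shape and the assumption that voutputs_bin holds sigmoid probabilities are just for illustration):

```python
import torch

# Stand-in for sigmoid probabilities with shape (batch, channel, H, W).
voutputs_bin = torch.rand(2, 1, 64, 64)

# F.threshold keeps values > 0.5 unchanged and only replaces values <= 0.5
# with `value`, so it never yields a clean 0/1 mask. A plain comparison does:
voutputs_bin = (voutputs_bin > 0.5).float()  # 1.0 where prob > 0.5, else 0.0
```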
In the argmax version the problem was that after
voutputs_bin = torch.argmax(voutputs_bin, dim=1)
I did not set the num_classes parameter in DiceMetric to 2, so it messed things up: "num_classes – number of input channels (always including the background). When this is None, y_pred.shape[1] will be used."
This is how it works properly (although the result is not going to be exactly 1 - DiceLoss):
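Roughly, the fixed argmax version looks like the sketch below (trimmed down, not my exact training code; the tensor names, shapes, and include_background=False are just illustrative choices for a binary 2D segmentation):

```python
import torch
from monai.metrics import DiceMetric

# num_classes=2 tells DiceMetric how many classes the index map refers to,
# since after argmax it can no longer read that from the channel dimension.
dice_metric = DiceMetric(include_background=False, reduction="mean", num_classes=2)

# Dummy stand-ins for one validation batch: 2-channel logits and an index label map.
voutputs = torch.randn(4, 2, 64, 64)           # network output, shape (B, 2, H, W)
vlabels = torch.randint(0, 2, (4, 1, 64, 64))  # ground-truth class indices, shape (B, 1, H, W)

# keepdim=True keeps the channel dimension DiceMetric expects, giving (B, 1, H, W) indices.
voutputs_bin = torch.argmax(voutputs, dim=1, keepdim=True)

dice_metric(y_pred=voutputs_bin, y=vlabels)  # accumulate scores for this batch
score = dice_metric.aggregate().item()       # mean foreground Dice over accumulated batches
dice_metric.reset()
print(score)
```

Once the prediction is collapsed to a single-channel index map, num_classes is the only way for DiceMetric to know how many classes there are, which is why leaving it at None gave the odd values.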