Understanding LocalNormalizedCrossCorrelationLoss #6645
-
Hello, I'm trying to understand how to use LocalNormalizedCrossCorrelationLoss. I would expect the cross-correlation of an image with itself to be 1, i.e. a loss of -1, but that is not what I get on my data.
-
Thanks. I've tested it; this example will return -1:

```python
import torch
from monai.losses import LocalNormalizedCrossCorrelationLoss

lncc_loss = LocalNormalizedCrossCorrelationLoss(kernel_size=3)
example_image = torch.randn(3, 2, 128, 128, 128, dtype=float)

# Loss of an image against itself: expect -1.
self_loss = lncc_loss(example_image, example_image)
print(self_loss)
```

But note lines 70 to 71 of MONAI/monai/losses/image_dissimilarity.py (at 6de86a4), where small smoothing constants come into play.
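Roughly, the per-voxel value computed there is of the form (a paraphrase of the implementation, not the exact code; `smooth_nr` and `smooth_dr` are the constructor arguments):

$$\mathrm{NCC} = \frac{\mathrm{cross}^2 + \mathrm{smooth\_nr}}{\sigma_{\mathrm{pred}}^2 \, \sigma_{\mathrm{target}}^2 + \mathrm{smooth\_dr}}$$

and the loss is the negative of this, averaged over the image. In windows where both local variances are zero (constant patches), the value is pushed towards 0 rather than being undefined, so a mostly constant image will not average out to exactly $-1$.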
-
Okay, that makes sense. Since an image with all constant values has zero variance, the LNCC there would be undefined, so a small smoothing constant has to be added. In my case, the image had a lot of background pixels that were constant, and this was ending up affecting the LNCC. If I compute the loss with `reduction="none"` and mask out the background pixels, the LNCC is indeed $-1$. Thanks for your help in understanding this :)
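For completeness, here is a minimal sketch of that masking workflow; the synthetic "background" region and the foreground mask below are made up purely for illustration:

```python
import torch
from monai.losses import LocalNormalizedCrossCorrelationLoss

# reduction="none" returns a per-voxel loss map instead of a single mean.
lncc_loss = LocalNormalizedCrossCorrelationLoss(kernel_size=3, reduction="none")

image = torch.randn(1, 1, 64, 64, 64, dtype=torch.float64)
image[..., :16] = 0.0  # hypothetical constant background slab

loss_map = lncc_loss(image, image)

# Average only over foreground voxels when reporting the loss.
foreground = image != 0.0
masked_loss = loss_map[foreground].mean()
print(masked_loss)  # expected to be close to -1 for an image compared with itself
```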