-
The last output layer in the DynUNet implementation uses a convolution + ADN (activation-dropout-normalization). The deep supervision outputs use only a convolution, which is what I would expect from the last layer too. Does anyone know why it's implemented this way?
Here's the output from torchsummary for a DynUNet with 2 deep supervision outputs:
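A minimal sketch of how such a network might be built and inspected; every parameter value below is an illustrative assumption, not taken from this post:

```python
import torch
from monai.networks.nets import DynUNet

# Illustrative configuration (assumed values): a 4-level 3D DynUNet
# with two deep-supervision outputs.
net = DynUNet(
    spatial_dims=3,
    in_channels=1,
    out_channels=3,
    kernel_size=[3, 3, 3, 3],
    strides=[1, 2, 2, 2],
    upsample_kernel_size=[2, 2, 2],
    deep_supervision=True,
    deep_supr_num=2,
)

# Compare the final output block with the deep-supervision heads directly
# from the module tree (torchsummary would list the same submodules).
print(net.output_block)            # conv (+ an "adn" module in the version discussed)
print(net.deep_supervision_heads)  # per the question: plain convolutions

# In train mode, the forward pass stacks the main output and the upsampled
# deep-supervision outputs along dim 1.
net.train()
out = net(torch.randn(1, 1, 32, 32, 32))
print(out.shape)  # e.g. torch.Size([1, 3, 3, 32, 32, 32])
```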
Replies: 1 comment
-
Hi @lidialuq, I think we didn't use act and norm in the deep_supervision_heads. The last output layer will still add an "adn" module, but without activation and normalization, when act and norm are set to None; see MONAI/monai/networks/blocks/dynunet_block.py, lines 260 to 261, and MONAI/monai/networks/blocks/convolutions.py, lines 158 to 171, both at ab800d8.
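In case it helps to see the shape of that logic, here is a paraphrased sketch in plain PyTorch (a simplified stand-in, not the actual MONAI source): with conv_only=False an "adn" wrapper is still appended, and with act=None and norm=None it ends up empty.

```python
import torch.nn as nn

def conv_block(in_ch, out_ch, act=None, norm=None, conv_only=False):
    """Simplified stand-in for MONAI's Convolution block (not the real source)."""
    block = nn.Sequential()
    block.add_module("conv", nn.Conv3d(in_ch, out_ch, kernel_size=1, bias=True))
    if conv_only:
        # Per the question, the deep-supervision heads behave like this
        # path: convolution only, no "adn" entry in the summary.
        return block
    # Otherwise an "adn" wrapper is appended even when act and norm are
    # None, in which case it contains no layers and is effectively a no-op.
    adn = nn.Sequential()
    if norm is not None:
        adn.add_module("N", nn.InstanceNorm3d(out_ch))
    if act is not None:
        adn.add_module("A", nn.PReLU())
    block.add_module("adn", adn)
    return block

# The output block's configuration: act=None, norm=None, conv_only=False,
# so the "adn" entry appears in torchsummary but does nothing.
print(conv_block(32, 3, act=None, norm=None, conv_only=False))
```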
Hope it can help you, thanks!