Hi, in `monai.networks.nets.DynUNet` with deep supervision enabled, how can I retrieve the final feature map, i.e. the one that does not come from a deep supervision head? Thanks beforehand!
Answered by YerePhy, Jul 12, 2022
If I am not overlooking anything, the `forward` method of `monai.networks.nets.DynUNet` gives the answer:

```python
def forward(self, x):
    out = self.skip_layers(x)
    out = self.output_block(out)
    if self.training and self.deep_supervision:
        out_all = [out]
        for feature_map in self.heads:
            # upsample each deep supervision map to the spatial size of the final map
            out_all.append(interpolate(feature_map, out.shape[2:]))
        return torch.stack(out_all, dim=1)
    return out
```

The final feature map (the one that does not come from a deep supervision head) is the 0th entry along dim 1 of the stacked output, so you can retrieve it with:

```python
from typing import Dict

import torch
from monai.networks.nets import DynUNet

your_cfg: Dict              # your configuration for DynUNet
input_tensor: torch.Tensor  # input tensor for DynUNet with shape NCHW[D]

dynunet = DynUNet(**your_cfg)
final_label_map = dynunet(input_tensor).select(1, 0)
```

Note that the stacked output only exists in training mode with `deep_supervision=True`; as the `forward` code above shows, in eval mode the final feature map is returned directly.
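For reference, here is a minimal, self-contained sketch that demonstrates the shapes involved. The configuration values (kernel sizes, strides, `deep_supr_num`, input size) are hypothetical, chosen only to be small enough to run; adapt them to your own task:

```python
import torch
from monai.networks.nets import DynUNet

# Hypothetical small 2D config, just for demonstration.
net = DynUNet(
    spatial_dims=2,
    in_channels=1,
    out_channels=3,
    kernel_size=[3, 3, 3, 3],
    strides=[1, 2, 2, 2],
    upsample_kernel_size=[2, 2, 2],
    deep_supervision=True,
    deep_supr_num=2,
)

x = torch.randn(1, 1, 64, 64)  # NCHW

# Training mode: the deep supervision maps are stacked along dim 1.
net.train()
out = net(x)
print(out.shape)  # torch.Size([1, 3, 3, 64, 64]) -> (N, 1 + deep_supr_num, C, H, W)
final_map = out.select(1, 0)  # the map that does not come from a supervision head
print(final_map.shape)  # torch.Size([1, 3, 64, 64])

# Eval mode: only the final map is returned, no stacking.
net.eval()
with torch.no_grad():
    print(net(x).shape)  # torch.Size([1, 3, 64, 64])
```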