Commit 5fa1d32: [+] update nnUNet.md (1 parent 06d0e93)


nnUNet/nnUNet.md (22 additions, 0 deletions)
### Preprocessing

The preprocessing pipeline follows the official nnU-Net v2 workflow (reorientation, resampling, intensity normalisation, etc.).
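As one example of what this pipeline does, nnU-Net z-scores the intensities of MR-like images per image. The following pure-Python sketch is illustrative only, not the actual nnU-Net code:

```python
import math

def zscore_normalise(intensities):
    """Shift a list of intensities to zero mean and unit variance.

    Illustrative stand-in for nnU-Net's per-image intensity
    normalisation; the real implementation operates on arrays.
    """
    n = len(intensities)
    mean = sum(intensities) / n
    var = sum((v - mean) ** 2 for v in intensities) / n
    std = math.sqrt(var) or 1.0  # guard against constant images
    return [(v - mean) / std for v in intensities]
```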
### Augmentations

PSAT introduces a contraction-based augmentation strategy to expose the network to stronger anatomical scaling. This is achieved by modifying the `SpatialTransform` in nnU-Net so that random scaling factors are sampled from a wider range `(0.7, 2.5)` rather than the default `(0.7, 1.4)`. Mirroring is disabled for the experiments.
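The core of the change can be sketched as widening the interval the scale factor is drawn from. This is a minimal illustration, not the actual `SpatialTransform` code; the exact parameter wiring inside nnU-Net may differ:

```python
import random

# Default nnU-Net scale range vs. the widened "contraction" range
# used by PSAT (values taken from the text above).
DEFAULT_SCALE_RANGE = (0.7, 1.4)
CONTRACTION_SCALE_RANGE = (0.7, 2.5)

def sample_scale(scale_range=CONTRACTION_SCALE_RANGE, rng=random):
    """Draw one random scaling factor for spatial augmentation."""
    lo, hi = scale_range
    return rng.uniform(lo, hi)
```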
### Training

Several custom trainers are provided to support different experimental setups. `nnUNetTrainer_400epochs_NoMirroring_Finetune_lr_1e4` and `nnUNetTrainer_200epochs_NoMirroring_Finetune_lr_1e4` perform fine-tuning with a lower learning rate and without mirroring. The `nnUNetTrainer_1000epochs_NoMirroring_contraction` variants train from scratch for 1000 epochs while applying the contraction augmentation described above.

All trainers inherit from `nnUNetTrainer` and thus retain full compatibility with the nnU-Net v2 command line tools.
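A trainer variant of this kind could look roughly like the sketch below. The base class here is a self-contained stub standing in for nnU-Net v2's real `nnUNetTrainer` (which does define `num_epochs` and `initial_lr` attributes, though its constructor signature differs); the subclass body is an assumption about how the named trainer is likely implemented, not the actual PSAT code:

```python
class nnUNetTrainer:
    """Stub standing in for nnunetv2's real trainer (illustration only)."""
    def __init__(self):
        self.num_epochs = 1000
        self.initial_lr = 1e-2
        self.allowed_mirroring_axes = (0, 1, 2)

class nnUNetTrainer_400epochs_NoMirroring_Finetune_lr_1e4(nnUNetTrainer):
    """Fine-tuning variant: fewer epochs, lower LR, no mirroring.

    Overrides only the hyperparameters its name encodes; everything
    else is inherited, which is what keeps such variants compatible
    with the nnU-Net v2 tooling.
    """
    def __init__(self):
        super().__init__()
        self.num_epochs = 400
        self.initial_lr = 1e-4
        self.allowed_mirroring_axes = None  # disable mirroring
```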
Place the trainer scripts in the corresponding folders of your nnUNetv2 library.
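Once the scripts are in place, a custom trainer is selected by name through the standard `nnUNetv2_train` entry point and its `-tr` flag. The dataset id, configuration, and fold below are placeholders:

```shell
# Placeholder dataset id (001), configuration (3d_fullres) and fold (0);
# -tr selects the custom trainer class by name.
nnUNetv2_train 001 3d_fullres 0 -tr nnUNetTrainer_1000epochs_NoMirroring_contraction
```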
