This repository supports training a classification network on histology datasets and explores several tricks based on stain variation.
Run `main.py` to train the models.
Tune all the hyper-parameters in `config.yaml`.
- `train_root`: Path to the training set.
- `test_root`: Path to the test set.
- `output_path`: Path to the output. Output files are exported to a folder created under `output_path` whose name starts with the date, so there is no risk of overwriting previous runs.
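A minimal `config.yaml` might look like the following sketch. The paths are placeholders, and any keys beyond these three depend on the repository's actual config:

```yaml
# Hypothetical values; replace with your own paths.
train_root: /path/to/training_set
test_root: /path/to/test_set
output_path: ./outputs   # a dated subfolder is created here per run
```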
Datasets can be downloaded using `preprocess/download.py`.
- Normalized-v1: Stain normalized with template `NORM-AAAWMSFI.tif` (from training set).
- Normalized-v2: Stain normalized with template `STR-AAEILWWE.tif` (from training set).
- Normalized-v3: Stain normalized with template `NORM-TCGA-AASSYQPA.tif` (from test set).
Generally, the code does not depend on any special libraries.
If the datasets are not on your device, use `download.py` to download the zip files, then unzip them in the terminal.
- First, change the hyper-parameters in `config.yaml`, e.g., point `train_root` to the training set, `test_root` to the validation set, and `output_path` to the output path where logs and checkpoints are saved.
- To train the model, simply run:
```
python main.py
```
- `LabPreNorm`: Learnable normalization parameters (i.e., channel mean and channel std) of the template in LAB color space, using Reinhard's normalization method.
- `LabEMAPreNorm`: Uses EMA to update the normalization template. Hyper-parameter: `lambda`. When `lambda=0`, it degenerates to vanilla Reinhard normalization; when `lambda=1`, it degenerates to a special case of `LabRandNorm`.
- `LabRandNorm`: Randomly selects a template in each mini-batch, and uses Reinhard's normalization method.
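The core shared by these variants is Reinhard-style statistic matching in LAB space, plus an EMA update of the template statistics. The sketch below is a minimal NumPy illustration under assumed function names and the conventional EMA form, not the repository's exact implementation:

```python
import numpy as np

def reinhard_normalize(lab_img, tpl_mean, tpl_std, eps=1e-6):
    """Match the per-channel mean/std of a LAB image (H, W, 3)
    to a template's channel statistics (Reinhard's method)."""
    mean = lab_img.mean(axis=(0, 1))   # per-channel mean, shape (3,)
    std = lab_img.std(axis=(0, 1))     # per-channel std, shape (3,)
    return (lab_img - mean) / (std + eps) * tpl_std + tpl_mean

def ema_update(tpl_mean, tpl_std, batch_mean, batch_std, lam):
    """EMA template update. lam=0 keeps the fixed template (vanilla
    Reinhard); lam=1 replaces it with the current batch statistics,
    behaving like a randomly refreshed template (RandNorm-like)."""
    new_mean = (1 - lam) * tpl_mean + lam * batch_mean
    new_std = (1 - lam) * tpl_std + lam * batch_std
    return new_mean, new_std
```

After `reinhard_normalize`, the image's channel means equal the template's exactly and the stds match up to the `eps` stabilizer.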
| ResNet-18 | w/o Pretrain | w/ Pretrain |
|---|---|---|
| w/o Norm | 64.958 | 58.788 |
| w/ Norm v1 | 78.914 | 78.106 |
| w/ Norm v3 | 89.624 | 89.262 |
| w/ RandNorm | 88.454 | |
| w/ PreNorm | 92.549 | |
| w/ EMAPreNorm (lambda=0) | 91.504 | |