1. Here we use resnet18 for finetuning and update all the layers. Open the configuration file `config/train_test_ce1.cfg`. In the `network` section we can find the details of the network, where `update_mode = all` means updating all the layers.

```bash
# type of network
net_type = resnet18
pretrain = True
input_chns = 3
# finetune all the layers
update_mode = all
```

Then start to train by running:
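```bash
pymic_run train config/train_test_ce1.cfg
```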

2. Run the following commands for testing and evaluation:

```bash
pymic_run test config/train_test_ce1.cfg
pymic_eval_cls config/evaluation.cfg
```

The obtained accuracy with the default setting should be around 0.9477, and the AUC should be around 0.9745.

3. Run `python show_roc.py` to show the receiver operating characteristic curve.

## Finetuning the last layer of resnet18
Similar to the above example, we further try to finetune only the last layer of resnet18 for the same classification task. Use a different configuration file `config/train_test_ce2.cfg` for training and testing, where `update_mode = last` in the `network` section means updating the last layer only:

```bash
net_type = resnet18
pretrain = True
input_chns = 3
# finetune the last layer only
update_mode = last
```

Edit `config/evaluation.cfg` accordingly for evaluation. The corresponding accuracy and AUC would be around 0.9477 and 0.9778, respectively.
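
For background, `update_mode = all` trains every parameter of the pretrained network, while `update_mode = last` freezes the pretrained backbone and trains only the newly added classification layer. A minimal PyTorch sketch of the latter, independent of how PyMIC implements it internally (the 2-class output size matches this example):

```python
import torch
import torch.nn as nn
from torchvision import models

# Load an ImageNet-pretrained resnet18 (corresponds to pretrain = True).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze all pretrained parameters (roughly what "update the last layer only" means).
for param in model.parameters():
    param.requires_grad = False

# Replace the final fully connected layer with a new 2-class head;
# the new layer's parameters are trainable by default.
model.fc = nn.Linear(model.fc.in_features, 2)

# Give only the trainable parameters to the optimizer.
optimizer = torch.optim.Adam(
    [p for p in model.parameters() if p.requires_grad], lr=1e-3)
```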
classification/CHNCXR/README.md

The `network` section of `config/net_resnet18.cfg` also finetunes resnet18 with all the layers updated:

```bash
net_type = resnet18
pretrain = True
input_chns = 3
# finetune all the layers
update_mode = all
```

Start to train by running:
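```bash
pymic_run train config/net_resnet18.cfg
```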

2. Run the following commands for testing and evaluation:

```bash
pymic_run test config/net_resnet18.cfg
pymic_eval_cls config/evaluation.cfg
```

The obtained accuracy with the default setting should be around 0.8271, and the AUC around 0.9343.

3. Run `python show_roc.py` to show the receiver operating characteristic curve.

## Finetuning vgg16
Similar to the above example, we further try to finetune vgg16 for the same classification task. Use a different configuration file `config/net_vg16.cfg` for training and testing. Edit `config/evaluation.cfg` accordingly for evaluation. The accuracy and AUC would be around 0.8571 and 0.9271, respectively.
### CLSLSR

The CLSLSR method estimates errors in the original noisy labels and obtains pixel-level weight maps based on an initial model, and then uses the weight maps to suppress noise in a standard supervised learning procedure. Assuming that the initial model is the baseline method, run the following command to obtain the weight maps:
```bash
python clslsr_get_confidence.py config/unet_ce.cfg
```

The weight maps will be saved in `$root_dir/slsr_conf`. Then train the new model and run inference with:
```bash
pymic_run train config/unet_clslsr.cfg
pymic_run test config/unet_clslsr.cfg
```

Note that the weight maps for the training images are specified in the configuration file via `train_csv = config/data/jsrt_train_mix_clslsr.csv`.
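
Conceptually, the weight maps act as pixel-wise weights in the supervised loss, down-weighting pixels whose labels are likely to be wrong. A minimal sketch of such a pixel-weighted cross-entropy in PyTorch, for illustration only and not PyMIC's exact implementation:

```python
import torch
import torch.nn.functional as F

def weighted_ce_loss(logits, target, pixel_weight):
    """Cross-entropy where each pixel's contribution is scaled by a weight map.

    logits: (N, C, H, W) raw network outputs
    target: (N, H, W) integer labels (possibly noisy)
    pixel_weight: (N, H, W) confidence weights, low for suspected label errors
    """
    loss = F.cross_entropy(logits, target, reduction="none")  # (N, H, W)
    return (loss * pixel_weight).sum() / pixel_weight.sum().clamp(min=1e-8)
```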
### Co-Teaching
The configuration file for Co-Teaching is `config/unet2d_cot.cfg`, which contains the corresponding settings for this method.
Use `pymic_eval_seg config/evaluation.cfg` for quantitative evaluation of the segmentation results. You need to edit `config/evaluation.cfg` first so that it points to the results you want to evaluate.
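
For reference, segmentation quality in examples like this is commonly summarized with the Dice score; a minimal sketch of how Dice can be computed for a pair of binary masks, illustrative only and not PyMIC's implementation:

```python
import numpy as np

def binary_dice(pred: np.ndarray, gt: np.ndarray) -> float:
    """Dice score between two binary masks of the same shape."""
    pred = pred.astype(bool)
    gt = gt.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    denominator = pred.sum() + gt.sum()
    return 2.0 * intersection / denominator if denominator > 0 else 1.0
```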