Uncertainty Driven Adaptive Self-Knowledge Distillation for Medical Image Segmentation

Introduction

Deep learning has made great progress in medical image segmentation. However, the labels in the training set are usually hard labels (i.e., one-hot vectors), which can easily lead to overfitting. To mitigate this problem, we propose an uncertainty-driven adaptive self-knowledge distillation (UAKD) model for medical image segmentation, which regularizes training with soft labels that the model generates for itself. UAKD incorporates uncertainty estimation into the self-distillation framework and uses an ensemble of teacher networks to mitigate semantic errors in the estimated soft labels caused by the fitting biases of individual teachers. We further propose an adaptive distillation mechanism that exploits class-level uncertainty awareness to transfer knowledge from the teacher networks to the student network more effectively. Finally, we introduce a cyclic ensemble method based on gradient ascent to estimate uncertainty; it improves the performance of UAKD compared to Monte Carlo dropout and significantly reduces computational cost compared to traditional deep ensembles.
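The repository does not document the loss itself, so the following is a minimal PyTorch sketch of the idea described above, not the authors' implementation: an ensemble of teacher snapshots produces averaged soft labels, the predictive entropy of the ensemble mean serves as an uncertainty estimate, and the distillation term is down-weighted where the teachers are uncertain. The function names and the per-pixel (rather than class-level) weighting are assumptions for illustration.

```python
import torch
import torch.nn.functional as F

def ensemble_soft_labels(teacher_logits_list, T=2.5):
    # Average temperature-softened predictions from several teacher
    # snapshots to form soft labels of shape (B, C, H, W).
    probs = torch.stack([F.softmax(logits / T, dim=1) for logits in teacher_logits_list])
    return probs.mean(dim=0)

def pixel_uncertainty(soft_labels, eps=1e-8):
    # Normalized predictive entropy of the ensemble mean, in [0, 1],
    # as a simple per-pixel uncertainty estimate; shape (B, H, W).
    num_classes = soft_labels.shape[1]
    entropy = -(soft_labels * (soft_labels + eps).log()).sum(dim=1)
    return entropy / torch.log(torch.tensor(float(num_classes)))

def uakd_loss(student_logits, soft_labels, uncertainty, hard_labels, T=2.5, alpha=0.5):
    # Hard-label cross-entropy plus an uncertainty-weighted distillation
    # term; pixels where the teachers disagree contribute less to the KL.
    ce = F.cross_entropy(student_logits, hard_labels)
    log_p = F.log_softmax(student_logits / T, dim=1)
    kl_per_pixel = F.kl_div(log_p, soft_labels, reduction="none").sum(dim=1)
    weight = 1.0 - uncertainty
    kd = (weight * kl_per_pixel).mean() * (T * T)
    return (1.0 - alpha) * ce + alpha * kd
```

The cyclic ensemble is described only at a high level; one plausible reading, sketched purely under that assumption, alternates a few gradient-ascent steps (to push the weights out of the current minimum) with ordinary descent, saving a snapshot after each re-convergence so the snapshots can act as the teacher ensemble:

```python
def collect_cyclic_snapshots(model, loader, loss_fn, optimizer,
                             n_snapshots=4, ascent_steps=5, descent_epochs=1):
    # Hypothetical sketch of a gradient-ascent cyclic ensemble:
    # perturb, re-train, snapshot, repeat.
    snapshots = []
    for _ in range(n_snapshots):
        for step, (x, y) in enumerate(loader):
            if step >= ascent_steps:
                break
            optimizer.zero_grad()
            (-loss_fn(model(x), y)).backward()  # ascend the loss
            optimizer.step()
        for _ in range(descent_epochs):
            for x, y in loader:
                optimizer.zero_grad()
                loss_fn(model(x), y).backward()  # ordinary descent
                optimizer.step()
        snapshots.append({k: v.detach().clone() for k, v in model.state_dict().items()})
    return snapshots
```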

Framework

Figure: Overview of the UAKD framework.


Run Code

  1. train

python main.py --patch_size 12 --in_channels 1 --T 2.5 --labels 2 # Setting Training Parameters

  2. test

python test.py --patch_size 12 --in_channels 1 --T 2.5 --labels 2 # Setting Testing Parameters
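The flags are not documented in the repository; a hedged reading is that --patch_size sets the input patch size, --in_channels the number of image channels (1 for grayscale scans), --T the distillation temperature, and --labels the number of segmentation classes. A minimal argparse setup consistent with the commands above, purely as an illustration, might be:

```python
import argparse

def parse_args():
    # Hypothetical reconstruction of the command-line interface; the
    # actual main.py/test.py may define these flags differently.
    parser = argparse.ArgumentParser(description="UAKD options (sketch)")
    parser.add_argument("--patch_size", type=int, default=12,
                        help="input patch size (assumed)")
    parser.add_argument("--in_channels", type=int, default=1,
                        help="number of image channels (assumed)")
    parser.add_argument("--T", type=float, default=2.5,
                        help="distillation temperature (assumed)")
    parser.add_argument("--labels", type=int, default=2,
                        help="number of segmentation classes (assumed)")
    return parser.parse_args()
```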
