Swapped Logit Distillation via Bi-Level Teacher Alignment

Abstract


Knowledge distillation (KD) compresses a network by transferring knowledge from a large (teacher) network to a smaller (student) one. In the mainstream approach, the teacher transfers knowledge to the student directly through its original output distribution, which may contain incorrect predictions. In this article, we propose a logit-based distillation via swapped logit processing, namely Swapped Logit Distillation (SLD). SLD is built on two assumptions: (1) a prediction is wrong when the confidence of the target label is not the maximum; (2) the “natural” upper limit of the probability remains uncertain, since the best value to add to the target cannot be determined. To address these issues, we propose a swapped logit processing scheme. Through this approach, we find that the swap method can be effectively extended to the teacher and student outputs, turning them into two teachers. We further introduce loss scheduling to boost the performance of the two teachers' alignment. Extensive experiments on image classification tasks demonstrate that SLD consistently outperforms previous state-of-the-art methods.
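
For intuition, below is a minimal PyTorch sketch of the swap idea described in the abstract: when the most confident class differs from the ground-truth label, the two logit values are exchanged so that the target class becomes the maximum, and applying the same operation to the student's output yields the second teacher. All names (swap_logits, sld_kd_loss, alpha, temperature) are illustrative placeholders rather than identifiers from this repository, and the scheduling weight is simplified; refer to the paper and the repository code for the actual implementation.

import torch
import torch.nn.functional as F

def swap_logits(logits, target):
    # Swap the maximum logit with the target-class logit so that the target
    # class becomes the most confident one; if the prediction is already
    # correct, the swap is a no-op.
    swapped = logits.clone()
    rows = torch.arange(logits.size(0), device=logits.device)
    pred = logits.argmax(dim=1)
    max_val = swapped[rows, pred].clone()
    tgt_val = swapped[rows, target].clone()
    swapped[rows, pred] = tgt_val
    swapped[rows, target] = max_val
    return swapped

def sld_kd_loss(student_logits, teacher_logits, target, temperature=4.0, alpha=0.5):
    # Illustrative bi-level loss: the student matches (1) the swapped teacher
    # distribution and (2) its own swapped distribution used as a second
    # teacher. `alpha` stands in for the loss-scheduling weight mentioned in
    # the abstract; the actual schedule in the paper may differ.
    t_swapped = swap_logits(teacher_logits.detach(), target)
    s_swapped = swap_logits(student_logits.detach(), target)
    log_p_s = F.log_softmax(student_logits / temperature, dim=1)
    kd_teacher = F.kl_div(log_p_s, F.softmax(t_swapped / temperature, dim=1),
                          reduction="batchmean") * temperature ** 2
    kd_self = F.kl_div(log_p_s, F.softmax(s_swapped / temperature, dim=1),
                       reduction="batchmean") * temperature ** 2
    return kd_teacher + alpha * kd_self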

Installation

Environment:

  • Python 3.8
  • PyTorch 1.7.0

Install the package:

sudo pip3 install -r requirements.txt
sudo python3 setup.py develop
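
Optionally, a quick sanity check confirms that PyTorch is importable and reports its version and CUDA availability (this is a generic check, not a script shipped with this repository):

python3 -c "import torch; print(torch.__version__, torch.cuda.is_available())"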

SLD Framework

CIFAR-100

Training on ImageNet

  • Download the dataset from https://image-net.org/ and put it in ./data/imagenet

    python3 tools/train_ours.py --cfg configs/imagenet/r34_r18/sld.yaml

Acknowledgement

The code is built on mdistiller and mlkd; thanks to the authors of these codebases for their contributions.

Contact

Stephen: stephenekaputra@gmail.com

Citation

If this repo is helpful for your research, please consider citing the paper:

@article{limantoro2025sld,
  title={Swapped logit distillation via bi-level teacher alignment},
  author={Limantoro, Stephen Ekaputra and Lin, Jhe-Hao and Wang, Chih-Yu and Tsai, Yi-Lung and Shuai, Hong-Han and Huang, Ching-Chun and Cheng, Wen-Huang},
  journal={Multimedia Systems},
  volume={31},
  number={3},
  pages={264},
  year={2025},
  publisher={Springer}
}
