Di4C: Distillation of Discrete Diffusion through Dimensional Correlations [ICML 2025]

This repository contains the code used in the paper "Distillation of Discrete Diffusion through Dimensional Correlations".

This repository is organized as follows (Section numbers follow the arXiv version):

  • tauldr/ contains the code for Section 5.1, which is based on tauLDR.
  • maskgit-pytorch/ contains the code for Section 5.2, which is based on MaskGIT-pytorch.
  • sdtt/ contains the code for Section 5.3, which is based on SDTT.

In each directory, we provide an implementation of mixture modeling on top of the teacher model, together with the Di4C training and inference scripts.
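As a rough intuition for the mixture modeling above (this is a toy NumPy sketch, not the repository's implementation; all names and shapes here are illustrative): a single product model factorizes the joint distribution over dimensions and so cannot represent dimensional correlations, while a weighted mixture of L product distributions can.

```python
import numpy as np
from itertools import product

# Illustrative sizes (not from the repo): L mixture components,
# D dimensions (token positions), V vocabulary size.
rng = np.random.default_rng(0)
L, D, V = 4, 3, 5

w = rng.dirichlet(np.ones(L))               # mixture weights, sum to 1
p = rng.dirichlet(np.ones(V), size=(L, D))  # p[l, d, :] = categorical dist
                                            # over dimension d in component l

def mixture_prob(x):
    """Joint probability of token sequence x (length D) under the mixture:
    p(x) = sum_l w[l] * prod_d p[l, d, x[d]]."""
    per_component = np.prod(p[:, np.arange(D), x], axis=1)  # shape (L,)
    return float(w @ per_component)

# Sanity check: the mixture is a valid joint distribution over all
# V**D sequences, yet (for L > 1) it is generally not a product of
# its own marginals, i.e. it encodes dimensional correlations.
total = sum(mixture_prob(np.array(x)) for x in product(range(V), repeat=D))
```

Each component is still a cheap factorized distribution, so sampling stays parallel across dimensions; the correlations come only from mixing over the latent component index.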

Model checkpoints

The Di4C-distilled model checkpoints are available on Zenodo as follows:

  • tldr-di4c.pt is the student model in Section 5.1 (Table 1).
  • maskgit-di4c-d.pth is the di4c-d model in Section 5.2 (Figure 3).
  • sdtt6-di4c2.ckpt is the sdtt-6 + di4c^2 model in Section 5.3 (Figure 4).
  • sdtt7-di4c2.ckpt is the sdtt-7 + di4c^2 model in Section 5.3 (Figure 4).

Citation

@inproceedings{hayakawa2025distillation,
  title={Distillation of Discrete Diffusion through Dimensional Correlations},
  author={Hayakawa, Satoshi and Takida, Yuhta and Imaizumi, Masaaki and Wakaki, Hiromi and Mitsufuji, Yuki},
  booktitle={Proceedings of the 42nd International Conference on Machine Learning},
  pages={22259--22297},
  year={2025}
}