This repository is the official implementation of our paper “BiDM: Pushing the Limit of Quantization for Diffusion Models”. [PDF]
Set up a virtual environment and install the dependencies as described in latent-diffusion.
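The environment setup can be sketched as follows. This assumes the upstream latent-diffusion repository, which ships an `environment.yaml` whose conda environment is named `ldm`; adjust paths and names to your checkout.

```shell
# Hedged sketch: clone the upstream LDM codebase and create its conda env.
# The env name "ldm" comes from latent-diffusion's environment.yaml.
git clone https://github.com/CompVis/latent-diffusion.git
cd latent-diffusion
conda env create -f environment.yaml
conda activate ldm
```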
- Replace the existing `main.py` in the LDM codebase with our version of `main.py`.
- Place `openaimodel_ours.py` and `util_ours.py` in the directory `./ldm/modules/diffusionmodules`.
- Place `ddpm_ours.py` and `ddim_ours.py` in the directory `./ldm/models/diffusion`.
- Run `bash train.sh`.
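The file-placement steps above can be sketched as the following commands, run from the root of the latent-diffusion checkout. `BIDM_DIR` is a hypothetical path to where this repository's files live; adjust it to your layout.

```shell
# Hedged sketch of the setup steps; BIDM_DIR is a hypothetical path
# to this repository's files, not a name defined by the project.
BIDM_DIR=../BiDM

# Replace the stock LDM entry point with our version.
cp "$BIDM_DIR/main.py" ./main.py

# Drop in the binarized UNet modules.
cp "$BIDM_DIR/openaimodel_ours.py" "$BIDM_DIR/util_ours.py" ./ldm/modules/diffusionmodules/

# Drop in the modified diffusion / sampling code.
cp "$BIDM_DIR/ddpm_ours.py" "$BIDM_DIR/ddim_ours.py" ./ldm/models/diffusion/

# Launch training.
bash train.sh
```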
- Results for LDM on unconditional generation using DDIM with 100 steps.
- Samples generated by the binarized DM baseline and BiDM under W1A1 bit-width.
- Our codebase builds on latent-diffusion and stable-diffusion. Thanks for open-sourcing!
If you find BiDM useful and helpful to your work, please kindly cite this paper:
```bibtex
@inproceedings{zhengbidm,
  title={BiDM: Pushing the Limit of Quantization for Diffusion Models},
  author={Zheng, Xingyu and Liu, Xianglong and Bian, Yichen and Ma, Xudong and Zhang, Yulun and Wang, Jiakai and Guo, Jinyang and Qin, Haotong},
  booktitle={The Thirty-eighth Annual Conference on Neural Information Processing Systems}
}
```


