- [2025/01] Our PyPI package is officially released! Users can now install DD-Ranking via `pip install ddranking` (see the quick check below).
- [2024/12/28] We officially released DD-Ranking! DD-Ranking provides a new benchmark that decouples the impacts of knowledge distillation and data augmentation.

</details>
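
As a quick post-install check, here is a minimal sketch. It assumes the PyPI package `ddranking` installs under the same import name; DD-Ranking's actual API is not shown or implied here.

```python
# Install the released package first:
#   pip install ddranking

# Assumption: the import name matches the PyPI package name "ddranking".
import ddranking

# importlib.metadata reads the installed distribution's version, so this
# works even if the package does not define __version__ itself.
from importlib.metadata import version

print("DD-Ranking version:", version("ddranking"))
```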
---

Please check out [CONTRIBUTING.md](./CONTRIBUTING.md) for how to get involved.

DD-Ranking is released under the MIT License. See [LICENSE](./LICENSE) for more details.
## Related Works

- [Dataset Distillation](https://arxiv.org/abs/1811.10959), Wang et al., arXiv 2018.
- [Dataset Condensation with Gradient Matching](https://arxiv.org/abs/2006.05929), Zhao et al., ICLR 2020.
- [Dataset Condensation with Differentiable Siamese Augmentation](https://arxiv.org/abs/2102.08259), Zhao & Bilen, ICML 2021.
- [Dataset Distillation via Matching Training Trajectories](https://arxiv.org/abs/2203.11932), Cazenavette et al., CVPR 2022.
- [Dataset Distillation with Distribution Matching](https://arxiv.org/abs/2110.04181), Zhao & Bilen, WACV 2023.
- [Squeeze, Recover and Relabel: Dataset Condensation at ImageNet Scale From A New Perspective](https://arxiv.org/abs/2306.13092), Yin et al., NeurIPS 2023.
- [Towards Lossless Dataset Distillation via Difficulty-Aligned Trajectory Matching](https://arxiv.org/abs/2310.05773), Guo et al., ICLR 2024.
- [On the Diversity and Realism of Distilled Dataset: An Efficient Dataset Distillation Paradigm](https://arxiv.org/abs/2312.03526), Sun et al., CVPR 2024.
- [D4M: Dataset Distillation via Disentangled Diffusion Model](https://arxiv.org/abs/2407.15138), Su et al., CVPR 2024.

## Reference
If you find DD-Ranking useful in your research, please consider citing the following paper:

```bibtex
@misc{li2024ddranking,
  title  = {DD-Ranking: Rethinking the Evaluation of Dataset Distillation},
  author = {Li, Zekai and Zhong, Xinhao and Liang, Zhiyuan and Zhou, Yuhao and Shi, Mingjia and Wang, Ziqiao and Zhao, Wangbo and Zhao, Xuanlei and Wang, Haonan and Qin, Ziheng and Liu, Dai and Zhang, Kaipeng and Zhou, Tianyi and Zhu, Zheng and Wang, Kun and Li, Guang and Zhang, Junhao and Liu, Jiawei and Huang, Yiran and Lyu, Lingjuan and Lv, Jiancheng and Jin, Yaochu and Akata, Zeynep and Gu, Jindong and Vedantam, Rama and Shou, Mike and Deng, Zhiwei and Yan, Yan and Shang, Yuzhang and Cazenavette, George and Wu, Xindi and Cui, Justin and Chen, Tianlong and Yao, Angela and Kellis, Manolis and Plataniotis, Konstantinos N. and Zhao, Bo and Wang, Zhangyang and You, Yang and Wang, Kai},
  year   = {2024}
}
```

**Community Discussions**: Engage with other users on <span style="color: #ff0000;">[TODO]:</span>[Discord]() for discussions.

**Coordination of Contributions and Development**: Use <span style="color: #ff0000;">[TODO]:</span>[Slack]() for coordinating contributions and managing development efforts.

**Collaborations and Partnerships**: For exploring collaborations or partnerships, reach out via <span style="color: #ff0000;">[TODO]:</span>[email]().

**Technical Queries and Feature Requests**: Use GitHub issues or discussions for technical questions and feature requests.