
Commit 61a4cb1

Update README.md
1 parent d4a33c1 commit 61a4cb1

File tree

1 file changed: +15 −1 lines changed

README.md

Lines changed: 15 additions & 1 deletion
@@ -41,6 +41,8 @@
 
 
 ## News
+🔥🔥 [2023/11/07] [MFTCoder Paper](https://arxiv.org/abs/2311.02303) has been released on arXiv, introducing the technical details of multi-task fine-tuning.
+
 🔥🔥 [2023/10/20] [CodeFuse-QWen-14B](https://huggingface.co/codefuse-ai/CodeFuse-QWen-14B) has been released, achieving a pass@1 (greedy decoding) score of 48.8% on HumanEval, a 16% absolute improvement over the base model [Qwen-14b](https://huggingface.co/Qwen/Qwen-14B).
 
 🔥🔥 [2023/09/27] [CodeFuse-StarCoder-15B](https://huggingface.co/codefuse-ai/CodeFuse-StarCoder-15B) has been released, achieving a pass@1 (greedy decoding) score of 54.9% on HumanEval.
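The models announced in this hunk are hosted on the Hugging Face Hub. As a minimal sketch (not part of the commit), loading CodeFuse-QWen-14B through the standard `transformers` API for greedy decoding might look like the snippet below; the exact prompt template is defined on the model card and may differ from the plain prompt used here.

```python
# Minimal sketch: load CodeFuse-QWen-14B and generate with greedy decoding,
# matching the pass@1 (greedy decoding) setting reported above.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "codefuse-ai/CodeFuse-QWen-14B"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,  # Qwen-based checkpoints ship custom modeling code
    device_map="auto",
)

# Plain prompt for illustration; consult the model card for the expected template.
prompt = "# Write a Python function that checks whether a number is prime.\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```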
@@ -70,7 +72,7 @@
 
 
 ## Articles
-TBA
+[MFT Arxiv paper](https://arxiv.org/abs/2311.02303)
 
 ## Introduction
 
@@ -142,6 +144,18 @@ We are also pleased to release two code-related instruction datasets, meticulous
 |-----------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------|
 | [⭐ Evol-instruction-66k](https://huggingface.co/datasets/codefuse-ai/Evol-instruction-66k) | Based on open-evol-instruction-80k, with low-quality, repeated, and HumanEval-similar instructions filtered out, yielding a high-quality code instruction dataset. |
 | [⭐ CodeExercise-Python-27k](https://huggingface.co/datasets/codefuse-ai/CodeExercise-Python-27k) | Python code exercise instruction dataset generated by ChatGPT |
+## Citation
+If you use our code or models, or find our project useful for your R&D work, please cite our paper as below.
+```
+@article{mftcoder2023,
+  title={MFTCoder: Boosting Code LLMs with Multitask Fine-Tuning},
+  author={Bingchang Liu and Chaoyu Chen and Cong Liao and Zi Gong and Huan Wang and Zhichao Lei and Ming Liang and Dajun Chen and Min Shen and Hailian Zhou and Hang Yu and Jianguo Li},
+  year={2023},
+  journal={arXiv preprint arXiv},
+  archivePrefix={arXiv},
+  eprint={2311.02303}
+}
+```
 
 ## Star-History
 
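The two datasets referenced in the last hunk live on the Hugging Face Hub. A minimal sketch (not part of the commit) for pulling them with the `datasets` library follows; it assumes a default `train` split, and since the field names are not specified in this README, it only inspects the schema rather than assuming column names.

```python
# Minimal sketch: download the released instruction datasets and inspect them.
from datasets import load_dataset

evol = load_dataset("codefuse-ai/Evol-instruction-66k", split="train")
exercises = load_dataset("codefuse-ai/CodeExercise-Python-27k", split="train")

print(evol)                # dataset size and features
print(evol.column_names)   # actual instruction/response field names
print(exercises[0])        # one raw sample from the Python exercise set
```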