
Commit a9a45d8

Update README.md and main.md (#52)
Add arXiv, pip install, badges, etc.
1 parent f17db3d commit a9a45d8

File tree

3 files changed: +35 -11 lines changed


README.md

Lines changed: 22 additions & 6 deletions
@@ -6,9 +6,16 @@
   <img src="https://img.alicdn.com/imgextra/i1/O1CN01lvLpfw25Pl4ohGZnU_!!6000000007519-2-tps-1628-490.png" alt="Trinity-RFT" style="height: 120px;">
 </div>
 
-
 &nbsp;
 
+<div align="center">
+
+[![paper](http://img.shields.io/badge/cs.LG-2505.17826-B31B1B?logo=arxiv&logoColor=red)](https://arxiv.org/abs/2505.17826)
+[![doc](https://img.shields.io/badge/Docs-blue?logo=markdown)](https://modelscope.github.io/Trinity-RFT/)
+[![pypi](https://img.shields.io/pypi/v/trinity-rft?logo=pypi&color=026cad)](https://pypi.org/project/trinity-rft/0.1.0/)
+![license](https://img.shields.io/badge/license-Apache--2.0-000000.svg)
+
+</div>
 
 
 **Trinity-RFT is a general-purpose, flexible, scalable and user-friendly framework designed for reinforcement fine-tuning (RFT) of large language models (LLM).**
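One way to sanity-check the badge and link targets added in this hunk is to request their headers (a minimal sketch; `curl` is not part of this commit, and note the `pypi/v/trinity-rft` badge renders whatever version is currently latest on PyPI, so it stays current without further README edits):

```shell
# Each target should answer with HTTP 200; URLs are taken verbatim from
# the diff, quoted to guard the "&" in the query strings.
curl -sI "https://img.shields.io/pypi/v/trinity-rft?logo=pypi&color=026cad" | head -n 1
curl -sI "https://arxiv.org/abs/2505.17826" | head -n 1
```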
@@ -146,6 +153,12 @@ pip install flash-attn -v
 # pip install flash-attn -v --no-build-isolation
 ```
 
+Installation using pip:
+
+```shell
+pip install trinity-rft==0.1.0
+```
+
 Installation from docker:
 we have provided a dockerfile for Trinity-RFT (trinity)
 
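The pinned install added above can be verified locally (a minimal sketch; the expected `Version: 0.1.0` output assumes the 0.1.0 wheel is what resolves on the target platform):

```shell
# Install the release pinned in this hunk, then confirm which version resolved.
pip install trinity-rft==0.1.0
pip show trinity-rft | grep '^Version'   # expect: Version: 0.1.0
```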
@@ -332,10 +345,13 @@ This project is built upon many excellent open-source projects, including:
 
 ## Citation
 ```plain
-@misc{Trinity-RFT,
-  title={Trinity-RFT},
-  author={{Trinity-RFT Team}},
-  url={https://github.com/modelscope/trinity-rft},
-  year={2025},
+@misc{trinity-rft,
+  title={Trinity-RFT: A General-Purpose and Unified Framework for Reinforcement Fine-Tuning of Large Language Models},
+  author={Xuchen Pan and Yanxi Chen and Yushuo Chen and Yuchang Sun and Daoyuan Chen and Wenhao Zhang and Yuexiang Xie and Yilun Huang and Yilei Zhang and Dawei Gao and Yaliang Li and Bolin Ding and Jingren Zhou},
+  year={2025},
+  eprint={2505.17826},
+  archivePrefix={arXiv},
+  primaryClass={cs.LG},
+  url={https://arxiv.org/abs/2505.17826},
 }
 ```
(Binary image file changed: 458 KB, content not shown)

docs/sphinx_doc/source/main.md

Lines changed: 13 additions & 5 deletions
@@ -124,6 +124,11 @@ pip install flash-attn -v
 # pip install flash-attn -v --no-build-isolation
 ```
 
+Installation using pip:
+
+```shell
+pip install trinity-rft==0.1.0
+```
 
 Installation from docker:
 we have provided a dockerfile for Trinity-RFT (trinity)
@@ -319,10 +324,13 @@ This project is built upon many excellent open-source projects, including:
 
 ## Citation
 ```
-@misc{Trinity-RFT,
-  title={Trinity-RFT},
-  author={{Trinity-RFT Team}},
-  url={https://github.com/modelscope/trinity-rft},
-  year={2025},
+@misc{trinity-rft,
+  title={Trinity-RFT: A General-Purpose and Unified Framework for Reinforcement Fine-Tuning of Large Language Models},
+  author={Xuchen Pan and Yanxi Chen and Yushuo Chen and Yuchang Sun and Daoyuan Chen and Wenhao Zhang and Yuexiang Xie and Yilun Huang and Yilei Zhang and Dawei Gao and Yaliang Li and Bolin Ding and Jingren Zhou},
+  year={2025},
+  eprint={2505.17826},
+  archivePrefix={arXiv},
+  primaryClass={cs.LG},
+  url={https://arxiv.org/abs/2505.17826},
 }
 ```
