<ahref="https://github.com/modelscope/swift/"><imgsrc="https://img.shields.io/badge/ms--swift-Build from source-6FEBB9.svg"></a>
</p>
## 📖 Table of Contents
- [Introduction](#-introduction)
- [News](#-news)
- [LLM Training and Inference Example](#-llm-training-and-inference-example)
- [Installation](#-installation)
- [Getting Started](#-getting-started)
- [Learn More](#-learn-more)
- [License](#-license)
- [Contact Us](#-contact-us)

## 📝 Introduction
SWIFT (Scalable lightWeight Infrastructure for Fine-Tuning) is an extensible framework designed to facilitate lightweight model fine-tuning and inference. It integrates implementations of various efficient fine-tuning methods, embracing approaches that are parameter-efficient, memory-efficient, and time-efficient. SWIFT integrates seamlessly into the ModelScope ecosystem and offers the capability to fine-tune various models, with a primary emphasis on LLMs and vision models. Additionally, SWIFT is fully compatible with [PEFT](https://github.com/huggingface/peft), enabling users to leverage the familiar PEFT interface to fine-tune ModelScope models.
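
Because the PEFT interface carries over, attaching a LoRA adapter to a ModelScope model can look like the following minimal sketch (the model ID, target modules, and hyperparameters are illustrative assumptions, not an excerpt from the SWIFT docs):

```python
import torch
from modelscope import Model
from peft import LoraConfig, get_peft_model

# Load a base model from the ModelScope hub (the model ID is illustrative).
model = Model.from_pretrained('modelscope/Llama-2-7b-ms',
                              torch_dtype=torch.float16)

# Attach LoRA adapters through the familiar PEFT interface; the target
# modules match llama-style attention projections and are model-dependent.
lora_config = LoraConfig(r=8, lora_alpha=32,
                         target_modules=['q_proj', 'v_proj'])
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the LoRA parameters are trainable
```
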
Currently supported approaches (and counting):
Users can check the [documentation of SWIFT](docs/source/GetStarted/快速使用.md) for detailed tutorials.
## 🎉 News
- 🔥 2023.11.24: Support for **yi-34b-chat**, **codefuse-codellama-34b-chat**: The corresponding shell scripts can be found in [yi_34b_chat](https://github.com/modelscope/swift/tree/main/examples/pytorch/llm/scripts/yi_34b_chat), [codefuse_codellama_34b_chat](https://github.com/modelscope/swift/tree/main/examples/pytorch/llm/scripts/codefuse_codellama_34b_chat).
- 🔥 2023.11.18: Support for **tongyi-finance-14b** series models: tongyi-finance-14b, tongyi-finance-14b-chat, tongyi-finance-14b-chat-int4. The corresponding shell script can be found in [tongyi_finance_14b_chat_int4](https://github.com/modelscope/swift/tree/main/examples/pytorch/llm/scripts/tongyi_finance_14b_chat_int4).
- 🔥 2023.11.16: Added **flash attn** support for more models: qwen series, qwen-vl series, llama series, openbuddy series, mistral series, yi series, ziya series. Please use the `use_flash_attn` parameter.
- 2023.9.3: Supported **baichuan2** model series: baichuan2-7b, baichuan2-7b-chat, baichuan2-13b, baichuan2-13b-chat.
## ✨ LLM Training and Inference Example
### Simple Usage
- Quickly perform inference with an LLM: see the [LLM Inference Documentation](https://github.com/modelscope/swift/blob/main/docs/source/LLM/LLM推理文档.md).
- Rapidly fine-tune an LLM, run inference, and build a Web-UI: see the [LLM Fine-tuning Documentation](https://github.com/modelscope/swift/blob/main/docs/source/LLM/LLM微调文档.md).
- View the models and datasets supported by SWIFT: see [supported models and datasets](https://github.com/modelscope/swift/blob/main/docs/source/LLM/支持的模型和数据集.md).
- Extend and customize models, datasets, and dialogue templates in SWIFT: see [Customization and Expansion](https://github.com/modelscope/swift/blob/main/docs/source/LLM/自定义和拓展.md).
- Check the command-line hyperparameters for fine-tuning and inference: see [Command-Line Hyperparameters](https://github.com/modelscope/swift/blob/main/docs/source/LLM/命令行超参数.md).

Quickly fine-tune an LLM, run inference with it, and build a Web-UI.
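
As a minimal sketch of that flow, the Python entry points from the LLM examples can be driven directly (the `swift.llm` import paths, the dataset choice, and the `use_flash_attn` keyword are assumptions based on the example scripts, not a verbatim excerpt):

```python
import os
os.environ['CUDA_VISIBLE_DEVICES'] = '0'

from swift.llm import DatasetName, InferArguments, ModelType, SftArguments
from swift.llm.run import infer_main, sft_main, web_ui_main

# LoRA fine-tune qwen-7b-chat on a small sample of an example dataset;
# `use_flash_attn` is the parameter mentioned in the News section above.
sft_args = SftArguments(
    model_type=ModelType.qwen_7b_chat,
    dataset=[DatasetName.blossom_math_zh],
    train_dataset_sample=2000,
    use_flash_attn=True,
    output_dir='output')
best_ckpt_dir = sft_main(sft_args)

# Run inference on the resulting checkpoint, or serve it through a Web-UI.
infer_args = InferArguments(ckpt_dir=best_ckpt_dir)
infer_main(infer_args)
web_ui_main(infer_args)
```
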
For more shell startup scripts, please refer to [Run SFT and Inference](https://github.com/modelscope/swift/tree/main/examples/pytorch/llm#-run-sft-and-inference).

SWIFT supports multiple tuners, as well as tuners provided by [PEFT](https://github.com/huggingface/peft). To use these tuners, simply call:
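
A minimal sketch of that call, assuming an illustrative ChatGLM2 checkpoint and LoRA hyperparameters:

```python
import torch
from modelscope import Model
from swift import LoRAConfig, Swift

# Load a base model from the ModelScope hub (the checkpoint is illustrative).
model = Model.from_pretrained('ZhipuAI/chatglm2-6b', torch_dtype=torch.bfloat16)

# Prepare the model with a SWIFT tuner config; a PEFT config object
# can be passed in the same way.
lora_config = LoRAConfig(r=8, target_modules=['query_key_value'])
model = Swift.prepare_model(model, lora_config)
# `model` now trains with only the LoRA parameters unfrozen.
```

Passing a dict of configs should activate several tuners at once, which is what produces the checkpoint layout described below.
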
The config/weights stored in the output dir are the config and weights of `extra_state_keys`. This is different from PEFT, which stores the weights and config of the `default` tuner.
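
A sketch of the matching save/load round trip, continuing the snippet above (treating `Swift.from_pretrained` as the restore entry point is an assumption):

```python
from modelscope import Model
from swift import Swift

# Persist the active tuners plus the extra_state_keys config/weights.
model.save_pretrained('output')  # `model` is the Swift-prepared model above

# Later, re-attach the stored tuners onto a freshly loaded base model.
base_model = Model.from_pretrained('ZhipuAI/chatglm2-6b')
model = Swift.from_pretrained(base_model, 'output')
```
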
## Learn More

The ModelScope Library is the model library of the ModelScope project, containing a large number of popular models.
- [Contribute your own model to ModelScope](https://modelscope.cn/docs/ModelScope%E6%A8%A1%E5%9E%8B%E6%8E%A5%E5%85%A5%E6%B5%81%E7%A8%8B%E6%A6%82%E8%A7%88)

## License

This project is licensed under the [Apache License (Version 2.0)](https://github.com/modelscope/modelscope/blob/master/LICENSE).
## Contact Us

You can contact and communicate with us by joining our WeChat Group: