
Commit f135598

Merge pull request #259 from stochasticai/dev
Docs update
2 parents e0761cf + 77911b4

File tree: 2 files changed, 14 additions and 14 deletions


docs/docs/overview/intro.md

Lines changed: 12 additions & 12 deletions
@@ -29,7 +29,7 @@ pip install xturing
 
 **Welcome to xTuring: Personalize AI your way**
 
-In the world of AI, personalization is incredibly important for making AI truly powerful. This is where xTuring comes in – it's a special open-source software that helps you make AI models, called Large Language Models (LLMs), work exactly the way you want them to.
+In the world of AI, personalization is incredibly important for making AI truly powerful. This is where xTuring comes in – it's a special open-source software that helps you make AI models, called Large Language Models (LLMs), work exactly the way you want them to.
 
 What's great about xTuring is that it's super easy to use. It has a simple interface that's designed to help you customize LLMs for your specific needs, whether it's for your own data or applications. Basically, xTuring gives you complete control over personalizing AI, making it work just the way you need it to.
 
@@ -53,16 +53,16 @@ To get started with xTuring, check out the [Quickstart](/overview/quickstart) gu
 
 | Model | Examples |
 | --- | --- |
-| Bloom | [Bloom fine-tuning on Alpaca dataset with/without LoRA and with/without INT8](https://github.com/stochasticai/xturing/tree/main/examples/bloom) |
-| Cerebras-GPT | [Cerebras-GPT fine-tuning on Alpaca dataset with/without LoRA and with/without INT8](https://github.com/stochasticai/xturing/tree/main/examples/cerebras) |
-| Falcon | [Falcon 7B fine-tuning on Alpaca dataset with/without LoRA and with/without INT8](https://github.com/stochasticai/xturing/tree/main/examples/falcon) |
-| Galactica | [Galactica fine-tuning on Alpaca dataset with/without LoRA and with/without INT8](https://github.com/stochasticai/xturing/tree/main/examples/galactica) |
-| Generic Wrapper | [Any large language model fine-tuning on Alpaca dataset with/without LoRA and with/without INT8](https://github.com/stochasticai/xturing/tree/main/examples/generic) |
-| GPT-J | [GPT-J 6B LoRA fine-tuning with/without INT8](https://github.com/stochasticai/xturing/tree/main/examples/gptj) |
-| GPT-2 | [GPT-2 fine-tuning on Alpaca dataset with/without LoRA and with/without INT8](https://github.com/stochasticai/xturing/tree/main/examples/gpt2) |
-| LLaMA | [LLaMA 7B fine-tuning on Alpaca dataset with/without LoRA and with/without INT8](https://github.com/stochasticai/xturing/tree/main/examples/llama) |
-| LLaMA 2 | [LLaMA 2 7B fine-tuning on Alpaca dataset with/without LoRA and with/without INT8](https://github.com/stochasticai/xturing/tree/main/examples/llama2) |
-| OPT | [OPT fine-tuning on Alpaca dataset with/without LoRA and with/without INT8](https://github.com/stochasticai/xturing/tree/main/examples/opt) |
+| Bloom | [Bloom fine-tuning on Alpaca dataset with/without LoRA and with/without INT8](https://github.com/stochasticai/xturing/tree/main/examples/models/bloom) |
+| Cerebras-GPT | [Cerebras-GPT fine-tuning on Alpaca dataset with/without LoRA and with/without INT8](https://github.com/stochasticai/xturing/tree/main/examples/models/cerebras) |
+| Falcon | [Falcon 7B fine-tuning on Alpaca dataset with/without LoRA and with/without INT8](https://github.com/stochasticai/xturing/tree/main/examples/models/falcon) |
+| Galactica | [Galactica fine-tuning on Alpaca dataset with/without LoRA and with/without INT8](https://github.com/stochasticai/xturing/tree/main/examples/models/galactica) |
+| Generic Wrapper | [Any large language model fine-tuning on Alpaca dataset with/without LoRA and with/without INT8](https://github.com/stochasticai/xturing/tree/main/examples/models/generic) |
+| GPT-J | [GPT-J 6B LoRA fine-tuning with/without INT8](https://github.com/stochasticai/xturing/tree/main/examples/models/gptj) |
+| GPT-2 | [GPT-2 fine-tuning on Alpaca dataset with/without LoRA and with/without INT8](https://github.com/stochasticai/xturing/tree/main/examples/models/gpt2) |
+| LLaMA | [LLaMA 7B fine-tuning on Alpaca dataset with/without LoRA and with/without INT8](https://github.com/stochasticai/xturing/tree/main/examples/models/llama) |
+| LLaMA 2 | [LLaMA 2 7B fine-tuning on Alpaca dataset with/without LoRA and with/without INT8](https://github.com/stochasticai/xturing/tree/main/examples/models/llama2) |
+| OPT | [OPT fine-tuning on Alpaca dataset with/without LoRA and with/without INT8](https://github.com/stochasticai/xturing/tree/main/examples/models/opt) |
 
 xTuring is licensed under [Apache 2.0](https://github.com/stochasticai/xturing/blob/main/LICENSE)
 
@@ -85,4 +85,4 @@ The people who created xTuring come from a place called Stochastic, where lots o
 
 **Here to Help You Succeed**: Our job doesn't stop with making xTuring. We're here to help you learn and use AI in the best way possible. We want you to feel confident using our tool in the fast-changing world of AI.
 
-[Come Work with Us](/contributing) and be part of the future of AI with xTuring. We're all about new ideas and making AI better for everyone. We're here to help you every step of the way.
+[Come Work with Us](/contributing) and be part of the future of AI with xTuring. We're all about new ideas and making AI better for everyone. We're here to help you every step of the way.
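The example links updated above all point to variations of the same fine-tuning flow described in xTuring's quickstart. As a rough, minimal sketch of what those linked examples do (the dataset path and the "llama_lora" model key below are illustrative assumptions, not taken from this commit):

```python
# Minimal sketch of the fine-tuning flow the linked examples follow.
# The dataset path and model key are illustrative assumptions.
from xturing.datasets import InstructionDataset
from xturing.models import BaseModel

# Load an Alpaca-format instruction dataset from a local directory.
dataset = InstructionDataset("./alpaca_data")

# Create a LoRA-wrapped base model; other keys (e.g. "llama_lora_int8")
# would select the LoRA/INT8 variants referenced in the table above.
model = BaseModel.create("llama_lora")

# Fine-tune on the instruction dataset, then generate with the tuned model.
model.finetune(dataset=dataset)
output = model.generate(texts=["Why are LLMs becoming so important?"])
print(output)
```

The per-model example folders mainly vary the model key passed to BaseModel.create and whether LoRA and INT8 are enabled.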

docs/docusaurus.config.js

Lines changed: 2 additions & 2 deletions
@@ -68,7 +68,7 @@ const config = {
       items: [
         {
           href: 'https://github.com/stochasticai/xturing',
-          label: 'xTuring',
+          label: 'GitHub',
           position: 'right',
         }
       ],
@@ -106,7 +106,7 @@ const config = {
           title: 'More',
           items: [
             {
-              label: 'Github',
+              label: 'GitHub',
               href: 'https://github.com/stochasticai/xturing',
             },
           ],
