
Commit f07cf1e

Update readme to use stable release during installation
1 parent 362c866 commit f07cf1e

File tree

1 file changed (+10, -10 lines)


README.md

Lines changed: 10 additions & 10 deletions
@@ -10,7 +10,7 @@ The 'llama-recipes' repository is a companion to the [Meta Llama 2](https://gith
 > `<\|eot_id\|>` | This signifies the end of the message in a turn. |
 > `<\|start_header_id\|>{role}<\|end_header_id\|>` | These tokens enclose the role for a particular message. The possible roles can be: system, user, assistant. |
 > `<\|end_of_text\|>` | This is equivalent to the EOS token. On generating this token, Llama 3 will cease to generate more tokens |
->
+>
 > A multiturn-conversation with Llama 3 follows this prompt template:
 > ```
 > <|begin_of_text|><|start_header_id|>system<|end_header_id|>
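The token descriptions in the hunk above compose into a prompt mechanically; a minimal sketch of that assembly (the helper names `format_turn` and `build_prompt` are hypothetical, but the token strings are the ones listed in the README):

```python
def format_turn(role: str, content: str) -> str:
    # Each message is wrapped in header tokens for its role and closed with <|eot_id|>.
    return f"<|start_header_id|>{role}<|end_header_id|>\n\n{content}<|eot_id|>"

def build_prompt(system: str, turns: list) -> str:
    # A conversation starts with <|begin_of_text|>, then the system message.
    parts = ["<|begin_of_text|>", format_turn("system", system)]
    for role, content in turns:
        parts.append(format_turn(role, content))
    # End with an open assistant header so the model generates the reply.
    parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(parts)

prompt = build_prompt("You are a helpful assistant.", [("user", "Hello!")])
print(prompt.count("<|eot_id|>"))  # 2: one for the system message, one for the user turn
```

Generating until the model emits `<|eot_id|>` (or `<|end_of_text|>`) then closes one turn.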
@@ -26,7 +26,7 @@ The 'llama-recipes' repository is a companion to the [Meta Llama 2](https://gith
 > More details on the new tokenizer and prompt template: <PLACEHOLDER_URL>
 > [!NOTE]
 > The llama-recipes repository was recently refactored to promote a better developer experience of using the examples. Some files have been moved to new locations. The `src/` folder has NOT been modified, so the functionality of this repo and package is not impacted.
->
+>
 > Make sure you update your local clone by running `git pull origin main`

 ## Table of Contents
@@ -55,29 +55,29 @@ These instructions will get you a copy of the project up and running on your loc
 ### Prerequisites

 #### PyTorch Nightlies
-Some features (especially fine-tuning with FSDP + PEFT) currently require PyTorch nightlies to be installed. Please make sure to install the nightlies if you're using these features following [this guide](https://pytorch.org/get-started/locally/).
+If you want to use PyTorch nightlies instead of the stable release, go to [this guide](https://pytorch.org/get-started/locally/) to retrieve the right `--extra-index-url URL` parameter for the `pip install` commands on your platform.

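The change above makes the stable index the default and leaves the nightly index as an opt-in flag; a sketch of how the resulting command line differs (`install_command` is a hypothetical helper, and the exact nightly index URL is platform-dependent, per the PyTorch guide):

```python
def install_command(nightly_index: str = "") -> str:
    # With no nightly index, this is the plain stable-release install.
    cmd = ["pip", "install"]
    if nightly_index:
        # Nightlies come from an extra package index supplied by the user.
        cmd += ["--extra-index-url", nightly_index]
    cmd.append("llama-recipes")
    return " ".join(cmd)

print(install_command())  # pip install llama-recipes
```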
 ### Installing
 Llama-recipes provides a pip distribution for easy install and usage in other projects. Alternatively, it can be installed from source.

 #### Install with pip
 ```
-pip install --extra-index-url https://download.pytorch.org/whl/test/cu118 llama-recipes
+pip install llama-recipes
 ```

 #### Install with optional dependencies
 Llama-recipes offers the installation of optional packages. There are three optional dependency groups.
 To run the unit tests we can install the required dependencies with:
 ```
-pip install --extra-index-url https://download.pytorch.org/whl/test/cu118 llama-recipes[tests]
+pip install llama-recipes[tests]
 ```
 For the vLLM example we need additional requirements that can be installed with:
 ```
-pip install --extra-index-url https://download.pytorch.org/whl/test/cu118 llama-recipes[vllm]
+pip install llama-recipes[vllm]
 ```
 To use the sensitive topics safety checker install with:
 ```
-pip install --extra-index-url https://download.pytorch.org/whl/test/cu118 llama-recipes[auditnlg]
+pip install llama-recipes[auditnlg]
 ```
 Optional dependencies can also be combined with [option1,option2].

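The `[option1,option2]` extras syntax above composes into a single requirement string; a minimal sketch (`build_requirement` is a hypothetical helper, not part of llama-recipes):

```python
def build_requirement(package: str, extras: list) -> str:
    # Extras are appended in brackets, comma-separated, as in PEP 508.
    if not extras:
        return package
    return f"{package}[{','.join(extras)}]"

print(build_requirement("llama-recipes", ["tests", "vllm"]))  # llama-recipes[tests,vllm]
```

So installing two groups at once is `pip install llama-recipes[tests,vllm]`.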
@@ -87,14 +87,14 @@ To install from source e.g. for development use these commands. We're using hatc
 git clone [email protected]:meta-llama/llama-recipes.git
 cd llama-recipes
 pip install -U pip setuptools
-pip install --extra-index-url https://download.pytorch.org/whl/test/cu118 -e .
+pip install -e .
 ```
 For development and contributing to llama-recipes please install all optional dependencies:
 ```
 git clone [email protected]:meta-llama/llama-recipes.git
 cd llama-recipes
 pip install -U pip setuptools
-pip install --extra-index-url https://download.pytorch.org/whl/test/cu118 -e .[tests,auditnlg,vllm]
+pip install -e .[tests,auditnlg,vllm]
 ```

@@ -120,7 +120,7 @@ python src/transformers/models/llama/convert_llama_weights_to_hf.py \


 ## Repository Organization
-Most of the code dealing with Llama usage is organized across 2 main folders: `recipes/` and `src/`.
+Most of the code dealing with Llama usage is organized across 2 main folders: `recipes/` and `src/`.

 ### `recipes/`

0 commit comments
