
Commit 201daff

Add note on CUDA version + remove 'test' from pytorch whl url

1 parent 37c8f72 commit 201daff

File tree

1 file changed: +9 -6 lines changed

README.md

Lines changed: 9 additions & 6 deletions
@@ -38,24 +38,27 @@ Some features (especially fine-tuning with FSDP + PEFT) currently require PyTorc
 ### Installing
 Llama-recipes provides a pip distribution for easy install and usage in other projects. Alternatively, it can be installed from source.
+
+> [!NOTE]
+> Ensure you use the correct CUDA version (from `nvidia-smi`) when installing the PyTorch wheels. Here we are using 11.8 as `cu118`.
+
 #### Install with pip
 ```
-pip install --extra-index-url https://download.pytorch.org/whl/test/cu118 llama-recipes
+pip install --extra-index-url https://download.pytorch.org/whl/cu118 llama-recipes
 ```

 #### Install with optional dependencies
 Llama-recipes offers the installation of optional packages. There are three optional dependency groups.
 To run the unit tests we can install the required dependencies with:
 ```
-pip install --extra-index-url https://download.pytorch.org/whl/test/cu118 llama-recipes[tests]
+pip install --extra-index-url https://download.pytorch.org/whl/cu118 llama-recipes[tests]
 ```
 For the vLLM example we need additional requirements that can be installed with:
 ```
-pip install --extra-index-url https://download.pytorch.org/whl/test/cu118 llama-recipes[vllm]
+pip install --extra-index-url https://download.pytorch.org/whl/cu118 llama-recipes[vllm]
 ```
 To use the sensitive topics safety checker install with:
 ```
-pip install --extra-index-url https://download.pytorch.org/whl/test/cu118 llama-recipes[auditnlg]
+pip install --extra-index-url https://download.pytorch.org/whl/cu118 llama-recipes[auditnlg]
 ```
 Optional dependencies can also be combined with [option1,option2].


@@ -65,14 +68,14 @@ To install from source e.g. for development use these commands. We're using hatc
 git clone git@github.com:meta-llama/llama-recipes.git
 cd llama-recipes
 pip install -U pip setuptools
-pip install --extra-index-url https://download.pytorch.org/whl/test/cu118 -e .
+pip install --extra-index-url https://download.pytorch.org/whl/cu118 -e .
 ```
 For development and contributing to llama-recipes please install all optional dependencies:
 ```
 git clone git@github.com:meta-llama/llama-recipes.git
 cd llama-recipes
 pip install -U pip setuptools
-pip install --extra-index-url https://download.pytorch.org/whl/test/cu118 -e .[tests,auditnlg,vllm]
+pip install --extra-index-url https://download.pytorch.org/whl/cu118 -e .[tests,auditnlg,vllm]
 ```