Commit 228063c ("Minor update")
1 parent 0a35a1d

docs/source/installation.mdx: 1 file changed, 17 additions, 16 deletions

@@ -23,10 +23,10 @@ Welcome to the installation guide for the `bitsandbytes` library! This document
 The library can be built using CUDA Toolkit versions as old as **11.6** on Windows and **11.4** on Linux.

 | **Feature** | **CC Required** | **Example Hardware Requirement** |
-|---------------------------------|---------------------------------------------------------------|
-| LLM.int8() | 7.5+ | Turing (RTX 20 series, T4) or newer GPUs |
-| 8-bit optimizers/quantization | 5.0+ | Maxwell (GTX 900 series, TITAN X, M40) or newer GPUs * |
-| NF4/FP4 quantization | 5.0+ | Maxwell (GTX 900 series, TITAN X, M40) or newer GPUs * |
+|---------------------------------|-----------------|---------------------------------------------|
+| LLM.int8() | 7.5+ | Turing (RTX 20 series, T4) or newer GPUs |
+| 8-bit optimizers/quantization | 5.0+ | Maxwell (GTX 900 series, TITAN X, M40) or newer GPUs |
+| NF4/FP4 quantization | 5.0+ | Maxwell (GTX 900 series, TITAN X, M40) or newer GPUs |

 > [!WARNING]
 > Support for Maxwell GPUs is deprecated and will be removed in a future release. For the best results, a Turing generation device or newer is recommended.
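The feature table in the hunk above boils down to a compute-capability comparison. As an illustrative sketch (the `MIN_CC` mapping and `supported` helper are hypothetical names for this example, not part of the bitsandbytes API), the check is a plain tuple comparison:

```python
# Minimum compute capability (CC) per feature, per the table above.
MIN_CC = {
    "llm_int8": (7, 5),         # Turing (RTX 20 series, T4) or newer
    "optimizers_8bit": (5, 0),  # Maxwell or newer (deprecated)
    "nf4_fp4": (5, 0),          # Maxwell or newer (deprecated)
}

def supported(feature: str, cc: tuple) -> bool:
    """True if a device with compute capability `cc` meets the table's minimum."""
    return cc >= MIN_CC[feature]

print(supported("llm_int8", (7, 5)))  # T4 reports CC 7.5: True
print(supported("llm_int8", (6, 1)))  # Pascal (GTX 10 series): False
print(supported("nf4_fp4", (5, 2)))   # Maxwell TITAN X: True
```

Tuple comparison is lexicographic, so `(6, 1) >= (7, 5)` is correctly False even though the minor version is smaller.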
@@ -41,7 +41,8 @@ The currently distributed `bitsandbytes` packages are built with the following c
 |--------------------|------------------|----------------------|--------------
 | **Linux x86-64** | 11.8 - 12.6 | GCC 11.2 | sm50, sm60, sm75, sm80, sm86, sm89, sm90, sm100, sm120
 | **Linux x86-64** | 12.8 | GCC 11.2 | sm75, sm80, sm86, sm89, sm90, sm100, sm120
-| **Linux aarch64** | 11.8 - 12.8 | GCC 11.2 | sm75, sm80, sm90, sm100
+| **Linux aarch64** | 11.8 - 12.6 | GCC 11.2 | sm75, sm80, sm90
+| **Linux aarch64** | 12.8 | GCC 11.2 | sm75, sm80, sm90, sm100
 | **Windows x86-64** | 11.8 - 12.8 | MSVC 19.43+ (VS2022) | sm50, sm60, sm75, sm80, sm86, sm89, sm90, sm100, sm120

 Use `pip` or `uv` to install:
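The build matrix in this hunk pairs each platform with one or more CUDA Toolkit ranges. As a hedged sketch (the helper name and the assumption that the listed ranges are inclusive are this example's own, not from the packages themselves), checking whether a toolkit version matches the Linux x86-64 rows is a pair of range tests:

```python
# CUDA Toolkit ranges from the Linux x86-64 rows above: 11.8 - 12.6, plus 12.8.
LINUX_X86_64_RANGES = [((11, 8), (12, 6)), ((12, 8), (12, 8))]

def toolkit_in_matrix(version: str, ranges=LINUX_X86_64_RANGES) -> bool:
    """True if `version` (e.g. "12.4") falls in one of the inclusive ranges."""
    major, minor = (int(part) for part in version.split(".")[:2])
    return any(lo <= (major, minor) <= hi for lo, hi in ranges)

print(toolkit_in_matrix("12.4"))  # True: inside 11.8 - 12.6
print(toolkit_in_matrix("12.7"))  # False: falls between the two ranges
print(toolkit_in_matrix("12.8"))  # True
```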
@@ -53,7 +54,7 @@ pip install bitsandbytes
 ### Compile from source[[cuda-compile]]

 > [!TIP]
-> Don't hesitate to compile from source! The process is pretty straight forward and resilient. This might be needed for older CUDA Toolkit versions or Linux distributions, or other less common configurations/
+> Don't hesitate to compile from source! The process is pretty straight forward and resilient. This might be needed for older CUDA Toolkit versions or Linux distributions, or other less common configurations.

 For Linux and Windows systems, compiling from source allows you to customize the build configurations. See below for detailed platform-specific instructions (see the `CMakeLists.txt` if you want to check the specifics and explore some additional options):

@@ -68,7 +69,7 @@ For example, to install a compiler and CMake on Ubuntu:
 apt-get install -y build-essential cmake
 ```

-You should also install CUDA Toolkit by following the [NVIDIA CUDA Installation Guide for Linux](https://docs.nvidia.com/cuda/cuda-installation-guide-linux/index.html) guide from NVIDIA. The current minimum supported CUDA Toolkit version that we test with is **11.8**.
+You should also install CUDA Toolkit by following the [NVIDIA CUDA Installation Guide for Linux](https://docs.nvidia.com/cuda/cuda-installation-guide-linux/index.html) guide. The current minimum supported CUDA Toolkit version that we test with is **11.8**.

 ```bash
 git clone https://github.com/bitsandbytes-foundation/bitsandbytes.git && cd bitsandbytes/
@@ -78,12 +79,12 @@ pip install -e . # `-e` for "editable" install, when developing BNB (otherwise
 ```

 > [!TIP]
-> If you have multiple versions of CUDA installed or installed it in a non-standard location, please refer to CMake CUDA documentation for how to configure the CUDA compiler.
+> If you have multiple versions of the CUDA Toolkit installed or it is in a non-standard location, please refer to CMake CUDA documentation for how to configure the CUDA compiler.

 </hfoption>
 <hfoption id="Windows">

-Windows systems require Visual Studio with C++ support as well as an installation of the CUDA SDK.
+Compilation from source on Windows systems require Visual Studio with C++ support as well as an installation of the CUDA Toolkit.

 To compile from source, you need CMake >= **3.22.1** and Python >= **3.9** installed. You should also install CUDA Toolkit by following the [CUDA Installation Guide for Windows](https://docs.nvidia.com/cuda/cuda-installation-guide-microsoft-windows/index.html) guide from NVIDIA. The current minimum supported CUDA Toolkit version that we test with is **11.8**.

@@ -106,7 +107,7 @@ If you would like to use new features even before they are officially released a
 <hfoptions id="OS">
 <hfoption id="Linux">

-```
+```bash
 # Note: if you don't want to reinstall our dependencies, append the `--no-deps` flag!

 # x86_64 (most users)
@@ -119,7 +120,7 @@ pip install --force-reinstall https://github.com/bitsandbytes-foundation/bitsand
 </hfoption>
 <hfoption id="Windows">

-```
+```bash
 # Note: if you don't want to reinstall our dependencies, append the `--no-deps` flag!
 pip install --force-reinstall https://github.com/bitsandbytes-foundation/bitsandbytes/releases/download/continuous-release_main/bitsandbytes-1.33.7.preview-py3-none-win_amd64.whl
 ```
@@ -129,7 +130,7 @@ pip install --force-reinstall https://github.com/bitsandbytes-foundation/bitsand

 ## Deprecated: Multi-Backend Preview[[multi-backend]]

-> [!TIP]
+> [!WARNING]
 > This functionality existed as an early technical preview and is not recommended for production use. We are in the process of upstreaming improved support for AMD and Intel hardware into the main project.

 We provide an early preview of support for AMD and Intel hardware as part of a development branch.
@@ -149,7 +150,7 @@ For each supported backend, follow the respective instructions below:

 To use this preview version of `bitsandbytes` with `transformers`, be sure to install:

-```
+```bash
 pip install "transformers>=4.45.1"
 ```

@@ -204,7 +205,7 @@ pip install --force-reinstall 'https://github.com/bitsandbytes-foundation/bitsan
 <hfoption id="Windows">
 This wheel provides support for the Intel XPU platform.

-```
+```bash
 # Note, if you don't want to reinstall our dependencies, append the `--no-deps` flag!
 pip install --force-reinstall 'https://github.com/bitsandbytes-foundation/bitsandbytes/releases/download/continuous-release_multi-backend-refactor/bitsandbytes-0.44.1.dev0-py3-none-win_amd64.whl'
 ```
@@ -243,7 +244,7 @@ It does not need compile CPP codes, all required ops are in [intel_extension_for

 The below commands are for Linux. For installing on Windows, please adapt the below commands according to the same pattern as described [the section above on compiling from source under the Windows tab](#cuda-compile).

-```
+```bash
 pip install intel_extension_for_pytorch
 git clone --depth 1 -b multi-backend-refactor https://github.com/bitsandbytes-foundation/bitsandbytes.git && cd bitsandbytes/
 pip install -e . # `-e` for "editable" install, when developing BNB (otherwise leave that out)
@@ -255,7 +256,7 @@ pip install -e . # `-e` for "editable" install, when developing BNB (otherwise

 Please refer to [the official Ascend installations instructions](https://www.hiascend.com/document/detail/zh/Pytorch/60RC3/configandinstg/instg/insg_0001.html) for guidance on how to install the necessary `torch_npu` dependency.

-```
+```bash
 # Install bitsandbytes from source
 # Clone bitsandbytes repo, Ascend NPU backend is currently enabled on multi-backend-refactor branch
 git clone -b multi-backend-refactor https://github.com/bitsandbytes-foundation/bitsandbytes.git && cd bitsandbytes/
