Commit a20ad5e

Update header levels
1 parent a46b7e9 commit a20ad5e

File tree

1 file changed: +11 -11 lines changed

docs/source/installation.mdx

Lines changed: 11 additions & 11 deletions
````diff
@@ -46,7 +46,7 @@ The library can be built using CUDA Toolkit versions as old as **11.8**.
 > For the best results, a Turing generation device or newer is recommended.
 
 
-#### Installation via PyPI[[cuda-pip]]
+### Installation via PyPI[[cuda-pip]]
 
 This is the most straightforward and recommended installation option.
 
@@ -120,11 +120,11 @@ Big thanks to [wkpark](https://github.com/wkpark), [Jamezo97](https://github.com
 </hfoption>
 </hfoptions>
 
-### Intel XPU[[xpu]]
+## Intel XPU[[xpu]]
 
 * A compatible PyTorch version with Intel XPU support is required. The current minimum is **PyTorch 2.6.0.**. It is recommended to use the latest stable release. See [Getting Started on Intel GPU](https://docs.pytorch.org/docs/stable/notes/get_start_xpu.html) for guidance.
 
-#### Installation via PyPI[[xpu-pip]]
+### Installation via PyPI[[xpu-pip]]
 
 This is the most straightforward and recommended installation option.
 
@@ -143,22 +143,22 @@ Use `pip` or `uv` to install the latest release:
 pip install bitsandbytes
 ```
 
-### Intel Gaudi[[gaudi]]
+## Intel Gaudi[[gaudi]]
 
 * A compatible PyTorch version with Intel Gaudi support is required. The current minimum is **Gaudi v1.21** with **PyTorch 2.6.0.**. It is recommended to use the latest stable release. See the Gaudi software [installation guide](https://docs.habana.ai/en/latest/Installation_Guide/index.html) for guidance.
 
 
-#### Installation from PyPI[[gaudi-pip]]
+### Installation from PyPI[[gaudi-pip]]
 
 Use `pip` or `uv` to install the latest release:
 
 ```bash
 pip install bitsandbytes
 ```
 
-### CPU[[cpu]]
+## CPU[[cpu]]
 
-#### Installation from PyPI[[cpu-pip]]
+### Installation from PyPI[[cpu-pip]]
 
 This is the most straightforward and recommended installation option.
 
@@ -178,7 +178,7 @@ Use `pip` or `uv` to install the latest release:
 pip install bitsandbytes
 ```
 
-#### Compile from Source[[cpu-compile]]
+### Compile from Source[[cpu-compile]]
 
 To compile from source, simply install the package from source using `pip`. The package will be built for CPU only at this time.
 
@@ -187,12 +187,12 @@ git clone https://github.com/bitsandbytes-foundation/bitsandbytes.git && cd bits
 pip install -e .
 ```
 
-### AMD ROCm (Preview)[[rocm]]
+## AMD ROCm (Preview)[[rocm]]
 
 * A compatible PyTorch version with AMD ROCm support is required. The current minimum is **Gaudi v1.21** with **PyTorch 2.6.0.**. It is recommended to use the latest stable release. See [PyTorch on ROCm](https://rocm.docs.amd.com/projects/install-on-linux/en/latest/install/3rd-party/pytorch-install.html) for guidance.
 * ROCm support is currently only available in our preview wheels or when building from source.
 
-#### Preview Wheels from `main`[[rocm-preview]]
+### Preview Wheels from `main`[[rocm-preview]]
 
 The currently distributed preview `bitsandbytes` are built with the following configurations:
 
@@ -208,7 +208,7 @@ The currently distributed preview `bitsandbytes` are built with the following co
 
 Please see [Preview Wheels](#preview-wheels) for installation instructions.
 
-#### Compile from Source[[rocm-compile]]
+### Compile from Source[[rocm-compile]]
 
 bitsandbytes can be compiled from ROCm 6.1 - ROCm 7.0.
 
````
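Every edit in this commit is the same mechanical operation: a heading loses one leading `#` so it nests one level higher. As a rough sketch of how such a bulk demotion could be scripted (using a hypothetical sample file rather than the real `docs/source/installation.mdx`, which the commit edited by hand):

```shell
# Hypothetical sample file standing in for docs/source/installation.mdx.
cat > installation-sample.mdx <<'EOF'
#### Installation via PyPI[[cuda-pip]]
### Intel XPU[[xpu]]
#### Compile from Source[[cpu-compile]]
EOF

# Demote every '####' sub-heading to '###', then demote the 'Intel XPU'
# backend heading from '###' to '##', mirroring the pattern of the commit.
sed -i \
  -e 's/^#### /### /' \
  -e 's/^### Intel XPU/## Intel XPU/' \
  installation-sample.mdx

cat installation-sample.mdx
```

Note the expression order: the blanket `####` demotion runs first, and the second expression matches only the specific backend heading, so a freshly demoted `### Installation …` line is not demoted a second time.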