
Commit eb1c331

Updates README and CHANGELOG.
1 parent 89e3b82 commit eb1c331

File tree

5 files changed, +12 -6 lines changed


CHANGELOG.md

Lines changed: 7 additions & 1 deletion
@@ -204,14 +204,20 @@ Improvements:
 
 ### 0.38.0
 
-#### 8-bit Lion, Load/Store 8-bit layers
+#### 8-bit Lion, Load/Store 8-bit Models directly from/to HF Hub
 
 Features:
 - Support for 32 and 8-bit Lion has been added. Thank you @lucidrains
 - Support for serialization of Linear8bitLt layers (LLM.int8()). This allows storing and loading 8-bit weights directly from the HuggingFace Hub. Thank you @myrab
+- New bug report feature: `python -m bitsandbytes` now gives extensive debugging details to help debug CUDA setup failures.
 
 Bug fixes:
 - Fixed a bug where some bitsandbytes methods failed in a model-parallel setup on multiple GPUs. Thank you @tonylins
+- Fixed a bug where cudart.so libraries could not be found in newer PyTorch releases.
+
+Improvements:
+- Improved the CUDA Setup procedure by doing a more extensive search for CUDA libraries.
 
 Deprecated:
 - Devices with compute capability 3.0 (GTX 700s, K10) and 3.2 (Tegra K1, Jetson TK1) are now deprecated and support will be removed in 0.39.0.
+- Support for CUDA 10.0 and 10.2 will be removed in bitsandbytes 0.39.0.
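
The bug-report feature added above is invoked from the shell; a minimal usage sketch, assuming bitsandbytes 0.38.0 is installed in the current environment:

```bash
# Print extensive debugging details about the local CUDA setup,
# as described in the 0.38.0 changelog entry above.
python -m bitsandbytes
```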

README.md

Lines changed: 1 addition & 1 deletion
@@ -148,7 +148,7 @@ To compile from source, you need an installation of CUDA. If `nvcc` is not insta
 ```bash
 wget https://raw.githubusercontent.com/TimDettmers/bitsandbytes/main/cuda_install.sh
 # Syntax cuda_install CUDA_VERSION INSTALL_PREFIX EXPORT_TO_BASH
-# CUDA_VERSION in {110, 111, 112, 113, 114, 115, 116, 117, 118, 119, 120, 121}
+# CUDA_VERSION in {110, 111, 112, 113, 114, 115, 116, 117, 118, 120, 121}
 # EXPORT_TO_BASH in {0, 1} with 0=False and 1=True
 
 # For example, the following installs CUDA 11.8 to ~/local/cuda-11.8 and exports the path to your .bashrc
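
Based on the syntax comment in the hunk above, a full invocation would look roughly as follows; this is a sketch, and the install prefix `~/local` is only illustrative:

```bash
# Download the helper script, then install CUDA 11.8 to ~/local/cuda-11.8
# and export the library path to ~/.bashrc (EXPORT_TO_BASH=1).
wget https://raw.githubusercontent.com/TimDettmers/bitsandbytes/main/cuda_install.sh
bash cuda_install.sh 118 ~/local 1
```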

compile_from_source.md

Lines changed: 1 addition & 1 deletion
@@ -11,7 +11,7 @@ You can install CUDA locally without sudo by following the following steps:
 ```bash
 wget https://raw.githubusercontent.com/TimDettmers/bitsandbytes/main/cuda_install.sh
 # Syntax cuda_install CUDA_VERSION INSTALL_PREFIX EXPORT_TO_BASH
-# CUDA_VERSION in {110, 111, 112, 113, 114, 115, 116, 117, 118, 119, 120, 121}
+# CUDA_VERSION in {110, 111, 112, 113, 114, 115, 116, 117, 118, 120, 121}
 # EXPORT_TO_BASH in {0, 1} with 0=False and 1=True
 
 # For example, the following installs CUDA 11.7 to ~/local/cuda-11.7 and exports the path to your .bashrc
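
The script also supports skipping the `.bashrc` export via `EXPORT_TO_BASH=0`; a sketch of that variant, with an illustrative prefix and a manual export afterwards:

```bash
# Install CUDA 11.7 to ~/local/cuda-11.7 without modifying ~/.bashrc (EXPORT_TO_BASH=0);
# the library path then has to be exported manually for the current shell.
bash cuda_install.sh 117 ~/local 0
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HOME/local/cuda-11.7/lib64
```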

cuda_install.sh

Lines changed: 1 addition & 1 deletion
@@ -77,7 +77,7 @@ FILE=$(basename $URL)
 if [[ -n "$CUDA_VERSION" ]]; then
   echo $URL
   echo $FILE
-  wget $URL
+  #wget $URL
   bash $FILE --no-drm --no-man-page --override --toolkitpath=$BASE_PATH/$FOLDER/ --toolkit --silent
   if [ "$EXPORT_BASHRC" -eq "1" ]; then
     echo "export LD_LIBRARY_PATH=\$LD_LIBRARY_PATH:$BASE_PATH/$FOLDER/lib64" >> ~/.bashrc

deploy.sh

Lines changed: 2 additions & 2 deletions
@@ -10,8 +10,8 @@ if [[ ! -z "${LD_LIBRARY_PATH}" ]]; then
 fi
 
 
-module unload cuda
-module unload gcc
+module unload cuda && echo "no module function available. Probably not on a slurm cluster."
+module unload gcc && echo "no module function available. Probably not on a slurm cluster."
 
 rm -rf dist build
 make cleaneggs
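
Note that with `&&` the `echo` fires only when `module unload` succeeds. A minimal sketch of the complementary `||` pattern, which prints the notice when the `module` function is unavailable (an illustration of the shell idiom, not part of this commit):

```bash
# Print the notice only when `module unload` fails, e.g. because no
# Environment Modules / Lmod `module` function exists on this machine.
module unload cuda 2>/dev/null || echo "no module function available. Probably not on a slurm cluster."
module unload gcc 2>/dev/null || echo "no module function available. Probably not on a slurm cluster."
```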
