Commit 422979b

awaelchli authored and lantiga committed

Fix bitsandbytes layer conversion under init_module context manager (#18914)

(cherry picked from commit 13e1926)

1 parent 15d4361 commit 422979b

File tree

3 files changed: +8 −2 lines changed

requirements/pytorch/extra.txt (1 addition, 0 deletions)

@@ -8,3 +8,4 @@ hydra-core >=1.0.5, <1.4.0
 jsonargparse[signatures] >=4.18.0, <4.26.0
 rich >=12.3.0, <13.6.0
 tensorboardX >=2.2, <2.7.0  # min version is set by torch.onnx missing attribute
+bitsandbytes <=0.41.1

src/lightning/fabric/CHANGELOG.md (3 additions, 0 deletions)

@@ -36,6 +36,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Refined the FSDP saving logic and error messaging when path exists ([#18884](https://github.com/Lightning-AI/lightning/pull/18884))


+- Fixed layer conversion under `Fabric.init_module()` context manager when using the `BitsandbytesPrecision` plugin ([#18914](https://github.com/Lightning-AI/lightning/pull/18914))
+
+
 ## [2.1.0] - 2023-10-11

 ### Added

src/lightning/fabric/strategies/strategy.py (4 additions, 2 deletions)

@@ -138,11 +138,13 @@ def module_init_context(self, empty_init: Optional[bool] = None) -> ContextManag
             If ``None``, the strategy will decide. Some strategies may not support all options.

         """
-        tensor_init_ctx = self.tensor_init_context()
+        precision_module_ctx = self.precision.module_init_context()
         stack = ExitStack()
+        if _TORCH_GREATER_EQUAL_2_0:
+            stack.enter_context(self.root_device)
         if _TORCH_GREATER_EQUAL_1_13:
             stack.enter_context(_EmptyInit(enabled=bool(empty_init)))
-        stack.enter_context(tensor_init_ctx)
+        stack.enter_context(precision_module_ctx)
        return stack

     def setup_module_and_optimizers(
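The patched `module_init_context` composes several context managers on an `ExitStack`: the root device, optional empty-weight init, and now the precision plugin's own `module_init_context()`, so that `BitsandbytesPrecision` can intercept layer construction. The sketch below illustrates only the `ExitStack` composition pattern with toy stand-ins (`record`, the `log` list, and the context names are hypothetical, not Lightning code): contexts are entered in order, and unwound in reverse when the stack exits.

```python
from contextlib import ExitStack, contextmanager

@contextmanager
def record(name, log):
    # Toy context manager that logs enter/exit so the ordering is visible.
    log.append(f"enter {name}")
    try:
        yield
    finally:
        log.append(f"exit {name}")

def module_init_context(log, empty_init=True):
    # Mirrors the shape of the patched strategy method: build an ExitStack,
    # enter each sub-context conditionally, and return the stack so the
    # caller can use it as a single `with` target.
    stack = ExitStack()
    stack.enter_context(record("root_device", log))
    if empty_init:
        stack.enter_context(record("empty_init", log))
    stack.enter_context(record("precision_module", log))
    return stack

log = []
with module_init_context(log):
    log.append("construct model")

print(log)
# Entered in order; exited in reverse (precision_module first, root_device last).
```

Because the precision context is entered last, it is the innermost wrapper around model construction, which is what lets a quantization plugin convert layers as they are created.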
