
Commit a6ee660

_empty_cache -> clear_cache
1 parent: 582af9b

File tree: 1 file changed (+1, −1)


src/diffusers/models/model_loading_utils.py

Lines changed: 1 addition & 1 deletion
@@ -256,7 +256,7 @@ def load_model_dict_into_meta(

         if is_accelerate_version(">=", "1.9.0.dev0"):
             set_module_kwargs["non_blocking"] = True
-            set_module_kwargs["_empty_cache"] = False
+            set_module_kwargs["clear_cache"] = False

         # For compatibility with PyTorch load_state_dict which converts state dict dtype to existing dtype in model, and which
         # uses `param.copy_(input_param)` that preserves the contiguity of the parameter in the model.
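For orientation, below is a minimal sketch of how a kwargs dict like `set_module_kwargs` is typically built and then forwarded to accelerate's `set_module_tensor_to_device`. Only the version-gated `non_blocking` / `clear_cache` lines come from the diff above; the wrapper function name, the loop, the forwarding, and the exact import paths are illustrative assumptions, not the actual body of `load_model_dict_into_meta`.

# Sketch only: wrapper name, loop, and argument forwarding are assumptions for
# illustration; the version-gated kwargs mirror the diffed lines above.
import torch
from accelerate.utils import set_module_tensor_to_device
from diffusers.utils import is_accelerate_version  # import path assumed


def load_state_dict_onto_meta_model(model, state_dict, device="cuda", dtype=torch.float16):
    set_module_kwargs = {}
    if is_accelerate_version(">=", "1.9.0.dev0"):
        # accelerate >= 1.9.0.dev0 renamed this kwarg from `_empty_cache` to
        # `clear_cache`; updating the name is what this commit does.
        set_module_kwargs["non_blocking"] = True
        set_module_kwargs["clear_cache"] = False

    for param_name, param in state_dict.items():
        # Materialize each (meta) parameter on the target device with the requested dtype.
        set_module_tensor_to_device(
            model, param_name, device, value=param, dtype=dtype, **set_module_kwargs
        )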
