Labels: bug
Describe the bug
The latest lightning LoRA for Wan2.2 i2v fails to load.
Link to LoRA: Latest
Reproduction
import torch
from diffusers import WanImageToVideoPipeline
from huggingface_hub import hf_hub_download

model_id = "Wan-AI/Wan2.2-I2V-A14B-Diffusers"
pipe = WanImageToVideoPipeline.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
)

lightning_hn = hf_hub_download(
    repo_id="lightx2v/Wan2.2-Distill-Loras",
    filename="wan2.2_i2v_A14b_high_noise_lora_rank64_lightx2v_4step_1022.safetensors",
)
lightning_ln = hf_hub_download(
    repo_id="lightx2v/Wan2.2-Distill-Loras",
    filename="wan2.2_i2v_A14b_low_noise_lora_rank64_lightx2v_4step_1022.safetensors",
)

pipe.load_lora_weights(lightning_hn, adapter_name="lightning")
pipe.load_lora_weights(lightning_ln, adapter_name="lightning_2")
Logs
KeyError Traceback (most recent call last)
Cell In[5], line 31
28 lightning_hn = hf_hub_download(repo_id="lightx2v/Wan2.2-Distill-Loras", filename="wan2.2_i2v_A14b_high_noise_lora_rank64_lightx2v_4step_1022.safetensors")
29 lightning_ln = hf_hub_download(repo_id="lightx2v/Wan2.2-Distill-Loras", filename="wan2.2_i2v_A14b_low_noise_lora_rank64_lightx2v_4step_1022.safetensors")
---> 31 pipe.load_lora_weights(
32 lightning_hn,
33 adapter_name="lightning"
34 )
36 pipe.load_lora_weights(
37 lightning_ln,
38 adapter_name="lightning_2"
39 )
41 pipe.transformer.load_lora_adapter(load_wan_lora(lightning_hn), adapter_name="high_noise")
File /usr/local/lib/python3.11/dist-packages/diffusers/loaders/lora_pipeline.py:4066, in WanLoraLoaderMixin.load_lora_weights(self, pretrained_model_name_or_path_or_dict, adapter_name, hotswap, **kwargs)
4064 # First, ensure that the checkpoint is a compatible one and can be successfully loaded.
4065 kwargs["return_lora_metadata"] = True
-> 4066 state_dict, metadata = self.lora_state_dict(pretrained_model_name_or_path_or_dict, **kwargs)
4067 # convert T2V LoRA to I2V LoRA (when loaded to Wan I2V) by adding zeros for the additional (missing) _img layers
4068 state_dict = self._maybe_expand_t2v_lora_for_i2v(
4069 transformer=getattr(self, self.transformer_name) if not hasattr(self, "transformer") else self.transformer,
4070 state_dict=state_dict,
4071 )
File /usr/local/lib/python3.11/dist-packages/huggingface_hub/utils/_validators.py:114, in validate_hf_hub_args.<locals>._inner_fn(*args, **kwargs)
111 if check_use_auth_token:
112 kwargs = smoothly_deprecate_use_auth_token(fn_name=fn.__name__, has_token=has_token, kwargs=kwargs)
--> 114 return fn(*args, **kwargs)
File /usr/local/lib/python3.11/dist-packages/diffusers/loaders/lora_pipeline.py:3980, in WanLoraLoaderMixin.lora_state_dict(cls, pretrained_model_name_or_path_or_dict, **kwargs)
3965 state_dict, metadata = _fetch_state_dict(
3966 pretrained_model_name_or_path_or_dict=pretrained_model_name_or_path_or_dict,
3967 weight_name=weight_name,
(...)
3977 allow_pickle=allow_pickle,
3978 )
3979 if any(k.startswith("diffusion_model.") for k in state_dict):
-> 3980 state_dict = _convert_non_diffusers_wan_lora_to_diffusers(state_dict)
3981 elif any(k.startswith("lora_unet_") for k in state_dict):
3982 state_dict = _convert_musubi_wan_lora_to_diffusers(state_dict)
File /usr/local/lib/python3.11/dist-packages/diffusers/loaders/lora_conversion_utils.py:1981, in _convert_non_diffusers_wan_lora_to_diffusers(state_dict)
1976 converted_state_dict["condition_embedder.time_proj.lora_B.bias"] = original_state_dict.pop(
1977 "time_projection.1.diff_b"
1978 )
1980 if any("head.head" in k for k in state_dict):
-> 1981 converted_state_dict["proj_out.lora_A.weight"] = original_state_dict.pop(
1982 f"head.head.{lora_down_key}.weight"
1983 )
1984 converted_state_dict["proj_out.lora_B.weight"] = original_state_dict.pop(f"head.head.{lora_up_key}.weight")
1985 if "head.head.diff_b" in original_state_dict:
KeyError: 'head.head.lora_down.weight'
System Info
- 🤗 Diffusers version: 0.36.0.dev0
- Platform: Linux-6.5.0-27-generic-x86_64-with-glibc2.35
- Running on Google Colab?: No
- Python version: 3.11.10
- PyTorch version (GPU?): 2.6.0+cu124 (True)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Huggingface_hub version: 0.35.3
- Transformers version: 4.57.1
- Accelerate version: 1.11.0
- PEFT version: 0.17.1
- Bitsandbytes version: not installed
- Safetensors version: 0.6.2
- xFormers version: not installed
- Accelerator: NVIDIA H100 80GB HBM3, 81559 MiB
- Using GPU in script?: No
- Using distributed or parallel set-up in script?: No
Who can help?
sayakpaul
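For anyone triaging: the traceback suggests the `_convert_non_diffusers_wan_lora_to_diffusers` converter unconditionally pops `head.head.lora_down.weight` (and the matching `lora_up` key) as soon as any key contains `head.head`, so a checkpoint whose head entries use a different naming scheme raises the `KeyError`. Below is a minimal, self-contained sketch of that failure mode; the alternative key name used for the new checkpoint is an assumption for illustration, not read from the actual file.

```python
# Sketch of the failing pattern from lora_conversion_utils.py: once any key
# contains "head.head", the converter pops the lora_down head weight without
# first checking that the key exists.

def pop_head_down(state_dict, lora_down_key="lora_down"):
    # Mirrors the failing line in the traceback.
    return state_dict.pop(f"head.head.{lora_down_key}.weight")

# Older checkpoints that ship the lora_down head weight convert fine.
old_style = {"head.head.lora_down.weight": "tensor"}
assert pop_head_down(old_style) == "tensor"

# A checkpoint whose head entries use another naming (hypothetical here,
# e.g. only a diff_b bias) reproduces the KeyError from the logs.
try:
    pop_head_down({"head.head.diff_b": "tensor"})
except KeyError as err:
    assert "head.head.lora_down.weight" in str(err)
```

A guarded `pop` (checking the key first, as the converter already does for `head.head.diff_b`) would presumably avoid the crash, though the correct fix depends on what the new checkpoint actually contains.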