1 parent d4b7f98 · commit 88136f5
src/diffusers/loaders/lora_conversion_utils.py
```diff
@@ -1614,8 +1614,6 @@ def _convert_non_diffusers_wan_lora_to_diffusers(state_dict):
                 f"Removed {diff_k} key from the state dict as it's all zeros, or values lower than hardcoded threshold."
             )
             original_state_dict.pop(diff_k)
-        else:
-            print(diff_k, absdiff)

     # For the `diff_b` keys, we treat them as lora_bias.
     # https://huggingface.co/docs/peft/main/en/package_reference/lora#peft.LoraConfig.lora_bias
```
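The commit removes a leftover debug `print` from the branch that handles near-zero `diff` keys in `_convert_non_diffusers_wan_lora_to_diffusers`. A minimal sketch of the surrounding pruning step, for context — the function name `prune_zero_diff_keys`, the `1e-6` threshold, and the use of plain Python lists in place of tensors are assumptions for illustration, not the actual diffusers implementation:

```python
def prune_zero_diff_keys(state_dict, threshold=1e-6):
    """Drop `.diff` keys whose values are all (near) zero.

    Sketch of the cleanup shown in the diff above: a key whose maximum
    absolute value falls below a hardcoded threshold carries no usable
    delta, so it is popped from the state dict instead of converted.
    Plain Python lists stand in for tensors here (an assumption).
    """
    removed = []
    # Iterate over a snapshot of the keys so we can pop while looping.
    for key in list(state_dict):
        if key.endswith(".diff"):
            absdiff = max(abs(v) for v in state_dict[key])
            if absdiff < threshold:
                state_dict.pop(key)
                removed.append(key)
    return removed
```

With the debug `print` gone, keys that do exceed the threshold simply pass through untouched rather than being logged to stdout.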