
Can't load multiple LoRAs when using Flux Control LoRA #10180

@jonathanyin12

Description

Describe the bug

I was trying out the FluxControlPipeline with the Control LoRA introduced in #9999, but had issues loading multiple LoRAs.

For example, if I load the depth LoRA first and then the 8-step LoRA, it errors on the 8-step LoRA; if I load the 8-step LoRA first and then the depth LoRA, it errors on the depth LoRA. Either way, loading the second LoRA fails.

Reproduction

from diffusers import FluxControlPipeline
from huggingface_hub import hf_hub_download
import torch

control_pipe = FluxControlPipeline.from_pretrained("black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16).to("cuda")

# Loading the first LoRA succeeds.
control_pipe.load_lora_weights("black-forest-labs/FLUX.1-Depth-dev-lora")

# Loading a second LoRA raises AttributeError (see logs below).
control_pipe.load_lora_weights(hf_hub_download("ByteDance/Hyper-SD", "Hyper-FLUX.1-dev-8steps-lora.safetensors"))

Logs

AttributeError                            Traceback (most recent call last)
Cell In[6], line 8
      5 control_pipe = FluxControlPipeline.from_pretrained("black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16).to("cuda")
      7 control_pipe.load_lora_weights("black-forest-labs/FLUX.1-Depth-dev-lora")
----> 8 control_pipe.load_lora_weights(
      9         hf_hub_download(
     10             "ByteDance/Hyper-SD", "Hyper-FLUX.1-dev-8steps-lora.safetensors"
     11         ),
     12         adapter_name="HyperFlux",
     13     )

File ~/.venv/lib/python3.10/site-packages/diffusers/loaders/lora_pipeline.py:1856, in FluxLoraLoaderMixin.load_lora_weights(self, pretrained_model_name_or_path_or_dict, adapter_name, **kwargs)
   1849 transformer_norm_state_dict = {
   1850     k: state_dict.pop(k)
   1851     for k in list(state_dict.keys())
   1852     if "transformer." in k and any(norm_key in k for norm_key in self._control_lora_supported_norm_keys)
   1853 }
   1855 transformer = getattr(self, self.transformer_name) if not hasattr(self, "transformer") else self.transformer
-> 1856 has_param_with_expanded_shape = self._maybe_expand_transformer_param_shape_or_error_(
   1857     transformer, transformer_lora_state_dict, transformer_norm_state_dict
   1858 )
   1860 if has_param_with_expanded_shape:
   1861     logger.info(
   1862         "The LoRA weights contain parameters that have different shapes that expected by the transformer. "
   1863         "As a result, the state_dict of the transformer has been expanded to match the LoRA parameter shapes. "
   1864         "To get a comprehensive list of parameter names that were modified, enable debug logging."
   1865     )

File ~/.venv/lib/python3.10/site-packages/diffusers/loaders/lora_pipeline.py:2316, in FluxLoraLoaderMixin._maybe_expand_transformer_param_shape_or_error_(cls, transformer, lora_state_dict, norm_state_dict, prefix)
   2314 if isinstance(module, torch.nn.Linear):
   2315     module_weight = module.weight.data
-> 2316     module_bias = module.bias.data if hasattr(module, "bias") else None
   2317     bias = module_bias is not None
   2319     lora_A_weight_name = f"{name}.lora_A.weight"

AttributeError: 'NoneType' object has no attribute 'data'
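
From the traceback, the guard at lora_pipeline.py:2316 looks like the culprit: for torch.nn.Linear, hasattr(module, "bias") is always True, even when the layer was constructed with bias=False and module.bias is None, so module.bias.data raises the AttributeError. Presumably the first Control LoRA load expands some Linear layers into bias-free replacements, which the second load then trips over. A minimal sketch of the PyTorch behavior, with the None check that would presumably avoid it:

import torch

linear = torch.nn.Linear(4, 4, bias=False)

# nn.Linear always registers a `bias` attribute; it is simply None
# when the layer is built with bias=False.
print(hasattr(linear, "bias"))  # True
print(linear.bias)              # None

# Checking the parameter itself instead of the attribute avoids the crash.
module_bias = linear.bias.data if linear.bias is not None else None
print(module_bias)  # None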

System Info

  • 🤗 Diffusers version: 0.32.0.dev0
  • Platform: Linux-5.15.0-124-generic-x86_64-with-glibc2.35
  • Running on Google Colab?: No
  • Python version: 3.10.12
  • PyTorch version (GPU?): 2.5.1+cu124 (True)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Huggingface_hub version: 0.26.5
  • Transformers version: 4.47.0
  • Accelerate version: 1.2.0
  • PEFT version: 0.14.0
  • Bitsandbytes version: not installed
  • Safetensors version: 0.4.5
  • xFormers version: not installed
  • Accelerator: NVIDIA H100 80GB HBM3, 81559 MiB
  • Using GPU in script?: Yes
  • Using distributed or parallel set-up in script?: No

Who can help?

@a-r-r-o-w @sayakpaul

    Labels

    bug, help wanted, lora
