**src/diffusers/loaders/lora_pipeline.py** (1 addition, 1 deletion)
```diff
@@ -2313,7 +2313,7 @@ def _maybe_expand_transformer_param_shape_or_error_(
         for name, module in transformer.named_modules():
             if isinstance(module, torch.nn.Linear):
                 module_weight = module.weight.data
-                module_bias = module.bias.data if hasattr(module, "bias") else None
+                module_bias = module.bias.data if module.bias is not None else None
                 bias = module_bias is not None

                 lora_A_weight_name = f"{name}.lora_A.weight"
```
**Member:**
Let's make this more explicit:

```python
module_bias = None
if getattr(module, "bias", None) is not None:
    module_bias = module.bias.data
```

WDYT? Additionally, do you think we should add a test for this? When we merged the Control LoRA PR, we ran all the integration tests as well and didn't face an issue.

**Contributor:**
Since this is under an `if` statement where we already know the module is `nn.Linear`, the `bias` attribute always exists on the module (it is simply `None` when the layer was constructed with `bias=False`). That is why `hasattr(module, "bias")` was always true and never a meaningful guard, and why checking `module.bias is not None` is the correct condition. So I think the current condition is okay.
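For context, a quick standalone check (not part of the PR) illustrating why `hasattr` was never a meaningful guard on `nn.Linear`:

```python
import torch

# nn.Linear registers a `bias` attribute unconditionally; it is simply None
# when the layer is constructed with bias=False, so hasattr() is True either way.
with_bias = torch.nn.Linear(4, 4, bias=True)
without_bias = torch.nn.Linear(4, 4, bias=False)

print(hasattr(with_bias, "bias"), with_bias.bias is not None)        # True True
print(hasattr(without_bias, "bias"), without_bias.bias is not None)  # True False
```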

Yes, the integration tests did not fail either when I last ran them. I think we can investigate again.

We should add one more fast test here IMO that checks for a failure when:

- the first LoRA loaded expands the shape, and
- the second LoRA has the normal shape, without the expansion.

This use case is not supported yet, so until it is, we expect an error to be raised (see the sketch below).
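A minimal pytest sketch of such a test, assuming hypothetical fixtures `pipe`, `expanding_lora_path`, and `normal_lora_path` (placeholders for illustration, not fixtures from the actual diffusers test suite):

```python
import pytest

def test_second_lora_without_expansion_raises(pipe, expanding_lora_path, normal_lora_path):
    # The first LoRA expands the transformer's nn.Linear shapes
    # (e.g. a Control LoRA).
    pipe.load_lora_weights(expanding_lora_path, adapter_name="expanding")

    # A second LoRA that still targets the original, unexpanded shapes is not
    # supported yet, so an error is expected until it is.
    with pytest.raises(Exception):
        pipe.load_lora_weights(normal_lora_path, adapter_name="normal")
```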

Another fast test that we should probably have is loading a LoRA into `nn.Linear` with and without a bias present, for a total of 4 combinations (LoRA bias with/without, linear bias with/without; some are already covered by the existing test suite). A rough sketch follows.
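A self-contained sketch of that 2x2 matrix using a toy LoRA wrapper (`TinyLoRALinear` is illustrative only; the real tests would exercise the actual diffusers LoRA loading path instead):

```python
import pytest
import torch

class TinyLoRALinear(torch.nn.Module):
    # Minimal stand-in for a LoRA-wrapped Linear, just for this sketch.
    def __init__(self, base, rank=4, lora_bias=False):
        super().__init__()
        self.base = base
        self.lora_A = torch.nn.Linear(base.in_features, rank, bias=False)
        self.lora_B = torch.nn.Linear(rank, base.out_features, bias=lora_bias)

    def forward(self, x):
        return self.base(x) + self.lora_B(self.lora_A(x))

@pytest.mark.parametrize("linear_has_bias", [True, False])
@pytest.mark.parametrize("lora_has_bias", [True, False])
def test_lora_linear_bias_combinations(linear_has_bias, lora_has_bias):
    base = torch.nn.Linear(8, 8, bias=linear_has_bias)
    lora = TinyLoRALinear(base, lora_bias=lora_has_bias)

    x = torch.randn(2, 8)
    # All four bias combinations should load and run a forward pass.
    assert lora(x).shape == (2, 8)
```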

WDYT @sayakpaul? We can do this in a separate PR, since the change here is minimal and is the correct thing to do regardless.

**Member:**
Works for me, but let's make the tests a high priority (immediately after this merge) given how important LoRAs are.

**Contributor:**
Sounds good. I'll open a PR with the mentioned tests after this one is merged.
