
Commit 2f76ca7

Commit message: "up"

1 parent 7761732 commit 2f76ca7

File tree: 1 file changed (+1 −1 lines)


examples/dreambooth/train_dreambooth_lora_qwen_image.py

Lines changed: 1 addition & 1 deletion
```diff
@@ -1091,7 +1091,7 @@ def main(args):
     if args.lora_layers is not None:
         target_modules = [layer.strip() for layer in args.lora_layers.split(",")]
     else:
-        target_modules = ["to_k", "to_q", "to_v", "to_out"]
+        target_modules = ["to_k", "to_q", "to_v", "to_out.0"]
 
     # now we will add new LoRA weights the transformer layers
     transformer_lora_config = LoraConfig(
```
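The change from `to_out` to `to_out.0` makes sense if, as in diffusers' attention blocks, the output projection is registered as an `nn.ModuleList` holding the `Linear` at index 0 (typically followed by a `Dropout`): the trainable `Linear`'s qualified module name then ends in `to_out.0`, which is what PEFT's `target_modules` must match. A minimal sketch of that naming, using a hypothetical `TinyAttention` module (not the actual Qwen-Image transformer):

```python
import torch.nn as nn

# Hypothetical stand-in for a diffusers-style attention block, where the
# output projection `to_out` is an nn.ModuleList of (Linear, Dropout).
class TinyAttention(nn.Module):
    def __init__(self, dim=8):
        super().__init__()
        self.to_q = nn.Linear(dim, dim)
        self.to_k = nn.Linear(dim, dim)
        self.to_v = nn.Linear(dim, dim)
        self.to_out = nn.ModuleList([nn.Linear(dim, dim), nn.Dropout(0.0)])

# Qualified names of the Linear submodules — these are the strings that a
# LoRA target_modules entry has to match.
names = [n for n, m in TinyAttention().named_modules() if isinstance(m, nn.Linear)]
print(names)  # the output projection appears as "to_out.0", not "to_out"
```

Targeting plain `"to_out"` would point at the `ModuleList` wrapper rather than a `Linear`, so no LoRA adapter would be attached to the output projection.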
