Commit 717b5ad

make lora target modules configurable and change the default

1 parent 6e2cb75 commit 717b5ad

File tree

1 file changed: +2 -1 lines changed


examples/advanced_diffusion_training/train_dreambooth_lora_flux_advanced.py

Lines changed: 2 additions & 1 deletion

@@ -661,7 +661,8 @@ def parse_args(input_args=None):
         "--lora_blocks",
         type=str,
         default=None,
-        help=('The transformer modules to apply LoRA training on'),
+        help=(
+            'The transformer modules to apply LoRA training on. Please specify the layers in a comma seperated. E.g. - "q_proj,k_proj,v_proj,out_proj" will result in lora training of attention layers only'),
     )
     parser.add_argument(
         "--adam_epsilon",
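To illustrate how a comma-separated `--lora_blocks` value like the one described in the new help text might be consumed, here is a minimal sketch. The helper name `parse_lora_blocks` and the fallback default list are assumptions for illustration, not code from the actual training script:

```python
def parse_lora_blocks(lora_blocks=None):
    """Turn a comma-separated --lora_blocks string into a list of module names.

    Hypothetical helper: the real script may split and default differently.
    """
    if lora_blocks is not None:
        # '"q_proj, k_proj"' -> ["q_proj", "k_proj"], tolerating stray spaces
        return [block.strip() for block in lora_blocks.split(",") if block.strip()]
    # Assumed fallback: a default set of attention projection modules
    return ["to_k", "to_q", "to_v", "to_out.0"]


# Example matching the help text: restrict LoRA to attention layers only
print(parse_lora_blocks("q_proj,k_proj,v_proj,out_proj"))
```

The resulting list is the kind of value typically passed as `target_modules` when constructing a LoRA configuration, so only the named submodules receive trainable adapters.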

0 commit comments