Commit 613a32d

modify comment to explain reasoning behind hidden size check

1 parent: b1ece65

File tree: 1 file changed (+1, -1)

examples/dreambooth/train_dreambooth_lora_sd3.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -1295,7 +1295,7 @@ def save_model_hook(models, weights, output_dir):
                 if isinstance(model, type(unwrap_model(transformer))):
                     transformer_lora_layers_to_save = get_peft_model_state_dict(model)
                 elif isinstance(model, type(unwrap_model(text_encoder_one))):  # or text_encoder_two
-                    # check hidden size to distinguish between text_encoder_one and two
+                    # both text encoders are of the same class, so we check hidden size to distinguish between the two
                     hidden_size = unwrap_model(model).config.hidden_size
                     if hidden_size == 768:
                         text_encoder_one_lora_layers_to_save = get_peft_model_state_dict(model)
```
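The technique the comment documents can be sketched in isolation: when two model instances share the same Python class, `isinstance` cannot tell them apart, so a distinguishing config attribute such as `hidden_size` is used instead. The sketch below is hypothetical (the `Config`/`TextEncoder` stand-in classes are not the diffusers ones); the 768 value comes from the diff, and 1280 is assumed here as the other encoder's hidden size.

```python
# Minimal sketch, not the diffusers implementation: dispatch between two
# same-class models by inspecting config.hidden_size.
from dataclasses import dataclass


@dataclass
class Config:
    hidden_size: int


@dataclass
class TextEncoder:
    # Hypothetical stand-in for a CLIP-style text encoder class.
    config: Config


def lora_bucket_for(model: TextEncoder) -> str:
    # Both encoders are instances of the same class, so isinstance() is
    # useless here; the hidden size in the config distinguishes them
    # (768 per the diff; 1280 is an assumed value for the second encoder).
    if model.config.hidden_size == 768:
        return "text_encoder_one"
    return "text_encoder_two"


print(lora_bucket_for(TextEncoder(Config(768))))   # text_encoder_one
print(lora_bucket_for(TextEncoder(Config(1280))))  # text_encoder_two
```

This keeps the `save_model_hook` logic simple: one `elif` branch matches either encoder, and the attribute check routes the state dict into the right save slot.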
