Commit 9059b37

fix
1 parent 3516159 commit 9059b37

File tree

1 file changed: +1 −1 lines changed

src/diffusers/models/transformers/transformer_flux.py

Lines changed: 1 addition & 1 deletion
@@ -493,7 +493,7 @@ def forward(
 
         if joint_attention_kwargs is not None and "ip_adapter_image_embeds" in joint_attention_kwargs:
             ip_adapter_image_embeds = joint_attention_kwargs.pop("ip_adapter_image_embeds")
-            ip_hidden_states = self.transformer.encoder_hid_proj(ip_adapter_image_embeds)
+            ip_hidden_states = self.encoder_hid_proj(ip_adapter_image_embeds)
             joint_attention_kwargs.update({"ip_hidden_states": ip_hidden_states})
 
         for index_block, block in enumerate(self.transformer_blocks):
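The one-line change is a scoping fix: forward here is a method of the Flux transformer itself, so the IP-adapter projection encoder_hid_proj must be reached through self; the old self.transformer.encoder_hid_proj referenced an attribute that the module itself does not appear to define. Below is a minimal, self-contained sketch of the corrected flow; DummyFluxTransformer, the tensor shapes, and the returned tuple are hypothetical stand-ins, while encoder_hid_proj, ip_adapter_image_embeds, and ip_hidden_states come from the diff.

import torch
import torch.nn as nn


class DummyFluxTransformer(nn.Module):
    # Hypothetical stand-in for the Flux transformer: only the pieces touched
    # by the diff are modeled; dimensions and names other than
    # encoder_hid_proj / joint_attention_kwargs keys are illustrative.
    def __init__(self, embed_dim: int = 16):
        super().__init__()
        # Projection applied to IP-adapter image embeddings before they are
        # handed to the attention blocks.
        self.encoder_hid_proj = nn.Linear(embed_dim, embed_dim)

    def forward(self, hidden_states, joint_attention_kwargs=None):
        if joint_attention_kwargs is not None and "ip_adapter_image_embeds" in joint_attention_kwargs:
            ip_adapter_image_embeds = joint_attention_kwargs.pop("ip_adapter_image_embeds")
            # After the fix: the projection is an attribute of this module,
            # so it is reached via self rather than a nonexistent
            # self.transformer.
            ip_hidden_states = self.encoder_hid_proj(ip_adapter_image_embeds)
            joint_attention_kwargs.update({"ip_hidden_states": ip_hidden_states})
        return hidden_states, joint_attention_kwargs


model = DummyFluxTransformer()
out, kwargs = model(
    torch.randn(1, 4, 16),
    joint_attention_kwargs={"ip_adapter_image_embeds": torch.randn(1, 2, 16)},
)
assert "ip_hidden_states" in kwargs  # projected embeds are forwarded to the blocks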
