1 parent d45199a · commit 4b17fa2
src/diffusers/models/transformers/transformer_flux.py
@@ -384,7 +384,7 @@ def forward(
         temb: torch.Tensor,
         image_rotary_emb: Optional[Tuple[torch.Tensor, torch.Tensor]] = None,
         joint_attention_kwargs: Optional[Dict[str, Any]] = None,
-    ) -> torch.Tensor:
+    ) -> Tuple[torch.Tensor, torch.Tensor]:
         text_seq_len = encoder_hidden_states.shape[1]
         hidden_states = torch.cat([encoder_hidden_states, hidden_states], dim=1)
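The corrected annotation reflects that this `forward` concatenates the text and image token streams, processes them jointly, and returns both streams rather than a single tensor. A minimal sketch (a toy module, not the actual diffusers `FluxTransformerBlock`; the class name and body are illustrative assumptions) of that shape contract:

```python
from typing import Any, Dict, Optional, Tuple

import torch


class ToyJointBlock(torch.nn.Module):
    """Toy stand-in (NOT the real diffusers block) illustrating why the
    return annotation is Tuple[torch.Tensor, torch.Tensor]."""

    def forward(
        self,
        hidden_states: torch.Tensor,
        encoder_hidden_states: torch.Tensor,
        temb: torch.Tensor,
        image_rotary_emb: Optional[Tuple[torch.Tensor, torch.Tensor]] = None,
        joint_attention_kwargs: Optional[Dict[str, Any]] = None,
    ) -> Tuple[torch.Tensor, torch.Tensor]:
        # Joint sequence: text tokens first, then image tokens (as in the diff).
        text_seq_len = encoder_hidden_states.shape[1]
        combined = torch.cat([encoder_hidden_states, hidden_states], dim=1)
        # ... attention / feed-forward would operate on `combined` here ...
        # The block splits the joint sequence back into its two streams and
        # returns both, hence the two-tensor return type.
        return combined[:, :text_seq_len], combined[:, text_seq_len:]
```

Callers unpacking `encoder_hidden_states, hidden_states = block(...)` already relied on this two-tensor return; the diff only brings the annotation in line with the behavior.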