1 parent c771d7e commit 7ede8f2
src/diffusers/models/transformers/transformer_sd3.py
@@ -369,7 +369,6 @@ def forward(
             If `return_dict` is True, an [`~models.transformer_2d.Transformer2DModelOutput`] is returned, otherwise a
             `tuple` where the first element is the sample tensor.
         """
-        joint_attention_kwargs = joint_attention_kwargs or {}
         if joint_attention_kwargs is not None:
             joint_attention_kwargs = joint_attention_kwargs.copy()
             lora_scale = joint_attention_kwargs.pop("scale", 1.0)
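Why the removed line matters: coercing `None` to `{}` up front made the `is not None` check that follows it always true, so downstream code could no longer tell "caller passed nothing" apart from "caller passed an empty dict". Below is a minimal sketch, not the diffusers source; the helper names and the `lora_scale = 1.0` fallback in the else branch are assumptions for illustration, chosen to mirror the pattern visible in the diff.

def handle_kwargs_old(joint_attention_kwargs=None):
    # Old behavior: coerce None to {} before the check.
    joint_attention_kwargs = joint_attention_kwargs or {}
    if joint_attention_kwargs is not None:  # always True after the coercion
        joint_attention_kwargs = joint_attention_kwargs.copy()
        lora_scale = joint_attention_kwargs.pop("scale", 1.0)
    return joint_attention_kwargs, lora_scale

def handle_kwargs_new(joint_attention_kwargs=None):
    # New behavior: None passes through untouched.
    if joint_attention_kwargs is not None:
        joint_attention_kwargs = joint_attention_kwargs.copy()
        lora_scale = joint_attention_kwargs.pop("scale", 1.0)
    else:
        lora_scale = 1.0  # assumed default; the real forward() handles this case elsewhere
    return joint_attention_kwargs, lora_scale

# The extracted lora_scale is the same either way...
assert handle_kwargs_old()[1] == handle_kwargs_new()[1] == 1.0
# ...but only the new version preserves None, so later `is not None`
# checks in forward() behave as written instead of always firing.
assert handle_kwargs_old()[0] == {}    # None was silently replaced
assert handle_kwargs_new()[0] is None  # None is preserved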