Commit 87a92f7

dg845 and sayakpaul authored
Fix bug in ResnetBlock2D.forward where LoRA scale gets overwritten (#6736)
Fixes a bug in ResnetBlock2D.forward where the LoRA scale gets overwritten when USE_PEFT_BACKEND is not set and the scale_shift time-embedding norm is used. Co-authored-by: Sayak Paul <[email protected]>
1 parent 0db766b commit 87a92f7

File tree

1 file changed (+2, -2 lines)

src/diffusers/models/resnet.py

Lines changed: 2 additions & 2 deletions
@@ -384,9 +384,9 @@ def forward(
                 raise ValueError(
                     f" `temb` should not be None when `time_embedding_norm` is {self.time_embedding_norm}"
                 )
-            scale, shift = torch.chunk(temb, 2, dim=1)
+            time_scale, time_shift = torch.chunk(temb, 2, dim=1)
             hidden_states = self.norm2(hidden_states)
-            hidden_states = hidden_states * (1 + scale) + shift
+            hidden_states = hidden_states * (1 + time_scale) + time_shift
         else:
             hidden_states = self.norm2(hidden_states)
