Commit 1524122: [Transformer2DModel] don't norm twice (#1381)

Commit message: don't norm twice
Parent commit: f07a16e

1 file changed: 0 additions, 2 deletions


src/diffusers/models/attention.py

@@ -201,13 +201,11 @@ def forward(self, hidden_states, encoder_hidden_states=None, timestep=None, retu
             residual = hidden_states
 
             hidden_states = self.norm(hidden_states)
-
             if not self.use_linear_projection:
                 hidden_states = self.proj_in(hidden_states)
                 inner_dim = hidden_states.shape[1]
                 hidden_states = hidden_states.permute(0, 2, 3, 1).reshape(batch, height * weight, inner_dim)
             else:
-                hidden_states = self.norm(hidden_states)
                 inner_dim = hidden_states.shape[1]
                 hidden_states = hidden_states.permute(0, 2, 3, 1).reshape(batch, height * weight, inner_dim)
                 hidden_states = self.proj_in(hidden_states)
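
For context on what the touched code does: after this commit the group norm is applied exactly once, before the `use_linear_projection` branch, so both projection modes consume the same normalized input. Below is a minimal, self-contained sketch of that input-projection stage. The class name `ProjectIn`, the constructor arguments, and the `GroupNorm(num_groups=32, ...)` configuration are illustrative assumptions for the sketch, not the diffusers API; only the branch structure and the tensor reshapes mirror the diff above.

```python
import torch
from torch import nn


class ProjectIn(nn.Module):
    """Sketch of the input stage touched by this commit (illustrative, not the diffusers class):
    normalize once, then project, flattening before or after depending on use_linear_projection."""

    def __init__(self, in_channels: int, inner_dim: int, use_linear_projection: bool = False):
        super().__init__()
        self.use_linear_projection = use_linear_projection
        # diffusers uses a GroupNorm here; 32 groups is its default and assumes in_channels % 32 == 0
        self.norm = nn.GroupNorm(num_groups=32, num_channels=in_channels, eps=1e-6, affine=True)
        if use_linear_projection:
            self.proj_in = nn.Linear(in_channels, inner_dim)
        else:
            self.proj_in = nn.Conv2d(in_channels, inner_dim, kernel_size=1, stride=1, padding=0)

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        batch, channel, height, weight = hidden_states.shape

        hidden_states = self.norm(hidden_states)  # applied exactly once, before the branch
        if not self.use_linear_projection:
            # conv projection on (B, C, H, W), then flatten the spatial grid into a sequence
            hidden_states = self.proj_in(hidden_states)
            inner_dim = hidden_states.shape[1]
            hidden_states = hidden_states.permute(0, 2, 3, 1).reshape(batch, height * weight, inner_dim)
        else:
            # flatten first, then apply the linear projection; no second norm here (the fix)
            inner_dim = hidden_states.shape[1]
            hidden_states = hidden_states.permute(0, 2, 3, 1).reshape(batch, height * weight, inner_dim)
            hidden_states = self.proj_in(hidden_states)
        return hidden_states


# Quick check that both modes produce the same output shape:
x = torch.randn(2, 64, 8, 8)
print(ProjectIn(64, 128, use_linear_projection=False)(x).shape)  # torch.Size([2, 64, 128])
print(ProjectIn(64, 128, use_linear_projection=True)(x).shape)   # torch.Size([2, 64, 128])
```

With the old code, the `use_linear_projection=True` path normalized `hidden_states` a second time, so the two projection modes fed differently scaled activations into `proj_in`; removing the duplicate call makes both paths equivalent up to the projection type.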
