Commit e09f1b7 — "Apply suggestions from code review" (1 parent: 6dd7ff6)
1 file changed: +2 −2 lines

src/diffusers/models/transformers/transformer_cogview4.py (2 additions, 2 deletions)

@@ -583,7 +583,7 @@ def forward(self, hidden_states: torch.Tensor) -> Tuple[torch.Tensor, torch.Tensor]:
         return (freqs.cos(), freqs.sin())


-class _CogViewFinalAdaLayerNormContinuous(nn.Module):
+class CogView4AdaLayerNormContinuous(nn.Module):
     """
     CogView4-only final AdaLN: LN(x) -> Linear(cond) -> chunk -> affine.
     Matches Megatron: **no activation** before the Linear on conditioning embedding.
@@ -696,7 +696,7 @@ def __init__(
         )

         # 4. Output projection
-        self.norm_out = _CogViewFinalAdaLayerNormContinuous(inner_dim, time_embed_dim, elementwise_affine=False)
+        self.norm_out = CogView4AdaLayerNormContinuous(inner_dim, time_embed_dim, elementwise_affine=False)
         self.proj_out = nn.Linear(inner_dim, patch_size * patch_size * out_channels, bias=True)

         self.gradient_checkpointing = False
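
For context, the renamed class's docstring describes its forward pass as LN(x) -> Linear(cond) -> chunk -> affine, with no activation on the conditioning embedding. A minimal sketch of what such a module could look like is below; the exact argument names, chunk order, and broadcasting are assumptions for illustration, not the actual diffusers implementation.

```python
import torch
import torch.nn as nn


class CogView4AdaLayerNormContinuous(nn.Module):
    """Sketch of a final AdaLN: LN(x) -> Linear(cond) -> chunk -> affine.

    Note: no activation (e.g. SiLU) is applied to the conditioning
    embedding before the Linear, per the docstring in the diff.
    Parameter names and chunk order here are illustrative assumptions.
    """

    def __init__(
        self,
        embedding_dim: int,
        conditioning_embedding_dim: int,
        elementwise_affine: bool = False,
        eps: float = 1e-5,
    ):
        super().__init__()
        # LayerNorm without learnable affine; the affine transform instead
        # comes from the conditioning embedding below.
        self.norm = nn.LayerNorm(embedding_dim, eps=eps, elementwise_affine=elementwise_affine)
        # Project conditioning to per-channel shift and scale (2 * dim).
        self.linear = nn.Linear(conditioning_embedding_dim, 2 * embedding_dim, bias=True)

    def forward(self, x: torch.Tensor, conditioning_embedding: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, embedding_dim); cond: (batch, conditioning_embedding_dim)
        emb = self.linear(conditioning_embedding)
        shift, scale = emb.chunk(2, dim=-1)
        # Broadcast (batch, dim) conditioning over the sequence dimension.
        return self.norm(x) * (1 + scale)[:, None, :] + shift[:, None, :]
```

In the transformer's `__init__`, the diff constructs it as `CogView4AdaLayerNormContinuous(inner_dim, time_embed_dim, elementwise_affine=False)`, i.e. the hidden size and the time-embedding size as the two dimensions.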
