
Commit 93bd8ee

fix lumina embedding forward to not depend on weight dtype

1 parent 5956a9e

File tree

1 file changed: +1 −1 lines changed

src/diffusers/models/embeddings.py

Lines changed: 1 addition & 1 deletion

@@ -1787,7 +1787,7 @@ def __init__(self, hidden_size=4096, cross_attention_dim=2048, frequency_embeddi
     def forward(self, timestep, caption_feat, caption_mask):
         # timestep embedding:
         time_freq = self.time_proj(timestep)
-        time_embed = self.timestep_embedder(time_freq.to(dtype=self.timestep_embedder.linear_1.weight.dtype))
+        time_embed = self.timestep_embedder(time_freq.to(dtype=caption_feat.dtype))

         # caption condition embedding:
         caption_mask_float = caption_mask.float().unsqueeze(-1)
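The motivation for the one-line change above can be sketched in pure Python (a hypothetical toy, not the diffusers code): when weights are stored in a lower-precision dtype than the activations (e.g. for memory savings), deriving the cast target from the weight's dtype silently downcasts the timestep frequencies, while deriving it from `caption_feat` keeps the embedding in the same dtype as the other forward inputs.

```python
class Tensor:
    """Toy stand-in for a torch tensor: tracks only its dtype."""
    def __init__(self, dtype):
        self.dtype = dtype

    def to(self, dtype):
        # Mimics torch.Tensor.to(dtype=...): returns a tensor in the new dtype.
        return Tensor(dtype)

# Hypothetical setup: weights stored in half precision, activations in float32.
weight = Tensor("float16")        # storage dtype of timestep_embedder.linear_1.weight
caption_feat = Tensor("float32")  # dtype of the caption features passed to forward()
time_freq = Tensor("float32")     # output of the (dtype-preserving) time projection

# Old behaviour: cast target follows the weight's storage dtype -> downcast.
old = time_freq.to(weight.dtype)
# Fixed behaviour: cast target follows a forward input's dtype -> unchanged.
new = time_freq.to(caption_feat.dtype)

print(old.dtype)  # float16
print(new.dtype)  # float32
```

The fix removes the forward pass's dependence on how the weights happen to be stored, which matters under schemes that keep weights in a different dtype than the compute dtype.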

0 commit comments