1 parent 31544d4 · commit d9eabf8
src/diffusers/models/transformers/transformer_allegro.py
@@ -317,9 +317,9 @@ def forward(
         p_t = self.config.patch_size_t
         p = self.config.patch_size
 
-        post_patch_num_frames = num_frames // self.config.patch_size_temporal
-        post_patch_height = height // self.config.patch_size
-        post_patch_width = width // self.config.patch_size
+        post_patch_num_frames = num_frames // p_t
+        post_patch_height = height // p
+        post_patch_width = width // p
 
         # ensure attention_mask is a bias, and give it a singleton query_tokens dimension.
         # we may have done this conversion already, e.g. if we came here via UNet2DConditionModel#forward.
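For context, the three quantities above count how many patches tile the latent video along each axis: each spatiotemporal patch covers p_t frames and a p x p spatial block, so integer division gives the token-grid extent per dimension. A minimal standalone sketch of that computation (the concrete dimensions and patch sizes below are illustrative assumptions, not values from the commit):

# Illustrative sketch, not part of the commit: deriving post-patch sizes.
num_frames, height, width = 88, 90, 160  # hypothetical latent dimensions
p_t, p = 2, 2                            # hypothetical temporal/spatial patch sizes

# Integer division: number of whole patches along each axis.
post_patch_num_frames = num_frames // p_t  # -> 44
post_patch_height = height // p            # -> 45
post_patch_width = width // p              # -> 80

print(post_patch_num_frames, post_patch_height, post_patch_width)

Using the already-bound locals p_t and p also avoids referencing a config attribute under a different name than the one the local was read from, which is what the removed lines did.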