SUCCESS: Hunyuan with TeaCache from 90s/it to 17s/it, making Halo Strix and Hunyuan 1.5 fly. What you need to do. #3008
bkpaine1 started this conversation in Show and tell
Here are the two fixes I made to `/home/brent/ComfyUI/custom_nodes/teacache/nodes.py`:
**Fix 1: Remove the broken import (line 12)**

Changed:

```python
from comfy.ldm.flux.layers import timestep_embedding, apply_mod
from comfy.ldm.lightricks.model import precompute_freqs_cis
from comfy.ldm.lightricks.symmetric_patchifier import latent_to_pixel_coords
```

to:

```python
from comfy.ldm.flux.layers import timestep_embedding, apply_mod
from comfy.ldm.lightricks.symmetric_patchifier import latent_to_pixel_coords
```
And updated the call around line 631 from:

```python
pe = precompute_freqs_cis(fractional_coords, dim=self.inner_dim, out_dtype=x.dtype)
```

to:

```python
pe = self._precompute_freqs_cis(
    fractional_coords,
    dim=self.inner_dim,
    out_dtype=x.dtype,
    max_pos=self.positional_embedding_max_pos,
    use_middle_indices_grid=self.use_middle_indices_grid,
    num_attention_heads=self.num_attention_heads,
)
```
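If you want the patch to keep working on older ComfyUI builds where the module-level helper still exists, one option is a defensive import. This is a sketch with a hypothetical flag name, not the actual patch:

```python
# Sketch of a defensive-import variant (HAS_MODULE_LEVEL_FN is a
# hypothetical flag, not part of TeaCache): prefer the old module-level
# helper if the installed ComfyUI still ships it, otherwise fall back
# to the method on the model instance.
try:
    from comfy.ldm.lightricks.model import precompute_freqs_cis  # older ComfyUI
    HAS_MODULE_LEVEL_FN = True
except ImportError:  # newer ComfyUI moved this onto the model class
    precompute_freqs_cis = None
    HAS_MODULE_LEVEL_FN = False

# At the call site you would then branch, e.g.:
# pe = (precompute_freqs_cis(fractional_coords, dim=self.inner_dim, out_dtype=x.dtype)
#       if HAS_MODULE_LEVEL_FN
#       else self._precompute_freqs_cis(fractional_coords, dim=self.inner_dim, ...))
```

This keeps one copy of `nodes.py` usable across ComfyUI versions instead of hard-coding the edit for the newest one.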
**Fix 2: Update the Hunyuan Video function signature for 1.5 compatibility**

The `teacache_hunyuanvideo_forward` function signature was outdated. Changed from:

```python
def teacache_hunyuanvideo_forward(
    self,
    img: Tensor,
    img_ids: Tensor,
    txt: Tensor,
    txt_ids: Tensor,
    txt_mask: Tensor,
    timesteps: Tensor,
    y: Tensor,
    guidance: Tensor = None,
    guiding_frame_index=None,
    ref_latent=None,
    control=None,
    transformer_options={},
) -> Tensor:
```
to:

```python
def teacache_hunyuanvideo_forward(
    self,
    img: Tensor,
    img_ids: Tensor,
    txt: Tensor,
    txt_ids: Tensor,
    txt_mask: Tensor,
    timesteps: Tensor,
    y: Tensor = None,
    txt_byt5=None,
    clip_fea=None,
    guidance: Tensor = None,
    guiding_frame_index=None,
    ref_latent=None,
    disable_time_r=False,
    control=None,
    transformer_options={},
) -> Tensor:
```
I also updated the function body to handle the new Hunyuan 1.5 features (`time_r_in`, `byt5_in`, `vision_in`, `cond_type_embedding`) and to pass `transformer_options` through to the block calls.
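If you'd rather not chase every upstream signature change by hand, a more future-proof pattern for a monkey patch like this is to accept unknown keyword arguments and forward them to the original implementation. A generic sketch with illustrative names (`make_patched_forward`, `DummyModel` are not part of TeaCache or ComfyUI):

```python
# Generic sketch: wrap an original forward() and pass any unrecognized
# keyword arguments straight through, so new upstream parameters
# (txt_byt5, disable_time_r, ...) don't break the patch.
def make_patched_forward(original_forward):
    def patched_forward(self, img, timesteps, **kwargs):
        # A real TeaCache patch would decide here whether to reuse a
        # cached residual; this sketch always delegates to the original.
        return original_forward(self, img, timesteps, **kwargs)
    return patched_forward

class DummyModel:
    """Stand-in for the Hunyuan model, just to demonstrate forwarding."""
    def forward(self, img, timesteps, txt_byt5=None, disable_time_r=False, **_):
        return (img, timesteps, txt_byt5, disable_time_r)

# Capture the original, then replace it with the wrapper.
DummyModel.forward = make_patched_forward(DummyModel.forward)

m = DummyModel()
result = m.forward(1, 2, txt_byt5="byt5", disable_time_r=True)
# result == (1, 2, "byt5", True): the new keywords reached the original.
```

The trade-off is that `**kwargs` forwarding only works when the patched body doesn't need to inspect the new arguments; features like `time_r_in` that change the computation itself still require editing the body, as done above.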