🐛 Bug Report: NoneType object has no attribute cache_context when switching from transformer to transformer2
Description
I'm using the WanPipeline from the diffusers library to generate video frames. When running the pipeline, I hit an error during inference, specifically at the step where the model switches from transformer to transformer_2. The error comes from cache_context("cond") being called on a NoneType object.
Code Snippet
import torch
from diffusers import WanPipeline
from diffusers.schedulers.scheduling_unipc_multistep import UniPCMultistepScheduler

dtype = torch.bfloat16
device = "cuda:1"

# Wan2.2 T2V A14B ships two transformers; boundary_ratio controls where the
# pipeline hands off from one to the other during denoising.
pipe = WanPipeline.from_pretrained(
    "/data/code/haobang.geng/models/Wan2.2-T2V-A14B-Diffusers",
    torch_dtype=dtype,
    boundary_ratio=0.9,
)
pipe.to(device)
pipe.scheduler = UniPCMultistepScheduler.from_config(pipe.scheduler.config, flow_shift=3.0)

output = pipe(
    prompt="Two anthropomorphic cats in comfy boxing gear and bright gloves fight intensely on a spotlighted stage.",
    # Standard Wan Chinese negative prompt (roughly: "garish colors, overexposed,
    # static, blurry details, subtitles, style, artwork, painting, image, still,
    # overall gray, worst quality, low quality, JPEG compression artifacts, ugly,
    # mutilated, extra fingers, poorly drawn hands/faces, deformed, disfigured,
    # malformed limbs, fused fingers, motionless frame, cluttered background,
    # three legs, crowded background, walking backwards").
    negative_prompt=(
        "色调艳丽，过曝，静态，细节模糊不清，字幕，风格，作品，画作，画面，静止，"
        "整体发灰，最差质量，低质量，JPEG压缩残留，丑陋的，残缺的，多余的手指，"
        "画得不好的手部，画得不好的脸部，畸形的，毁容的，形态畸形的肢体，手指融合，"
        "静止不动的画面，杂乱的背景，三条腿，背景人很多，倒着走"
    ),
    height=832,
    width=480,
    num_frames=81,
    guidance_scale=3.5,
    num_inference_steps=40,
).frames[0]
Error Trace
The config attributes {'clip_output': False} were passed to AutoencoderKLWan, but are not expected and will be ignored. Please verify your config.json configuration file.
Loading checkpoint shards: 100%|████████████████████████████████████████████████| 5/5 [00:00<00:00, 6.70it/s]
Loading checkpoint shards: 100%|████████████████████████████████████████████████| 3/3 [00:00<00:00, 33.85it/s]
Loading pipeline components...: 100%|███████████████████████████████████████████| 5/5 [00:02<00:00, 1.76it/s]
 25%|████████████████████                                                        | 10/40 [00:14<00:44, 1.48s/it]
Traceback (most recent call last):
File "/data/code/haobang.geng/code/vdm2consistencyimg/workers/wan_t2v.py", line 32, in <module>
output = pipe(
^^^^^
File "/vepfs/conda_env/haobang.geng/lib/python3.11/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/data/code/haobang.geng/code/vdm2consistencyimg/workers/diffusers/src/diffusers/pipelines/wan/pipeline_wan.py", line 592, in __call__
with current_model.cache_context("cond"):
^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'cache_context'
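For context: the failing with statement sits inside the denoising loop, which picks current_model per timestep. Below is a hedged reconstruction of that dispatch based only on the traceback (names taken from the trace; the actual diffusers source may differ), showing how current_model can end up None:

# Hypothetical reconstruction of the per-step model selection inside
# WanPipeline.__call__ -- NOT the verbatim diffusers source.
boundary_timestep = self.config.boundary_ratio * self.scheduler.config.num_train_timesteps
for i, t in enumerate(timesteps):
    if t >= boundary_timestep:
        current_model = self.transformer    # high-noise expert, early steps
    else:
        current_model = self.transformer_2  # low-noise expert; None if never loaded
    with current_model.cache_context("cond"):  # AttributeError when current_model is None
        noise_pred = current_model(...)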
Additional Info
- The error occurs at the point in the inference loop where the pipeline switches models (num_inference_steps=40; the progress bar stops at step 10/40 -- see the check below).
- PyTorch 2.6.0+cu124, diffusers installed from source.
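Assuming the boundary is boundary_ratio * num_train_timesteps and the scheduler uses the usual flow-shift mapping t(s) = shift * s / (1 + (shift - 1) * s) (both assumptions, not checked against the source), the failing step is even predictable:

# Back-of-envelope check of where the handoff lands (formulas assumed above):
shift, boundary_ratio = 3.0, 0.9
# t/1000 = shift*s / (1 + (shift-1)*s) falls below boundary_ratio once:
s = boundary_ratio / (shift - (shift - 1) * boundary_ratio)  # = 0.75
# i.e. 25% of the way through the descending schedule, which for 40 steps
# is step 10 -- exactly where the run crashes (10/40).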
Suggested Fix
It seems current_model is not properly assigned when the pipeline switches to transformer_2. A check or fallback may be needed before calling cache_context.
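A minimal sketch of such a guard, under the same assumptions as the reconstruction above (hypothetical code, not a drop-in patch):

# Hypothetical guard before entering the cache context:
if current_model is None:
    raise ValueError(
        "boundary_ratio is set but the transformer needed for this timestep "
        "was not loaded; load both transformer and transformer_2, or unset "
        "boundary_ratio."
    )
with current_model.cache_context("cond"):
    ...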
Reproduction
python xx.py
Logs
System Info
- PyTorch 2.6.0+cu124
- diffusers installed from source
Who can help?
No response