
Custom Wan diffusion LoRA runs without error but doesn't apply its effect and gives warning: No LoRA keys associated to WanTransformer3DModel found with the prefix='transformer'. #11657

@st-projects-00

Description


Describe the bug

I run the diffusers pipeline using the standard process with a custom diffusers-trained LoRA:

pipe = WanPipeline.from_pretrained(model_id, vae=vae, torch_dtype=torch.bfloat16)
pipe.scheduler = scheduler
pipe.load_lora_weights("lora/customdiffusers_lora.safetensors")
etc...

It runs without error, but the effect is not applied, and I see the following warning:
No LoRA keys associated to WanTransformer3DModel found with the prefix='transformer'. This is safe to ignore if LoRA state dict didn't originally have any WanTransformer3DModel related params. You can also try specifying prefix=None to resolve the warning. Otherwise, open an issue if you think it's unexpected: https://github.com/huggingface/diffusers/issues/new

Is there any config file I need to change for this to work? Thanks
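This warning usually means the keys stored in the LoRA file don't carry the `transformer.` prefix that the loader looks for when matching parameters to `WanTransformer3DModel`. One workaround, before filing a fix, is to remap the prefix on the state dict yourself. The sketch below uses hypothetical key names (`diffusion_model.` is an assumed example prefix, not taken from the actual file) to illustrate the remapping logic:

```python
def remap_lora_prefix(state_dict, old_prefix, new_prefix="transformer."):
    """Replace old_prefix with new_prefix on every matching key.

    Keys that don't start with old_prefix are passed through unchanged.
    """
    remapped = {}
    for key, value in state_dict.items():
        if key.startswith(old_prefix):
            remapped[new_prefix + key[len(old_prefix):]] = value
        else:
            remapped[key] = value
    return remapped


# Placeholder state dict; values don't matter for the key logic.
sd = {
    "diffusion_model.blocks.0.attn1.to_q.lora_A.weight": None,
    "diffusion_model.blocks.0.attn1.to_q.lora_B.weight": None,
}
fixed = remap_lora_prefix(sd, old_prefix="diffusion_model.")
print(sorted(fixed))
```

A remapped dict can then be passed to `pipe.load_lora_weights(...)` directly, since it accepts an in-memory state dict as well as a path. To see which prefix the file actually uses, load it with `safetensors.torch.load_file` and print a few keys first.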

Reproduction

N/A, since it uses a custom LoRA.

Logs

System Info

diffusers 0.33, Linux, Python 3.10

Who can help?

No response

Labels: bug