
Kandinsky5TimeEmbeddings hardcodes 'cuda' in @torch.autocast decorator, causing warning on non-CUDA systems #12809

@knd0331

Describe the bug

When importing diffusers on a non-CUDA system (e.g., an Apple Silicon Mac with MPS), a warning is emitted:

/torch/amp/autocast_mode.py:270: UserWarning: User provided device_type of 'cuda', but CUDA is not available. Disabling
  warnings.warn(

This occurs because the Kandinsky5TimeEmbeddings class has a hardcoded "cuda" device type in its
@torch.autocast decorator.

Location

File: diffusers/models/transformers/transformer_kandinsky.py
Line: 168

@torch.autocast(device_type="cuda", dtype=torch.float32)
def forward(self, timestep):

Root Cause

The decorator is evaluated at import time (when the class body is defined), not at runtime. On
systems without CUDA (like Apple Silicon Macs using MPS), constructing the autocast context with
device_type="cuda" therefore triggers the warning even though the Kandinsky model may never be used.
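
Possible fix

One way to avoid the import-time evaluation would be to enter the autocast context at call time, and only when the input actually lives on a CUDA device. The following is a minimal sketch, not a tested patch against the real class: the __init__, proj layer, and _compute_embedding helper below are simplified stand-ins for the actual implementation in transformer_kandinsky.py.

import torch
import torch.nn as nn

class Kandinsky5TimeEmbeddings(nn.Module):
    def __init__(self, dim: int = 256):
        super().__init__()
        # Stand-in projection; the real class builds its own embedding layers.
        self.proj = nn.Linear(1, dim)

    def _compute_embedding(self, timestep: torch.Tensor) -> torch.Tensor:
        # Hypothetical helper standing in for the body of the current forward().
        return self.proj(timestep.float().unsqueeze(-1))

    def forward(self, timestep: torch.Tensor) -> torch.Tensor:
        # Enter autocast at call time rather than via a class-level decorator,
        # so nothing CUDA-specific is evaluated at import time on CPU/MPS-only machines.
        if timestep.device.type == "cuda":
            with torch.autocast(device_type="cuda", dtype=torch.float32):
                return self._compute_embedding(timestep)
        # On CPU/MPS there is no CUDA autocast region to override; run as-is.
        return self._compute_embedding(timestep)

This would keep the existing float32 behavior on CUDA while leaving the CPU/MPS path (and import) untouched.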

Reproduction

# On a Mac with Apple Silicon (no CUDA)
from diffusers import ZImagePipeline  # or any pipeline
# Warning appears immediately on import

Expected behavior

No warning should appear when importing diffusers on non-CUDA systems.

Environment

- OS: macOS (Apple Silicon M-series)
- Python: 3.13
- PyTorch: 2.x (MPS backend)
- Diffusers: latest
