
How to add flux1-fill-dev-fp8.safetensors #11418

@SlimRG

Description


Describe the bug

Hi!
How can I use flux1-fill-dev-fp8.safetensors in diffusers?

Right now I have this code:

import logging

import torch
from diffusers import FluxFillPipeline

logger = logging.getLogger(__name__)


def init_pipeline(device: str):
    logger.info(f"Loading FLUX Inpaint Pipeline (Fill-dev) on {device}")
    pipe = FluxFillPipeline.from_pretrained(
        "black-forest-labs/FLUX.1-Fill-dev",
        torch_dtype=torch.bfloat16,
        trust_remote_code=True,
    ).to(device)
    logger.info("Pipeline loaded successfully")
    return pipe
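
For context, a minimal sketch of how the returned pipeline would be called; the parameters follow the FLUX.1-Fill-dev model card, and the image/mask paths are placeholders:

import torch
from diffusers.utils import load_image

pipe = init_pipeline("cuda")
image = load_image("input.png")  # placeholder: image to inpaint
mask = load_image("mask.png")    # placeholder: white = region to fill

result = pipe(
    prompt="a white paper cup",
    image=image,
    mask_image=mask,
    height=1024,
    width=1024,
    guidance_scale=30,
    num_inference_steps=50,
    max_sequence_length=512,
    generator=torch.Generator("cpu").manual_seed(0),
).images[0]
result.save("flux-fill-dev.png")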

Another attempt:

import torch
from diffusers import FluxFillPipeline, FluxTransformer2DModel, GGUFQuantizationConfig

transformer = FluxTransformer2DModel.from_single_file(
    "https://huggingface.co/YarvixPA/FLUX.1-Fill-dev-gguf/blob/main/flux1-fill-dev-Q4_0.gguf",
    quantization_config=GGUFQuantizationConfig(compute_dtype=torch.bfloat16),
    torch_dtype=torch.bfloat16,
)

pipe = FluxFillPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-Fill-dev",
    transformer=transformer,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
).to(device)

pipe.enable_model_cpu_offload()
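
For reference, a minimal sketch of what I would expect to work for the fp8 single-file checkpoint itself (the URL is the one from the Reproduction links below; this assumes from_single_file can infer the Fill transformer config from the checkpoint and upcasts the fp8 weights to the requested torch_dtype on load):

import torch
from diffusers import FluxFillPipeline, FluxTransformer2DModel

# Assumption: the fp8 weights are upcast to bfloat16 when loaded this way.
transformer = FluxTransformer2DModel.from_single_file(
    "https://huggingface.co/pengxian/diffusion_models/blob/main/flux1-fill-dev_fp8.safetensors",
    torch_dtype=torch.bfloat16,
)

pipe = FluxFillPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-Fill-dev",
    transformer=transformer,
    torch_dtype=torch.bfloat16,
)
pipe.enable_model_cpu_offload()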

Reproduction

https://huggingface.co/boricuapab/flux1-fill-dev-fp8/blob/main/README.md
https://huggingface.co/pengxian/diffusion_models/blob/main/flux1-fill-dev_fp8.safetensors

Logs

System Info

Windows 11
Python 3.11

Who can help?

No response
