
FluxImg2ImgPipeline produces output almost identical to the input image #10210

@CyberVy

Describe the bug

When providing an input image and a prompt, the resulting output image is nearly identical to the input image, regardless of how I adjust the parameters. The differences are so minor that they are almost imperceptible.

Even after setting the strength parameter to a very high value, the output still doesn't seem to reflect the given prompt. It appears as though the prompt is not being considered in the generation process.
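For reference, in diffusers image-to-image pipelines the strength parameter normally controls both how strongly the input latents are noised and how many of the scheduled denoising steps actually run, so a strength of 0.7 should rewrite a large part of the image. A minimal sketch of that relationship (illustrative variable names, not FluxImg2ImgPipeline's exact internals):

# Hedged sketch: how strength typically maps onto the denoising schedule
# in diffusers img2img pipelines (illustrative, not the exact Flux code).
num_inference_steps = 25
strength = 0.7

init_timestep = min(int(num_inference_steps * strength), num_inference_steps)
t_start = max(num_inference_steps - init_timestep, 0)

# Denoising resumes from a partially noised copy of the input image and runs
# the remaining steps; with strength=0.7 that is roughly 17-18 of the 25 steps.
print(f"steps actually run: {num_inference_steps - t_start}")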

Reproduction

from diffusers import FluxPipeline, FluxImg2ImgPipeline
import torch


# The transformer model is from https://huggingface.co/Kijai/flux-fp8/blob/main/flux1-dev-fp8.safetensors
pipeline = FluxPipeline(...)
pipeline.enable_model_cpu_offload()

# Text-to-image pass to generate the input image
prompt = "a cat with wing flying in the sky"
num_inference_steps = 25
guidance_scale = 3
seed = 15827082008518028345
image = pipeline(
    prompt=prompt,
    num_inference_steps=num_inference_steps,
    guidance_scale=guidance_scale,
    generator=torch.Generator(pipeline.device).manual_seed(seed),
).images[0]


# Image-to-image pass reusing the same components
i2i_pipeline = FluxImg2ImgPipeline(**pipeline.components)
i2i_pipeline.enable_model_cpu_offload()

i2i_prompt = "in a forest"
i2i_seed = 888
strength = 0.7
i2i_image = i2i_pipeline(
    image=image,
    prompt=i2i_prompt,
    strength=strength,
    num_inference_steps=num_inference_steps,
    guidance_scale=guidance_scale,
    generator=torch.Generator(i2i_pipeline.device).manual_seed(i2i_seed),
).images[0]
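Side note on the repro: the same component sharing can presumably also be done with from_pipe, which diffusers provides for reusing already-loaded models across pipelines; a minimal sketch, assuming a diffusers version where DiffusionPipeline.from_pipe is available (this is not part of the original report):

# Hedged alternative to FluxImg2ImgPipeline(**pipeline.components):
# from_pipe reuses the already-loaded models without duplicating them.
# Assumes a diffusers release that exposes DiffusionPipeline.from_pipe.
i2i_pipeline = FluxImg2ImgPipeline.from_pipe(pipeline)
i2i_pipeline.enable_model_cpu_offload()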

image (text-to-image output): [screenshot "Colab (44)"]
i2i_image (image-to-image output): [screenshot "Colab (45)"]

As you can see, the result from the image-to-image pipeline is quite similar to the input image, even though the strength is 0.7.
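To make "quite similar" concrete, the two outputs can be compared numerically; a quick check along these lines (assuming both images have the same resolution, as in the repro above, and that numpy is available):

# Hedged sanity check: quantify how close the img2img output is to its input.
# `image` and `i2i_image` are the PIL images from the reproduction above.
import numpy as np

a = np.asarray(image, dtype=np.float32)
b = np.asarray(i2i_image, dtype=np.float32)
# A value near 0 (out of 255) means the output barely changed.
print("mean absolute pixel difference:", np.abs(a - b).mean())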

Logs

No response

System Info

  • 🤗 Diffusers version: 0.32.0.dev0
  • Platform: Linux-6.6.56+-x86_64-with-glibc2.35
  • Running on Google Colab?: No
  • Python version: 3.10.14
  • PyTorch version (GPU?): 2.4.0 (True)
  • Flax version (CPU?/GPU?/TPU?): 0.8.4 (gpu)
  • Jax version: 0.4.26
  • JaxLib version: 0.4.26.dev20240620
  • Huggingface_hub version: 0.26.2
  • Transformers version: 4.46.3
  • Accelerate version: 1.1.1
  • PEFT version: 0.14.0
  • Bitsandbytes version: not installed
  • Safetensors version: 0.4.5
  • xFormers version: not installed
  • Accelerator: Tesla T4, 15360 MiB
    Tesla T4, 15360 MiB
  • Using GPU in script?: Yes
  • Using distributed or parallel set-up in script?: No

Who can help?

@yiyixuxu @DN6
