Describe the bug
When running stable-diffusion-2-1 I get a runtime warning, "RuntimeWarning: invalid value encountered in cast
images = (images * 255).round().astype("uint8")", and the output image is black.
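As a minimal sketch (just an illustration with plain NumPy, assuming the decoded image array contains NaNs), the same warning together with an all-black result can be reproduced like this:
import numpy as np
# A fake decoded image filled with NaNs, standing in for a broken fp16 VAE/UNet output (assumption)
images = np.full((1, 64, 64, 3), np.nan, dtype=np.float32)
# Same cast as in diffusers' image_processor.py; recent NumPy versions emit
# "RuntimeWarning: invalid value encountered in cast" here
out = (images * 255).round().astype("uint8")
# On most platforms the NaNs are converted to 0, which renders as an all-black image
print(out.min(), out.max())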

Reproduction
My code:
import torch
from diffusers import StableDiffusionPipeline, DPMSolverMultistepScheduler
model_id = "stabilityai/stable-diffusion-2-1"
# Use the DPMSolverMultistepScheduler (DPM-Solver++) scheduler here instead
pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.float16)
pipe.scheduler = DPMSolverMultistepScheduler.from_config(pipe.scheduler.config)
pipe = pipe.to("cuda")
prompt = "a photo of an astronaut riding a horse on mars"
image = pipe(prompt).images[0]
image.save("astronaut_rides_horse.png")
I got it from here: https://huggingface.co/stabilityai/stable-diffusion-2-1
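For reference, a small sketch I could use to confirm that the NaNs already appear before the uint8 cast (assuming output_type="np" returns the raw float image arrays, as documented for this pipeline):
import numpy as np
import torch
from diffusers import StableDiffusionPipeline, DPMSolverMultistepScheduler
model_id = "stabilityai/stable-diffusion-2-1"
pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.float16)
pipe.scheduler = DPMSolverMultistepScheduler.from_config(pipe.scheduler.config)
pipe = pipe.to("cuda")
prompt = "a photo of an astronaut riding a horse on mars"
# output_type="np" skips the PIL conversion, so the float values can be inspected directly
images = pipe(prompt, output_type="np").images
print("contains NaN:", np.isnan(images).any())
print("value range:", np.nanmin(images), np.nanmax(images))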
Logs
Output:
PS C:\AI\diffusion> & c:/AI/diffusion/.conda/python.exe c:/AI/diffusion/main.py
model_index.json: 100%|████████████████████████████████████| 537/537 [00:00<00:00, 556kB/s]
C:\AI\diffusion\.conda\lib\site-packages\huggingface_hub\file_download.py:149: UserWarning: `huggingface_hub` cache-system uses symlinks by default to efficiently store duplicated files but your machine does not support them in C:\Users\user\.cache\huggingface\hub\models--stabilityai--stable-diffusion-2-1. Caching files will still work but in a degraded version that might require more space on your disk. This warning can be disabled by setting the `HF_HUB_DISABLE_SYMLINKS_WARNING` environment variable. For more details, see https://huggingface.co/docs/huggingface_hub/how-to-cache#limitations.
To support symlinks on Windows, you either need to activate Developer Mode or to run Python as an administrator. In order to see activate developer mode, see this article: https://docs.microsoft.com/en-us/windows/apps/get-started/enable-your-device-for-development
  warnings.warn(message)
(…)ature_extractor/preprocessor_config.json: 100%|████████████████████████████████████| 342/342 [00:00<?, ?B/s]
tokenizer/tokenizer_config.json: 100%|████████████████████████████████████| 824/824 [00:00<?, ?B/s]
tokenizer/special_tokens_map.json: 100%|████████████████████████████████████| 460/460 [00:00<?, ?B/s]
tokenizer/merges.txt: 100%|████████████████████████████████████| 525k/525k [00:00<00:00, 29.9MB/s]
scheduler/scheduler_config.json: 100%|████████████████████████████████████| 345/345 [00:00<?, ?B/s]
tokenizer/vocab.json: 100%|████████████████████████████████████| 1.06M/1.06M [00:00<00:00, 23.0MB/s]
text_encoder/config.json: 100%|████████████████████████████████████| 633/633 [00:00<?, ?B/s]
vae/config.json: 100%|████████████████████████████████████| 611/611 [00:00<?, ?B/s]
unet/config.json: 100%|████████████████████████████████████| 939/939 [00:00<?, ?B/s]
diffusion_pytorch_model.safetensors: 100%|████████████████████████████████████| 335M/335M [00:11<00:00, 29.8MB/s]
model.safetensors: 100%|████████████████████████████████████| 1.36G/1.36G [00:31<00:00, 42.7MB/s]
diffusion_pytorch_model.safetensors: 100%|████████████████████████████████████| 3.46G/3.46G [01:17<00:00, 44.7MB/s]
Fetching 13 files: 100%|████████████████████████████████████| 13/13 [01:18<00:00,  6.07s/it]
Loading pipeline components...: 100%|████████████████████████████████████| 6/6 [00:02<00:00,  2.71it/s]
100%|████████████████████████████████████| 50/50 [04:57<00:00,  5.95s/it]
C:\AI\diffusion\.conda\lib\site-packages\diffusers\image_processor.py:90: RuntimeWarning: invalid value encountered in cast
  images = (images * 255).round().astype("uint8")
PS C:\AI\diffusion> ^C
PS C:\AI\diffusion>
System Info
- diffusers version: 0.26.0
- Platform: Windows-10-10.0.22621-SP0
- Python version: 3.10.13
- PyTorch version (GPU?): 2.1.0+cu121 (True)
- Huggingface_hub version: 0.20.3
- Transformers version: 4.37.2
- Accelerate version: 0.26.1
- xFormers version: not installed
- Using GPU in script?: yes, NVIDIA GeForce GTX 1660 Ti (6 GB of VRAM)
- Using distributed or parallel set-up in script?: I don't understand this one
Who can help?
No response