Describe the Issue
Dear @LostRuins,
Thanks for releasing v1.101.1 and supporting Qwen Image generation models! I wanted to try it for its higher prompt adherence, but although the model plus the VAE/clip files you recommended load without any errors, the generated image is always black (see below).
Additional Information:
Operating System = RHEL 9.6
CPU = AMD Ryzen 9 7950X
GPUs = 2x RTX A6000 (48 GB VRAM each)
KoboldCpp Version = 1.101
Here is the code line that I run (embedded within a systemd service):
/usr/bin/python /opt/koboldcpp-latest/koboldcpp.py /home/frillm/Models/Language/Llama-4-Scout-17B-16E-Instruct-Q6_K-00001-of-00002.gguf 8008 --multiuser 20 --highpriority --usecublas --websearch --tensor_split 41 57 --gpulayers 40 --nommap --preloadstory /opt/settings.json --sdmodel /home/frillm/Models/Images/Qwen/Qwen-Image-Edit-2509-Q4_K_S.gguf --sdvae /home/frillm/Models/Images/Qwen/qwen_image_vae.safetensors --sdclip1 /home/frillm/Models/Images/Qwen/Qwen2.5-VL-7B-Instruct.Q4_K_S.gguf --sdclip2 /home/frillm/Models/Images/Qwen/Qwen2.5-VL-7B-Instruct.mmproj-Q8_0.gguf --whispermodel /home/frillm/Models/Whisper/whisper-base.en-q5_1.bin
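For reference, here is a minimal sketch (not part of the original report) of how the generation could be tested outside the web UI, assuming KoboldCpp's A1111-compatible /sdapi/v1/txt2img endpoint on the same port (8008); the prompt, size, and step values are placeholders:

```python
# Sketch: query the txt2img endpoint directly to check whether the
# black image also occurs outside the Kobold main UI / sdui page.
import base64
import requests

resp = requests.post(
    "http://localhost:8008/sdapi/v1/txt2img",
    json={
        "prompt": "a red apple on a wooden table",
        "width": 512,
        "height": 512,
        "steps": 20,
    },
    timeout=600,
)
resp.raise_for_status()

# The response carries the generated image(s) as base64-encoded PNG data.
png_bytes = base64.b64decode(resp.json()["images"][0])
with open("test.png", "wb") as f:
    f.write(png_bytes)

# A fully black PNG compresses to only a few hundred bytes, so a tiny
# file size here is a quick hint that the output really is blank.
print(f"wrote test.png ({len(png_bytes)} bytes)")
```

If the image saved this way is also black, the problem is in the backend generation rather than in the UI rendering.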
Here is what I get in the Kobold main UI:
Same if I use the /sdui endpoint:
Thanks for your help and best wishes,
C.