Describe the bug
I tested lpw_stable_diffusion_xl: it works with StableDiffusionXLPipeline.from_pretrained but does not work with StableDiffusionXLPipeline.from_single_file. To check whether the prompt is actually being truncated, or whether the warning is just an ignorable log, I deleted the truncated part of the prompt and compared the outputs. Here are the results:
- The prompt: [image] the "open mouth" and aesthetic tags are truncated.
- After removing the truncated part of the prompt: [image]

Here is the same prompt and settings generated with from_pretrained using the same model, but the diffusers version:
- "open mouth" and aesthetic tags included: [image]
- Without the truncated prompts: [image]
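For comparison, this is roughly the working from_pretrained call described above (a minimal sketch; the repo id is a placeholder for the diffusers-format version of the same model):

import torch
from diffusers import StableDiffusionXLPipeline

# Placeholder repo id: the diffusers-format version of the same checkpoint.
pipe = StableDiffusionXLPipeline.from_pretrained(
    "repo/diffusers-version-of-model",
    torch_dtype=torch.float16,
    custom_pipeline="lpw_stable_diffusion_xl",
)
pipe.to("cuda")

With this loading path, the lpw_stable_diffusion_xl community pipeline handles prompts longer than 77 tokens, so nothing is truncated.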
Reproduction
pipe = StableDiffusionXLPipeline.from_single_file(
model_path,
torch_dtype=torch.float16,
custom_pipeline="lpw_stable_diffusion_xl",
use_safetensors=True,
)
pipe.to('cuda')
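Calling the loaded pipeline with a prompt longer than 77 CLIP tokens then produces the warning shown in the logs below. The prompt here is a hypothetical stand-in, since the actual prompt is not included in the report:

# Hypothetical long prompt (over 77 CLIP tokens).
prompt = "1girl, solo, smile, " * 10 + "open mouth, masterpiece, best quality, very aesthetic, absurdres,"
image = pipe(prompt=prompt).images[0]
image.save("truncated_output.png")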
Logs
indices sequence length is longer than the specified maximum sequence length for this model (92 > 77). Running this sequence through the model will result in indexing errors
The following part of your input was truncated because CLIP can only handle sequences up to 77 tokens: [', open mouth, masterpiece, best quality, very aesthetic, absurdres,']
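As a sanity check, the sequence length the first warning refers to can be reproduced with the pipeline's own CLIP tokenizer (a minimal sketch, assuming pipe and prompt from the reproduction above):

# Tokenizing without truncation shows the raw sequence length CLIP would see.
ids = pipe.tokenizer(prompt).input_ids
print(len(ids))  # 92 for the report's prompt; anything above 77 triggers the warning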
System Info
- diffusers version: 0.28.0.dev0
- Platform: Windows-10-10.0.22631-SP0
- Python version: 3.10.11
- PyTorch version (GPU?): 2.2.2+cu118 (True)
- Huggingface_hub version: 0.22.2
- Transformers version: 4.39.3
- Accelerate version: 0.28.0
- xFormers version: not installed
- Using GPU in script?: yes
- Using distributed or parallel set-up in script?: no
Who can help?