
Conversation

leejet (Owner) commented Sep 10, 2025

It looks like Flash Attention can now be used in SD3.

 .\bin\Release\sd.exe -m ..\models\sd3_medium_incl_clips_t5xxlfp16.safetensors -p "a lovely cat" --cfg-scale 4.5 --sampling-method euler -v --clip-on-cpu --diffusion-fa
[output image]

@leejet leejet merged commit b017918 into master Sep 10, 2025
8 checks passed
wbruna (Contributor) commented Sep 10, 2025

It works in the sense that it doesn't crash or mess up the image, but I kept the warning because the flag doesn't actually do anything either: the diffusion_flash_attn parameter isn't even being forwarded to the diffusion model.

Green-Sky (Contributor)

Right, I think I did not wire up SD3.

leejet (Owner, Author) commented Sep 10, 2025

Check this PR: #815

@leejet leejet deleted the rm-sd3-fa-warn branch September 16, 2025 15:34
