flux: flux1-schnell-fp8.safetensors weight_dtype with fp8 or fp16? (mac book) #4590
Unanswered
movelikeriver asked this question in Q&A
Replies: 0 comments
- The UNet checkpoint is `flux1-schnell-fp8.safetensors`, but `weight_dtype` can't be set to fp8 on MPS (MacBook); it has to stay `default` (model weight dtype torch.bfloat16, manual cast: None). Is this an issue?