Replies: 1 comment
-
OK ... so there was a bug in the Flux UNet detection code of extract-unet-safetensor. I gave it another try and it works in Forge!
You end up with a 12 GB flux1-dev-fp8_unet.safetensors (versus 17.2 GB for the original flux1-dev-fp8.safetensors), so about 5 GB freed ... imagine doing this with all our SD/SDXL checkpoints! Then you put flux1-dev-fp8_unet.safetensors in your models/Stable-diffusion folder, select it for inference in Forge, and also select the VAE and text encoders (VAE: flux1-dev-vae-float16.safetensors / text encoders: flux1-dev-clip_l.safetensors + flux1-dev-t5xxl_fp16.safetensors). Enter a prompt, hit generate ... profit! Not sure how to make it work with SDXL models; I tried before and ended up with errors ... will check later.
-
Hi there.
It would be interesting to have support in Forge for these "UNet only" checkpoints, which are much lighter than full ones.
Related to this, there is a project on GitHub to extract the UNet from checkpoints:
https://github.com/captainzero93/extract-unet-safetensor
Just imagine the space freed on our hard drives ...