error when trying to run on colab #3094
Replies: 2 comments
-
Try using: --opt-split-attention-invokeai. If that doesn't work, switch colabs to TheLastBen's Fast Stable Diffusion. That particular colab saves the repo to your Google Drive and also symlinks some cached files, so it loads much quicker after the first run. (It will use about 7-10 GB to store all the models, even those downloaded while running the UI, hence the fast start-up.) Ensure that your Colab runtime is set to GPU.
Optional: add a code cell after the 'Installing AUTOMATIC1111 repo' cell and paste in this: !wget https://huggingface.co/hakurei/waifu-diffusion-v1-3/resolve/main/wd-v1-3-float32.ckpt -O /content/gdrive/MyDrive/sd/stable-diffusion-webui/models/Stable-diffusion/wd-v1-3-float32.ckpt
Then run all the cells. That will download the Waifu model, which I presume is the one you want to use. Once downloaded, the Waifu model will appear in the model list within the UI settings. As a quick hack to load the model directly, add a cell immediately after 'Installing xformers' and paste: See how that goes.
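In case it helps, here is a minimal sketch of how a flag like the one above is usually passed to the AUTOMATIC1111 launcher: the webui reads extra options from the COMMANDLINE_ARGS environment variable. The exact cell contents vary by colab, so treat this as an assumption, not the notebook's actual code.

```shell
# Hypothetical Colab cell: pass the InvokeAI attention-split option
# through COMMANDLINE_ARGS before the launch cell runs.
export COMMANDLINE_ARGS="--opt-split-attention-invokeai"
echo "launching with: $COMMANDLINE_ARGS"
```

The launch script picks the variable up automatically, so no other cell needs to change.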
-
You've probably seen this message and ignored it: "You cannot currently connect to a GPU due to usage limits in Colab." Wait 12 hours.
-
I get this error once it's done installing the float32 ckpt:
Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS variable to disable this check
Any fix? I'm running this one: https://colab.research.google.com/github/camenduru/stable-diffusion-webui-colab/blob/main/waifu_diffusion_webui_colab.ipynb#scrollTo=06AIpiSp0OX1
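For what it's worth, the error message itself names a workaround: setting --skip-torch-cuda-test in COMMANDLINE_ARGS skips the CUDA check so the UI can start, but it will then fall back to CPU and be extremely slow, so the real fix is getting a GPU runtime (as the reply above says). A hedged sketch of that workaround cell, assuming the notebook reads COMMANDLINE_ARGS before launch:

```shell
# Hypothetical workaround cell: disable the CUDA check so the webui
# starts without a GPU. Expect very slow CPU-only generation.
export COMMANDLINE_ARGS="--skip-torch-cuda-test"
echo "COMMANDLINE_ARGS=$COMMANDLINE_ARGS"
```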