Online Direct Model Loading #3001
EponymousGithubCat started this conversation in General
Replies: 2 comments · 1 reply
-
Prune your model from 4 GB to 2 GB if you'd like. This Colab reads from Google Drive and only needs to install once.
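For anyone wondering what the pruning step actually does, here is a minimal sketch, assuming a standard SD-style .ckpt that keeps its weights under a "state_dict" key; the paths are placeholders. Keeping only the weights and saving them in half precision is roughly where the 4 GB to 2 GB drop comes from.

```python
# Minimal pruning sketch: keep only the model weights, store them as fp16.
# SRC/DST paths are placeholders.
import torch

SRC = "/content/drive/MyDrive/models/model.ckpt"          # full ~4 GB checkpoint
DST = "/content/drive/MyDrive/models/model-pruned.ckpt"   # ~2 GB output

ckpt = torch.load(SRC, map_location="cpu")
state_dict = ckpt.get("state_dict", ckpt)  # some checkpoints nest weights under "state_dict"

# Cast floating-point tensors to half precision; keep other entries as-is.
pruned = {
    k: (v.half() if isinstance(v, torch.Tensor) and v.is_floating_point() else v)
    for k, v in state_dict.items()
}

torch.save({"state_dict": pruned}, DST)
print("saved pruned checkpoint to", DST)
```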
1 reply
-
You can use Google Drive: put the models on drives belonging to different accounts, share them into the same folder, then mount the drive and they are ready to use.
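In Colab, that shared-folder setup looks roughly like the sketch below. The folder name and the webUI install path are assumptions; adjust them to wherever your shortcut and webUI actually live.

```python
# Sketch of the shared-Drive-folder approach; all paths are assumptions.
import os
from google.colab import drive

# Mount Google Drive. The shared folder needs a shortcut added to "My Drive"
# so it appears under MyDrive after mounting.
drive.mount("/content/drive")

SHARED_MODELS = "/content/drive/MyDrive/shared-models"                     # example shortcut name
WEBUI_MODELS = "/content/stable-diffusion-webui/models/Stable-diffusion"   # typical install path

# Link the shared folder into the webUI's model directory so its checkpoints
# show up in the model dropdown without being copied.
os.symlink(SHARED_MODELS, os.path.join(WEBUI_MODELS, "shared"))
```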
0 replies
-
Sorry if this has been asked before, but I'm really curious about something. CKPT files are enormous, usually around 4 GB. I can only download two into my Google Drive before running out of space (and one of them has to be the standard SD 1.4), and it's a pain to download a huge file every time I want to try a new model.
So my question is: could there be a database of CKPT files somewhere that the webUI can load from directly, instead of loading the files locally from your Google Drive or computer? For example, the model selection option in the webUI settings could have an "open library" button that opens this database, where I can search for and select a model. The webUI would then load the model, ideally without having to download it first (or, at the very least for the Colab version, by downloading it into the runtime's Colab files, the way some of the basic SD colabs work).
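To make the Colab part concrete, here is roughly what I'm picturing, sketched with the huggingface_hub client; the repo id, filename, and target directory are just examples, and some model repos require accepting a license and passing a token first.

```python
# Sketch only: pull a checkpoint straight onto the Colab runtime's disk
# instead of Google Drive. Repo id, filename, and paths are example values.
from huggingface_hub import hf_hub_download

ckpt_path = hf_hub_download(
    repo_id="CompVis/stable-diffusion-v-1-4-original",  # example repo
    filename="sd-v1-4.ckpt",
    local_dir="/content/models",   # runtime storage, wiped when the session ends
)
print("checkpoint downloaded to", ckpt_path)
```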
Any thoughts?