No server/localhost address #347
ElleSimpsonEdin
started this conversation in General
Replies: 2 comments · 6 replies
-
You're using a 7 GB model, and free Colab doesn't have enough RAM for that. Try the argument `--lowram`; if you're lucky, it will work.
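A quick way to sanity-check the RAM theory above (a sketch, assuming a standard Linux Colab runtime; `free` is a stock procps tool, not something shipped by this notebook):

```shell
# Free-tier Colab typically provides roughly 12-13 GB of system RAM.
# Loading a 7 GB checkpoint (plus Python, CUDA libraries, and copy
# overhead) can exceed that, at which point the kernel's OOM killer
# terminates the process -- the cell then shows "^C" with no traceback.
free -h | awk 'NR <= 2'   # print the header and the Mem: row
```

If the "available" column is already close to 7 GB before the model loads, an out-of-memory kill is a plausible explanation for the missing server address.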
-
I'm on a T4 on Colab Pro. 16 GB should be fine, no? I can run this model fine locally on an 8 GB RTX 2070.
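One possible source of confusion here (a sketch, assuming a standard Colab shell; the `--query-gpu` fields are standard `nvidia-smi` options): the T4's 16 GB is GPU VRAM, while the checkpoint is first read into system RAM, which is a separate and smaller pool on Colab. The two can be compared directly:

```shell
# GPU VRAM and system RAM are separate pools; a 7 GB checkpoint must
# fit in system RAM while it is being read, regardless of VRAM size.
command -v nvidia-smi >/dev/null \
  && nvidia-smi --query-gpu=memory.total --format=csv,noheader \
  || echo "no GPU visible"                       # GPU VRAM (if any)
free -h | awk '/^Mem:/ {print "system RAM:", $2}' # system RAM
```

So a machine with less VRAM but more free system RAM (like a local 8 GB RTX 2070 box) can load a model that a 16 GB-VRAM Colab runtime fails on.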
-
I've been trying to run this to train a model, but I'm having no luck even launching it.
The last line of the "Start stable-diffusion" cell only outputs "^C" instead of an address to connect to.
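When a cell ends in "^C" with no Python traceback, the process was most likely killed from outside rather than crashing on its own. One way to check for the OOM-killer case (an assumption on my part; `dmesg` may be restricted on some runtimes, hence the fallback message):

```shell
# Look for an OOM-killer record in the kernel log. grep exits non-zero
# when nothing matches (or when dmesg is not permitted), so fall back
# to a note instead of failing.
dmesg 2>/dev/null | grep -i -E "out of memory|killed process" \
  || echo "no OOM record found (or dmesg not permitted)"
```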