Cannot generate images as python3 is eating all the memory #13781
LordMilutin started this conversation in Optimization
I installed everything on Linux according to the manual.
However, I cannot generate images; every attempt fails with:
torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 26.00 MiB (GPU 0; 7.78 GiB total capacity; 6.86 GiB already allocated; 17.69 MiB free; 7.04 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
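The message points at PYTORCH_CUDA_ALLOC_CONF, so this is the kind of thing I have been experimenting with before launching; to be clear, the 128 MiB split size below is just my guess, not a value from any documentation, and I am assuming the standard webui.sh launcher:

```bash
# Sketch only: ask PyTorch's caching allocator to split large blocks,
# which should reduce fragmentation. 128 is an arbitrary starting guess.
export PYTORCH_CUDA_ALLOC_CONF="max_split_size_mb:128"
./webui.sh
```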
I have an NVIDIA GeForce RTX 3050 with 8 GB of memory.
While the web UI is running, python3 uses 7326 MiB of memory, but with it shut down, total system usage is only about 500 MiB.
Is there any way to roughly halve python3's memory consumption?
I passed the --medvram launch option, and I also tried --xformers, but it fails to load, probably because of a version mismatch: I am running CUDA 12.2 while my xformers build targets CUDA 12.1.
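For what it's worth, this is how I have been checking which CUDA version each piece was built against (run inside the web UI's Python environment; outputs omitted):

```bash
python -c "import torch; print(torch.__version__, torch.version.cuda)"  # CUDA version PyTorch was built with
python -m xformers.info    # xformers build info, including its CUDA version
nvidia-smi                 # driver-side CUDA version shown in the header
```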
Any info would be greatly appreciated!