Out of Memory error when RAM is available #10182
Unanswered

somearthling asked this question in Q&A

Replies: 1 comment · 2 replies

somearthling:

I'm running some large batches on a personal computer with 16 GB of RAM and 4 GB of VRAM, but I've run into a problem: when I run my code through the CUDA-enabled jaxlib, it fails with an error saying it couldn't allocate enough memory. My VRAM seems to top out at 3.6 GB, but I'm only using about half my RAM. The code runs fine when I skip CUDA and run it purely on the CPU. Separately, when running smaller batches, the first iteration takes much longer (about 3x) on the CUDA-enabled version than on the CPU version, while later iterations run about 2x faster on the GPU. Is there any way to get the best of both worlds, and also avoid the memory error with larger batches?

Reply:

OOM can refer to GPU out of memory.
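As the reply notes, the error is about GPU (VRAM) out-of-memory, not system RAM, which is why only 4 GB is available even though 16 GB of RAM sits mostly idle. Not something stated in the thread, but a common mitigation: JAX's GPU backend preallocates a large fraction of VRAM at startup, and its documented environment variables can change that behavior. A minimal sketch (the flags must be set before `jax` is first imported anywhere in the process):

```python
import os

# These must be set BEFORE jax is imported anywhere in the process.
# By default JAX's GPU backend preallocates most of the VRAM up front;
# allocating on demand instead can avoid spurious OOMs:
os.environ["XLA_PYTHON_CLIENT_PREALLOCATE"] = "false"

# Alternatively, keep preallocation but cap the fraction of VRAM used
# (uncomment one, not both):
# os.environ["XLA_PYTHON_CLIENT_MEM_FRACTION"] = "0.85"

# import jax  # import only after the flags above are set
```

Disabling preallocation trades some allocation speed for flexibility; capping the fraction keeps the faster preallocated pool but leaves headroom for other processes using the GPU. Note this does not let the GPU borrow system RAM; batches must still fit in VRAM.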
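Two follow-ups not covered in the thread. The slow first iteration on GPU is most likely XLA compilation of the jit-compiled program; later iterations reuse the compiled executable, which is why they run faster. For batches that genuinely exceed VRAM, a common workaround is to split the batch into fixed-size chunks so each device allocation stays small. A hypothetical helper (the name `run_in_chunks` and the stand-in `fn` are illustrative, not from the discussion):

```python
def run_in_chunks(fn, batch, chunk_size):
    """Apply fn to consecutive slices of batch and concatenate the results.

    Each call to fn sees at most chunk_size elements, bounding peak
    device memory; in JAX, fn would typically be a jit-compiled function
    applied to device arrays.
    """
    out = []
    for start in range(0, len(batch), chunk_size):
        out.extend(fn(batch[start:start + chunk_size]))
    return out

# Usage with a stand-in fn on plain Python lists:
doubled = run_in_chunks(lambda xs: [2 * x for x in xs], list(range(10)), 4)
# doubled == [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
```

Keeping `chunk_size` fixed across calls also avoids recompilation, since a jit-compiled function is retraced whenever the input shape changes.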