fix: CUDA out of memory error on Windows #23
michaelgold wants to merge 1 commit into williamyang1991:main
Conversation
I am using it, but I guess it could be done with less VRAM. I have an RTX 3090, so 24 GB of VRAM.
I have an RTX 3080, and to clarify, I was only getting the out-of-memory errors without xformers installed. xformers with PyTorch 2 did not work correctly for me, but I was able to get things working with the earlier version:
# install compatible version of xformers
pip install xformers==0.0.16
# install compatible version of pytorch
pip install torch==1.13.1+cu117 torchvision==0.14.1+cu117 torchaudio==0.13.1 --extra-index-url https://download.pytorch.org/whl/cu117
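After running the installs above, a quick sanity check can confirm that the pinned PyTorch build actually sees the GPU and that xformers imports cleanly (a minimal sketch, assuming the installs succeeded in the active environment):

```shell
# print the installed torch version and whether CUDA is usable
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
# confirm xformers imports and report its version
python -c "import xformers; print(xformers.__version__)"
```

If the first command prints `False` for CUDA, the CPU-only wheel was likely installed and the `--extra-index-url` step should be repeated.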
I had to do the following: pip install xformers==0.0.21. Then there was a strange conflict with libiomp5md.dll, which I resolved by removing the duplicate DLL: remove X:\miniconda3\envs\rerenderX\Library\bin\libiomp5md.dll
Can I ask whether it works as long as I download and install xformers?
I look forward to your tutorials on this!
I have a 1-click auto installer working amazingly well; hopefully I will make a tutorial too: https://www.patreon.com/posts/1-click-auto-for-89457537 (video: 6r8SWXGPgDz4jCKw.mp4)
So just to clarify, this can work on cards with less than 24 GB VRAM? Like a 16 GB 4060 Ti?
Yes. Runs for me on a 3080.
Added Windows installation instructions for xformers==0.0.16 and torch==1.13.1+cu117.
This resolves the CUDA out-of-memory errors on Windows.