[Bug]: Torch unable to use RDNA3 Card #6099
Replies: 14 comments
-
Yes, it's too new for the ROCm version that torch is compiled with.
-
Well, there go my dreams of 20 GB of VRAM and crazy compute power (for now). If you or anyone else knows, how long did it take for torch to update to a ROCm version that supported the previous-gen cards? I'd like to at least have some kind of guess for a time frame until it becomes usable again.
-
Having exactly the same problem right now.
-
What compute power? Even the previous-gen 6900 XT got smoked by a 3050 out of the box with xformers in a Stable Diffusion benchmark, because of the acceleration CUDA provides. And that benchmark was run in November of this year, which also tells you something about the support RDNA2 has.
-
From what I can tell, ROCm has at least partial support for RDNA3, but I have no idea how complete it is.
-
Will definitely give this a look for now!
-
What guide are you following to use the webui with your AMD card? I currently have an RX 5700 XT and I'm struggling to make it work. What guide do you recommend?
-
@ClashSAN consider transferring this to a discussion, since it is a torch-related issue.
-
There was also one on Reddit; I've lost the link to it.
-
The computing power it has. See the example of Topaz Video Enhance AI, which makes balanced use of GPUs from NVIDIA, AMD, and Intel. Now that AMD, NVIDIA, and Intel all have AI accelerator units (not to mention accelerators of other kinds and from other vendors), library/framework developers can no longer be so lazy, or so bought.
-
That is the only thing that matters here. There is a reason why people choose NVIDIA GPUs over AMD GPUs for DL despite the higher prices.
-
Well, I mainly use TF. And yes, with my AMD...
-
I am referring to this repo, not what you use in other projects.
-
I do, pure CPU compute, but I manage.
-
Is there an existing issue for this?
What happened?
I have seen similar issues, but none specifically relating to users with new RDNA3 cards.
Following the guide to install on AMD-based systems on Linux, I run into the following error when launching:
One workaround mentioned by other users is adding HSA_OVERRIDE_GFX_VERSION=10.3.0 when calling launch.py to trick the system into using the GPU anyway. It worked for previous cards, but for me it abruptly segfaults.
(Maybe I missed something in the wiki, but I can't find any kind of log for said dump.)
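As a point of reference, here is a minimal sketch of the same override applied outside the webui (assuming a ROCm build of torch is installed; 10.3.0 corresponds to the gfx1030/RDNA2 target the workaround impersonates, and the variable has to be set before torch initializes HIP):

```python
import os

# Pretend to be a supported gfx1030 (RDNA2) part; this must be set
# before torch loads the ROCm/HIP runtime.
os.environ["HSA_OVERRIDE_GFX_VERSION"] = "10.3.0"

import torch

print(torch.__version__)          # torch build string
print(torch.version.hip)          # ROCm/HIP version torch was compiled against
print(torch.cuda.is_available())  # False is the "No HIP GPUs are available" case
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))
```

If this crashes or prints False, the problem is in torch/ROCm itself rather than in the webui.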
Adding the --skip-torch-cuda-test flag causes Stable Diffusion to use only the CPU, which is agonizingly slow.
The main line I noticed is "Warning: caught exception 'No HIP GPUs are available', memory monitor disabled." I think this means the error comes from torch being unable to detect the card; rocminfo shows it as properly detected by the system.
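As a sanity check on that reading, the gap between the driver seeing the card and torch seeing it can be made explicit with something like the following (illustrative only; assumes rocminfo is on PATH and a ROCm build of torch):

```python
import subprocess
import torch

# rocminfo reports the card at the driver/runtime level...
out = subprocess.run(["rocminfo"], capture_output=True, text=True).stdout
print([line.strip() for line in out.splitlines() if "gfx" in line.lower()])

# ...but torch can only use it if the ROCm libraries it ships with
# know this gfx target (a 7900 XT reports a gfx11xx target, which is
# newer than what current torch wheels were built against).
print("torch HIP version:", torch.version.hip)
print("HIP devices visible to torch:", torch.cuda.device_count())
```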
Is this caused by RDNA3 cards simply being too new and not yet supported by torch?
Steps to reproduce the problem
Follow the installation steps for AMD GPUs, but with a new RDNA3 card (specifically a 7900 XT).
What should have happened?
GPU being recognized at all / program not segfaulting
Commit where the problem happens
c6f347b
What platforms do you use to access UI ?
Linux
What browsers do you use to access the UI ?
Mozilla Firefox
Command Line Arguments
Additional information, context and logs
System previously had a 2060 installed, but I removed it and rebuilt the stable-diffusion-webui folder for the new card. Worked flawlessly with said 2060.