AMD install does not detect ROCm and just errors out asking for CUDA #1058

@BrechtCorbeel

Description

What happened?

I moved my 4090 to my Linux server and put my 7900 XTX in my main machine, since the 4090 just works and runs better. Even with the AMD install of Comfy, it did not run on Linux, and I cannot get it to run on Windows either. I just get stuck being asked for CUDA:

Traceback (most recent call last):
  File "W:\smatrix\Data\Packages\ComfyUIAMD\main.py", line 132, in <module>
    import execution
  File "W:\smatrix\Data\Packages\ComfyUIAMD\execution.py", line 13, in <module>
    import nodes
  File "W:\smatrix\Data\Packages\ComfyUIAMD\nodes.py", line 22, in <module>
    import comfy.diffusers_load
  File "W:\smatrix\Data\Packages\ComfyUIAMD\comfy\diffusers_load.py", line 3, in <module>
    import comfy.sd
  File "W:\smatrix\Data\Packages\ComfyUIAMD\comfy\sd.py", line 6, in <module>
    from comfy import model_management
  File "W:\smatrix\Data\Packages\ComfyUIAMD\comfy\model_management.py", line 145, in <module>
    total_vram = get_total_memory(get_torch_device()) / (1024 * 1024)
  File "W:\smatrix\Data\Packages\ComfyUIAMD\comfy\model_management.py", line 114, in get_torch_device
    return torch.device(torch.cuda.current_device())
  File "W:\smatrix\Data\Packages\ComfyUIAMD\venv\lib\site-packages\torch\cuda\__init__.py", line 878, in current_device
    _lazy_init()
  File "W:\smatrix\Data\Packages\ComfyUIAMD\venv\lib\site-packages\torch\cuda\__init__.py", line 305, in _lazy_init
    raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled
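This error means the torch wheel installed in that venv was built without GPU support, so `torch.cuda` cannot initialize at all. A quick way to see which backend (if any) the installed build supports is a short diagnostic like the sketch below; this is not from the original report, just a suggested check. On ROCm wheels, PyTorch reports the HIP version via `torch.version.hip` and `torch.version.cuda` is `None`; on CPU-only wheels both are `None`.

```python
# Diagnostic sketch (not part of the original report): inspect which
# GPU backend the installed PyTorch build was compiled against.
import torch

print("torch version:", torch.__version__)            # e.g. ends in "+rocm5.6" for ROCm wheels
print("CUDA build:   ", torch.version.cuda)           # None on CPU-only and ROCm builds
print("HIP build:    ", getattr(torch.version, "hip", None))  # set on ROCm builds
print("GPU usable:   ", torch.cuda.is_available())    # False for a CPU-only wheel
```

If `torch.version.cuda` and `torch.version.hip` are both `None`, the venv has a CPU-only wheel and reinstalling torch from the appropriate ROCm index would presumably be the fix (on Linux; ROCm wheel availability on Windows is a separate question).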

Steps to reproduce

No response

Relevant logs

No response

Version

v2.14.4

What Operating System are you using?

Windows


Labels

    bug (Something isn't working)
