Support for AMD GPUs #184
-
I think I may have found the issue. I must have fat-fingered it when I put it in:

```
if device == "rocm" {
    device = "cuda"
}
```

I just pushed a new commit to the branch. Let me know if it works.
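For context, here is a minimal sketch of what device selection might look like once "rocm" is no longer silently rewritten to "cuda". This assumes a Go codebase and a hypothetical `resolveDevice` helper; the actual fix is whatever landed in the pushed commit.

```go
// Hypothetical sketch: keep "rocm" as its own backend instead of
// remapping it to "cuda" before the backend is initialized.
package main

import "fmt"

func resolveDevice(requested string) (string, error) {
	switch requested {
	case "cuda", "rocm", "cpu":
		// Pass the requested backend through unchanged.
		return requested, nil
	default:
		return "", fmt.Errorf("unsupported device %q", requested)
	}
}

func main() {
	dev, err := resolveDevice("rocm")
	if err != nil {
		panic(err)
	}
	fmt.Println("using device:", dev) // prints: using device: rocm
}
```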
-
I updated my "working" branch with your fix and rebuilt, and again tested the same command with both 'cuda' and 'rocm'. 'cuda' was the same (
-
Any progress on figuring out the issue, or are you just currently busy IRL / with other stuff?
-
This thread tracks the development of ROCm support for running transcription on AMD GPUs.