AMD RX 9060XT ROCm error: invalid device function #6008
Unanswered. cybershaman asked this question in Q&A. Replies: 1 comment, 1 reply.
-
I think it's a matter of specifying more AMD GPU targets; however, that might be too much for the CI to digest. This is the relevant point in the backend: LocalAI/backend/cpp/llama-cpp/Makefile, line 35 in 90b5ed9. If you can test it by specifying the GPU targets there, that would be good, but I think that we need to add
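Purely as a sketch of what "specifying more GPU targets" could look like, here is the build command from the question below, widened to several targets. The extra gfx IDs and the semicolon separator are assumptions about how the Makefile forwards the value, not a confirmed LocalAI configuration:

```shell
# Hypothetical rebuild covering several RDNA ISAs instead of only gfx1200.
# The list separator and the exact set of targets are assumptions; adjust
# them to whatever backend/cpp/llama-cpp/Makefile actually expects.
REBUILD=true BUILD_TYPE=hipblas \
  GPU_TARGETS="gfx1100;gfx1101;gfx1200;gfx1201" \
  GO_TAGS=stablediffusion,tts \
  BUILD_GRPC_FOR_BACKEND_LLAMA=true BUILD_GRPC=true \
  make build
```

The trade-off alluded to above is build time: every extra target multiplies the kernel compilation work, which is why shipping a full list may be too heavy for CI.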
-
Hello all!
Been hitting my head against this issue for some time now, so I thought I might reach out to the community here for advice :-)
Host - Proxmox 8.4 Bare Metal Install
Container (LXC) - Ubuntu 22.04.5 LTS
LocalAI compiled from git source:
REBUILD=true BUILD_TYPE=hipblas GPU_TARGETS=gfx1200 GO_TAGS=stablediffusion,tts BUILD_GRPC_FOR_BACKEND_LLAMA=true BUILD_GRPC=true make build
HSA_OVERRIDE_GFX_VERSION=12.0.0 is set, just in case.
LocalAI appears to be recognising and utilising the GPU, as there is VRAM movement and a tiny bit of GPU usage visible while querying the API.
However, it eventually throws the error from the title: invalid device function.
Anyone have any ideas and/or pointers?
Thank you very much in advance!
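For context on why this particular error shows up, here is a tiny shell sketch of the target-matching logic (check_target is a hypothetical helper for illustration, not part of LocalAI or ROCm): the HIP runtime can only dispatch kernels compiled for the ISA the card reports, so a mismatch between the build's GPU_TARGETS and the gfx ID the runtime sees surfaces as "invalid device function".

```shell
# check_target TARGETS REPORTED_ISA
# Prints "covered" if the reported ISA appears in the semicolon-separated
# target list the backend was built with, else "missing".
check_target() {
  case ";$1;" in
    *";$2;"*) echo "covered" ;;
    *)        echo "missing" ;;
  esac
}

# A build done with only gfx1200 covers a card reporting gfx1200...
check_target "gfx1200" "gfx1200"
# ...but not one reporting gfx1100, which would fail at kernel dispatch.
check_target "gfx1200" "gfx1100"
```

In practice the reported ISA can be read from `rocminfo` output (it ships with ROCm); comparing that against the GPU_TARGETS used at build time is a quick way to tell whether the binary or the override is at fault.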