Hi,

In the latest version of node-llama-cpp, the GPU is no longer detected; it automatically falls back to the CPU. Here are the device detection logs for comparison:

Old version (works):

```
OS: Windows 10.0.26100 (x64)
Vulkan: available
Vulkan device: Intel(R) Arc(TM) Pro Graphics
CPU model: Intel(R) Core(TM) Ultra 9 185H
```

New version (fails):

```
OS: Windows 10.0.26100 (x64)
node-llama-cpp: 3.14.0
Vulkan: available
Vulkan used VRAM: 0% (768MB/0B)
CPU model: Intel(R) Core(TM) Ultra 9 185H
```

Thanks for your help!
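For anyone hitting the same symptom, one way to narrow down whether the Vulkan backend can still be initialized at all is to request it explicitly instead of relying on auto-detection. This is a minimal sketch assuming the node-llama-cpp v3 `getLlama()` API with its `gpu` option and the `llama.gpu` property; exact option values and error behavior may differ between versions.

```ts
// Minimal sketch: explicitly request the Vulkan backend instead of "auto"
// to see whether it can still be initialized (assumes the node-llama-cpp v3 API).
import {getLlama} from "node-llama-cpp";

const llama = await getLlama({
    gpu: "vulkan" // force Vulkan; with "auto" (the default) the fallback to CPU can be reproduced
});

console.log("GPU in use:", llama.gpu); // expected "vulkan"; false means CPU-only
```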
@gy9527 Thanks for reporting this! Can you please run these two commands and attach their output?

```shell
npx --yes [email protected] inspect gpu
npx --yes [email protected] inspect gpu
```

The first one checks for a recent change I made, and the second one is from a version before a fix to the Vulkan memory reading that I made.

If you can also install the Vulkan SDK and then run `vulkaninfo` and attach its output, that would help as well.

Based on the outputs of these I can assess the issue better and provide a fix that you could test.
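As a complement to the `inspect gpu` output, the VRAM numbers the Vulkan backend reports can also be read programmatically. This is a minimal sketch assuming the v3 API exposes `llama.getVramState()` returning total/used/free byte counts; the exact shape of the returned object may differ between versions.

```ts
import {getLlama} from "node-llama-cpp";

// Read the VRAM numbers the Vulkan backend reports, to compare with
// the "Vulkan used VRAM: 0% (768MB/0B)" line from `inspect gpu`.
const llama = await getLlama({gpu: "vulkan"});
const vramState = await llama.getVramState();

console.log("total:", vramState.total);
console.log("used:", vramState.used);
console.log("free:", vramState.free);
```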
```shell
npx --yes [email protected] inspect gpu
npx --yes [email protected] inspect gpu
vulkaninfo
```
@gy9527 Thanks for helping me investigate this issue!
Based on these logs, I think that my #516 PR will fix the issue you're experiencing.
I'll release it in the next few days and let you know when it's out so you can test it.