Hi,

In the latest version of node-llama-cpp, the GPU is no longer detected; it automatically falls back to CPU. Here are the device detection logs for comparison:

Old version (works):

```
OS: Windows 10.0.26100 (x64)
Vulkan: available
Vulkan device: Intel(R) Arc(TM) Pro Graphics
CPU model: Intel(R) Core(TM) Ultra 9 185H
```

New version (fails):

```
OS: Windows 10.0.26100 (x64)
node-llama-cpp: 3.14.0
Vulkan: available
Vulkan used VRAM: 0% (768MB/0B)
CPU model: Intel(R) Core(TM) Ultra 9 185H
```

Thanks for your help!
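The telling detail in the failing log is the VRAM total of `0B`: the library read a zero-byte device memory size, which makes the GPU look unusable. As a hypothetical illustration (these helper names are not part of node-llama-cpp), a small parser can pull the reading out of that log line and flag the broken total:

```typescript
// Parse a "Vulkan used VRAM: 0% (768MB/0B)" log line into its parts.
type VramReading = {usedPercent: number; used: string; total: string};

function parseVulkanVramLine(line: string): VramReading | null {
    const m = line.match(/used VRAM:\s*([\d.]+)%\s*\(([^/]+)\/([^)]+)\)/);
    if (m == null) return null;
    return {usedPercent: Number(m[1]), used: m[2], total: m[3]};
}

// A reported total of "0B" means the device memory size could not be
// read, which is consistent with the fallback to CPU seen above.
function totalLooksBroken(reading: VramReading): boolean {
    return reading.total === "0B";
}
```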
Replies: 1 comment · 3 replies
@gy9527 Thanks for reporting this! Can you please run these two commands and attach their output?

```shell
npx --yes node-llama-cpp@<version-with-change> inspect gpu
npx --yes node-llama-cpp@<version-before-fix> inspect gpu
```

The first one checks for a recent change, and the second one is from before a fix to the Vulkan memory reading that I made. If you can also install the Vulkan SDK and then run `vulkaninfo` and attach its output, that would help too.

Based on the outputs of these I can assess the issue better and provide a fix that you could test.
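For reference, `vulkaninfo` prints a `deviceName = <name>` line for each physical device it enumerates, so its (often very long) output can be scanned for GPU names with a small helper. This is a sketch with hypothetical names, assuming the default plain-text output format:

```typescript
// Extract the GPU names from plain-text `vulkaninfo` output.
// Each physical device is reported with a "deviceName = <name>" line.
function listVulkanDeviceNames(vulkaninfoOutput: string): string[] {
    const names: string[] = [];
    for (const line of vulkaninfoOutput.split("\n")) {
        const m = line.match(/^\s*deviceName\s*=\s*(.+?)\s*$/);
        if (m != null) names.push(m[1]);
    }
    return names;
}
```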
```shell
npx --yes node-llama-cpp@<version-with-change> inspect gpu
npx --yes node-llama-cpp@<version-before-fix> inspect gpu
vulkaninfo
```
@gy9527 Thanks for helping me investigate this issue!
@giladgd I've just released a new version of `node-llama-cpp` with the fix (`3.14.2`). Give it a try and let me know whether it works for you.
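Once the patched release is confirmed to work, it can help to pin the dependency to at least that version so older, broken builds are not picked up again. A minimal `package.json` fragment (assuming npm and the usual caret semver range):

```json
{
  "dependencies": {
    "node-llama-cpp": "^3.14.2"
  }
}
```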