[Script request] llama.cpp (standalone or as part of openwebui installer) #2403
Unanswered
xijio asked this question in Request script
Replies: 2 comments · 1 reply
-
Is this really active? We only crawl releases, and the last release was 6 months ago.
1 reply
-
Having a llama.cpp Vulkan build would be a huge benefit with GPU passthrough and the Vulkan drivers. People have recently been getting good performance running local LLMs on old, cheap used cards with it. I think it would be a great addition; Ollama has no Vulkan support.
0 replies
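As an aside on how an installer script might verify such a setup (not something stated in this thread): llama-server exposes a /health endpoint, which typically returns 503 while the model is still loading and 200 once it is ready, so a script could poll it after starting the service inside the LXC. A rough Python sketch under that assumption; the hostname and port below are placeholders, not values from this request:

```python
# Rough sketch: poll llama-server's /health endpoint until the model
# has finished loading, e.g. after an installer starts the service in
# an LXC with GPU passthrough. Host and port are placeholder values.
import time
import urllib.error
import urllib.request

HEALTH_URL = "http://llamacpp.lxc.local:8080/health"  # hypothetical LXC host

def wait_until_ready(timeout_s: float = 120.0) -> bool:
    """Return True once the server reports healthy, False on timeout."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(HEALTH_URL, timeout=5) as resp:
                if resp.status == 200:  # assumed 503 while still loading
                    return True
        except (urllib.error.URLError, OSError):
            pass  # server not up yet (or still loading); retry
        time.sleep(2)
    return False

if __name__ == "__main__":
    print("ready" if wait_until_ready() else "timed out")
```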
-
Application Name
llama.cpp
Website
https://github.com/ggerganov/llama.cpp
Description
llama.cpp is the C/C++ LLM inference engine that Ollama itself is built on, and it is faster and more efficient on many hardware setups. It would be awesome to be able to deploy it as part of the openwebui installer, like we can for ollama now, or to install it in a separate LXC using a new script.
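For context on why this slots in next to the existing ollama option: llama.cpp's bundled llama-server speaks an OpenAI-compatible HTTP API, which Open WebUI can consume as an OpenAI-type connection. A minimal sketch of such a request, assuming a llama-server instance on port 8080 (its default) with a model already loaded; the base URL and model name below are placeholders:

```python
# Minimal sketch: call llama.cpp's OpenAI-compatible chat endpoint,
# the same interface Open WebUI would use. Assumes llama-server is
# running, e.g.:  llama-server -m model.gguf --host 0.0.0.0 --port 8080
# The base URL and model name are placeholders for this example.
import json
import urllib.request

BASE_URL = "http://llamacpp.lxc.local:8080"  # hypothetical LXC hostname

payload = {
    # llama-server serves whatever model it was started with, so the
    # model name here is just echoed back; any string works.
    "model": "local-model",
    "messages": [
        {"role": "user", "content": "Say hello in one short sentence."}
    ],
}

req = urllib.request.Request(
    f"{BASE_URL}/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

print(body["choices"][0]["message"]["content"])
```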
Due Diligence