llama.cpp support #1668
icoicqico started this conversation in Feature requests
Replies: 1 comment
-
Hello guys, I also have issues with LM Studio, but I found out that Kilo was not requesting data using streaming. Without streaming, the request takes so long that it never finishes in a reasonable time on my hardware (4070 Ti, 64 GB RAM). Regards
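For context on the streaming point above: against an OpenAI-compatible server (which llama.cpp's `llama-server` and LM Studio both expose), streaming means sending `"stream": true` and reading Server-Sent Events line by line instead of waiting for one large response. A minimal sketch in Python of the request body and the per-line parsing; the model name is a placeholder and the exact field layout assumes the standard OpenAI chat-completions chunk format:

```python
import json


def parse_sse_chunk(line: str):
    """Parse one Server-Sent Events line from a streaming
    chat-completions response; return the delta text, or None
    for non-data lines and the [DONE] sentinel."""
    if not line.startswith("data: "):
        return None
    payload = line[len("data: "):].strip()
    if payload == "[DONE]":  # end-of-stream marker
        return None
    event = json.loads(payload)
    return event["choices"][0]["delta"].get("content")


# Request body with streaming enabled. Without "stream": True the
# server buffers the entire completion, which is why non-streaming
# requests can appear to hang on slow local hardware.
body = {
    "model": "local-model",  # placeholder; use the model loaded by the server
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": True,
}
```

With this, a client can print tokens as they arrive rather than blocking until the whole generation completes.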
-
I think llama.cpp is one of the most common ways to host an LLM locally, so it would be good to have llama.cpp support. I have tried using the Ollama and LM Studio settings, but they don't work. Thanks