How to pass a system prompt in the llama.cpp server #14671
Unanswered
engrtipusultan asked this question in Q&A
Replies: 1 comment
- Any modern GGUF that you download should have the chat template stored inside it; llama.cpp can read this and use it without you having to define it manually. Check this thread on r/LocalLLaMA for more detail.
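To add to the reply above: because the llama.cpp server exposes an OpenAI-compatible `/v1/chat/completions` endpoint, the system prompt is not set when the server starts; it is sent per request as a `system`-role message, and the server applies the GGUF's chat template for you. A minimal sketch, assuming the server is running locally on its default port 8080 (the system prompt text here is just an example):

```python
import json
import urllib.request

# The system prompt is the first message in the "messages" list;
# the server formats it with the model's embedded chat template.
payload = {
    "messages": [
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Hello!"},
    ],
}

req = urllib.request.Request(
    "http://localhost:8080/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# With a server running, send the request with:
# response = json.load(urllib.request.urlopen(req))
```

If the GGUF's embedded template is missing or wrong, the server can be started with an explicit `--chat-template` argument instead, per the server README.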
- I am going through the documentation, but I cannot find how to pass the system prompt when starting the server:
https://github.com/ggml-org/llama.cpp/blob/master/tools/server/README.md
Can someone guide me on how to pass a system prompt to the server?