Once installed, you can use `llama-cli` or `llama-server` as follows:

```bash
llama-cli -hf bartowski/Llama-3.2-3B-Instruct-GGUF:Q8_0
```

Note: `llama-cli` starts in conversation (chat) mode by default, so `-cnv` is no longer required; pass `-no-cnv` to run in plain text-completion mode instead.
3937
Additionally, you can invoke an OpenAI-compatible chat completions endpoint directly using the llama.cpp server:
4139
```bash
llama-server -hf bartowski/Llama-3.2-3B-Instruct-GGUF:Q8_0
```
4643
After starting the server, you can query the endpoint as follows:
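The request itself is cut off in this excerpt; as an illustrative sketch, `llama-server` exposes an OpenAI-compatible `/v1/chat/completions` route on its default port (8080), so a standard request would look something like this (the message contents here are placeholders, not from the original):

```shell
# Query the OpenAI-compatible chat completions endpoint.
# Assumes llama-server is already running locally on its default port, 8080.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "Hello!"}
    ]
  }'
```

The response follows the OpenAI chat completions schema, so existing OpenAI client libraries can be pointed at this endpoint by changing only the base URL.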