@Copilot Copilot AI commented Sep 5, 2025

Thanks for asking me to work on this. I will get started on it and keep this PR's description up to date as I form a plan and make progress.

Original description:

Add a --cli option to llama-server that forks a linenoise.cpp-based CLI process on launch of llama-server (use the existing linenoise.cpp in the repo). This forked process acts as a client that talks to the /chat/completions API in llama-server. In --cli mode, Ctrl-D should terminate both processes. Ctrl-C, pressed while the chatbot is responding, should stop/interrupt the response so another query can be entered; pressed while at the user prompt, it should remind the user that Ctrl-D quits. Here is a reference implementation in python3 of what the forked --cli process should look like:

@ericcurtin/lm-chat/files/lm-chat.py



@ericcurtin ericcurtin closed this Sep 5, 2025
@Copilot Copilot AI requested a review from ericcurtin September 5, 2025 10:35
@CISC CISC deleted the copilot/fix-08a581ef-1482-4dea-8494-f9f717e8982a branch September 19, 2025 11:01