@mostlygeek (Contributor) commented Jan 1, 2025

Add the llama-swap proxy to the README to make it easier for people to discover. llama-swap is a transparent proxy that dynamically swaps llama-server to a different configuration in order to serve the requested model.
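As a rough sketch of what that swapping looks like in practice (the model names and file paths below are illustrative placeholders, not part of this PR): llama-swap reads a YAML config that maps model names to llama-server commands, and the `model` field of an incoming OpenAI-compatible request selects which configuration to run.

```yaml
# Hypothetical llama-swap config. Each entry maps a model name
# (matched against the "model" field of incoming requests) to the
# llama-server command that serves it. Paths are placeholders.
models:
  "qwen-small":
    cmd: llama-server --port ${PORT} -m /models/qwen-small.gguf
  "llama-large":
    cmd: llama-server --port ${PORT} -m /models/llama-large.gguf -ngl 99
```

With a config like this, a request naming `"model": "llama-large"` would cause the proxy to stop the currently running llama-server instance (if any) and start the one configured for that model before forwarding the request.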

@ggerganov ggerganov merged commit a45433b into ggml-org:master Jan 2, 2025
2 checks passed
tinglou pushed a commit to tinglou/llama.cpp that referenced this pull request Feb 13, 2025
* list llama-swap under tools in README

* readme: add llama-swap to Infrastructure
arthw pushed a commit to arthw/llama.cpp that referenced this pull request Feb 26, 2025
* list llama-swap under tools in README

* readme: add llama-swap to Infrastructure
mglambda pushed a commit to mglambda/llama.cpp that referenced this pull request Mar 8, 2025
* list llama-swap under tools in README

* readme: add llama-swap to Infrastructure