
[Feature request] Unix socket support #122

@city96

Description


The following llama.cpp PR adds support for listening on a unix socket instead of a port: ggml-org/llama.cpp#12613

Supporting this in llama-swap would free up the ports currently reserved by the configured llama.cpp instances, and would avoid port conflicts entirely, since each socket can simply be named after its model.

It would also be nice if llama-swap itself could listen on a unix socket, so it can run with limited network access behind a reverse proxy.
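For illustration, here is a minimal sketch of what listening on a unix socket looks like in Go (the language llama-swap is written in), using only the standard library. The socket path, the `/health` route, and the idea of deriving the path from a model name are assumptions for this example, not llama-swap's actual API; a reverse proxy would dial the socket the same way the client below does.

```go
package main

import (
	"context"
	"fmt"
	"io"
	"net"
	"net/http"
	"os"
	"path/filepath"
)

func main() {
	// Hypothetical socket path; a server like llama-swap could derive
	// one per model, avoiding TCP port conflicts entirely.
	sock := filepath.Join(os.TempDir(), "llama-swap-demo.sock")
	os.Remove(sock) // clean up a stale socket from a previous run

	// Listen on the unix socket instead of a TCP port.
	ln, err := net.Listen("unix", sock)
	if err != nil {
		panic(err)
	}
	defer ln.Close()

	mux := http.NewServeMux()
	mux.HandleFunc("/health", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprint(w, "ok")
	})
	srv := &http.Server{Handler: mux}
	go srv.Serve(ln)

	// A client (e.g. a reverse proxy) dials the socket directly;
	// the "unix" host in the URL is ignored because DialContext is overridden.
	client := &http.Client{
		Transport: &http.Transport{
			DialContext: func(ctx context.Context, _, _ string) (net.Conn, error) {
				return net.Dial("unix", sock)
			},
		},
	}
	resp, err := client.Get("http://unix/health")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(resp.Body)
	fmt.Println(string(body)) // prints "ok"

	srv.Shutdown(context.Background())
}
```

Note that unlike a TCP port, the socket file persists after the process exits, so a real implementation would also need to remove stale sockets on startup and apply appropriate file permissions.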
