Ollama uses the /api/chat endpoint; this is generally set automatically by whatever software you are using.

You only need to point the tool at port 5001 (KoboldCpp's default) and it should work.

Alternatively, run KoboldCpp on port 11434 (the Ollama default port) and it should work out of the box with your tool.
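For example, a tool speaking the Ollama API would issue a request like the following. This is a minimal sketch assuming KoboldCpp is serving on its default port 5001; the model name is a placeholder, since KoboldCpp typically uses whatever model it has loaded regardless of this field.

```python
import json
import urllib.request

def build_chat_request(base_url="http://localhost:5001"):
    """Build an Ollama-style /api/chat request (not sent here)."""
    payload = {
        "model": "koboldcpp",  # placeholder; KoboldCpp serves its loaded model
        "messages": [{"role": "user", "content": "Hello!"}],
        "stream": False,
    }
    return urllib.request.Request(
        f"{base_url}/api/chat",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

req = build_chat_request()
```

Calling `urllib.request.urlopen(req)` would send the request; the only thing an Ollama-expecting tool usually needs changed is the base URL's port.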

Replies: 1 comment, 5 replies (from @TFWol and @LostRuins)
Answer selected by TFWol
Category: Q&A
2 participants