Under the 3 enabled APIs, I see entries corresponding to only two of them: Kobold and OpenAI. Which one is for Ollama?
Answered by LostRuins, Jun 2, 2025:
Ollama uses the /api/chat endpoint - generally this is set automatically by whatever software you are using. You only need to point the tool to port 5001 and it should work. Alternatively, run KoboldCpp at port 11434 and it should work out of the box with your tool (that is the Ollama default port).