LibreChat Can't Connect to Ollama #6718
Unanswered
F1zzyD asked this question in Troubleshooting
Replies: 1 comment · 6 replies
-
You didn't share your config so I can't help you troubleshoot very well. Please review: https://www.librechat.ai/docs/quick_start/custom_endpoints

```yaml
endpoints:
  custom:
    - name: "Ollama"
      apiKey: "ollama"
      baseURL: "http://host.docker.internal:11434/v1/"
      models:
        default: [
          # some defaults in case fetching the list fails
          "llama3.3",
          "llama3.2",
          "llama3.2-vision",
          "qwen2.5",
          "qwen2.5-coder",
          "command-r",
          "mixtral",
          "phi3",
          "phi4",
        ]
        fetch: true
      titleConvo: true
      # titleModel: "phi3"
      titleModel: "current_model" # uses the currently selected model
      modelDisplayLabel: "Ollama"
```
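A common gotcha when LibreChat runs in Docker and Ollama runs on the host: on Linux, `host.docker.internal` does not resolve inside containers by default (Docker Desktop on macOS/Windows provides it automatically). If that's your situation, a compose override along these lines may help — a minimal sketch, assuming the stock LibreChat compose file where the application service is named `api`:

```yaml
# docker-compose.override.yml — sketch; assumes the stock LibreChat
# compose file, where the application service is named "api".
services:
  api:
    extra_hosts:
      # Map host.docker.internal to the host's gateway IP so the
      # container can reach Ollama listening on the host (needed on
      # Linux; Docker Desktop sets this name up for you).
      - "host.docker.internal:host-gateway"
```

To confirm the container can actually reach Ollama, try fetching the model list from inside it, e.g. `docker compose exec api curl http://host.docker.internal:11434/v1/models` (assuming curl exists in the image); if that fails, the problem is networking rather than the librechat.yaml config.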
-
What happened?
Even after making the appropriate changes to the config.yaml, LibreChat does not seem to see Ollama, even though Ollama is set up correctly and reports that its models are ready. I have made all of the changes that have been suggested for LibreChat, but nothing changes; it still won't see it.
Version Information
latest
Steps to Reproduce
docker compose up
What browsers are you seeing the problem on?
Firefox
Relevant log output
Screenshots
No response
Code of Conduct
I agree to follow this project's Code of Conduct