Can't chat with ollama models #8508
Unanswered
AzizZayed asked this question in Troubleshooting
Replies: 1 comment 2 replies
Ollama is working on my end. Please change:
baseURL: "http://ollama:11434/api/"
to:
baseURL: "http://ollama:11434/v1/"
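For reference, a minimal sketch of where that line lives in librechat.yaml, assuming a typical custom endpoint entry (the endpoint name and API key shown are placeholders; Ollama ignores the key, but LibreChat expects the field to be set):

```yaml
endpoints:
  custom:
    - name: "Ollama"                       # placeholder label
      apiKey: "ollama"                     # dummy value; Ollama ignores it
      baseURL: "http://ollama:11434/v1/"   # OpenAI-compatible endpoint
```

Ollama serves its native API under /api/ and an OpenAI-compatible API under /v1/; LibreChat's custom endpoints speak the OpenAI format, which is why chat requests only succeed against /v1/.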
What happened?
I get the following error when chatting with a loaded Ollama model:
In the error logs:
Version Information
REPOSITORY                                  TAG     IMAGE ID       CREATED      SIZE
ghcr.io/danny-avila/librechat-rag-api-dev   latest  64a31a1f9483   6 days ago   7.87GB
ghcr.io/danny-avila/librechat               latest  3d0aff522ece   2 weeks ago  1.11GB
Steps to Reproduce
docker-compose.override.yml:
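The override file's contents did not survive the page capture. As a hedged reconstruction, an Ollama service block typically looks like the sketch below; the service name ollama is what makes http://ollama:11434 resolvable from the LibreChat container (the image tag and volume path are assumptions):

```yaml
services:
  ollama:
    image: ollama/ollama:latest   # assumed image/tag
    ports:
      - "11434:11434"             # Ollama's default port
    volumes:
      - ./ollama:/root/.ollama    # assumed host path for model storage
```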
librechat.yaml:
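This file was also not captured. A sketch consistent with the description below, using a typical LibreChat custom-endpoint shape (the model name is a placeholder):

```yaml
endpoints:
  custom:
    - name: "Ollama"
      apiKey: "ollama"
      baseURL: "http://ollama:11434/api/"   # the variant tried below; /v1/ is the fix above
      models:
        default: ["llama3"]                 # placeholder model name
        fetch: true                         # model fetching worked with this config
```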
In the docs, it says to use
baseURL: "http://host.docker.internal:11434/v1/"
but that did not work. I changed it to baseURL: "http://ollama:11434/api/" and model fetching worked, but chatting still does not.
What browsers are you seeing the problem on?
Microsoft Edge
Relevant log output
Screenshots
Code of Conduct