[Question]: Custom AI Endpoints: Ollama API key required but ignored; errors when running the Ollama chat models #3888
Replies: 3 comments 8 replies
-
What does your config look like? Did you see this? https://www.librechat.ai/docs/configuration/librechat_yaml/ai_endpoints/ollama
-
My next plan is to run this on a server. Will I have the same issues? Can Ollama models be added another way, or are they already included like the other models?
-
I found a solution for my setup, where LibreChat runs in Docker but Ollama is installed manually on the host.
In librechat.yaml, set baseURL: "http://host.docker.internal:11434/v1/chat/completions" so the container can reach your host. By default, Ollama only accepts connections from 127.0.0.1/localhost, but requests coming from the Docker network arrive from a different address (172.21.0.1 in my case), which Ollama rejects. Change the Ollama host to 0.0.0.0 to allow connections from all interfaces.
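For context, a minimal librechat.yaml sketch along the lines of this fix, mirroring the baseURL above; the endpoint name, the apiKey placeholder, and the model name are illustrative, not confirmed values:

```yaml
# librechat.yaml -- minimal sketch, assuming Ollama runs on the Docker host
endpoints:
  custom:
    - name: "Ollama"
      apiKey: "ollama"   # required by the config schema, but Ollama ignores it
      baseURL: "http://host.docker.internal:11434/v1/chat/completions"
      models:
        default: ["llama3"]   # placeholder; use whatever model you have pulled
        fetch: true           # query the endpoint for its installed models
```

On the host side, Ollama reads the OLLAMA_HOST environment variable, so starting it with OLLAMA_HOST=0.0.0.0 is one way to apply the "listen on all interfaces" change described above.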
-
What is your question?
LibreChat | 2024-09-01 21:43:28 error: Failed to fetch models from Ollama API. If you are not using Ollama directly, and instead, through some aggregator or reverse proxy that handles fetching via OpenAI spec, ensure the name of the endpoint doesn't start with "ollama" (case-insensitive). Cannot read properties of undefined (reading 'status')
LibreChat | 2024-09-01 21:43:29 error: Failed to fetch models from Mistral API
LibreChat | The request either timed out or was unsuccessful. Error message:
LibreChat | Cannot read properties of undefined (reading 'status')
I am trying to use Ollama locally, but I would also like to run with Ollama in production; any assistance will be greatly appreciated.
Trying to get your new Artifacts feature running with local models.
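A minimal sketch of the configuration that first error message is pointing at, assuming the models are served through an OpenAI-compatible aggregator or proxy rather than Ollama directly; the endpoint name, baseURL, and model are placeholders:

```yaml
# librechat.yaml -- sketch for an OpenAI-spec proxy in front of local models;
# per the error above, the endpoint name must NOT start with "ollama"
# (case-insensitive), or LibreChat will try Ollama's native API to fetch models
endpoints:
  custom:
    - name: "LocalModels"
      apiKey: "not-used"                  # required field, ignored by the backend
      baseURL: "http://my-proxy:8000/v1"  # hypothetical proxy URL
      models:
        default: ["llama3"]               # placeholder
        fetch: true                       # fetched via the OpenAI models route
```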
What is the main subject of your question?
Endpoints