I have LibreChat and Ollama hosted locally in Docker on my Linux instance. RAG has been set up, and uploading files as part of an Ollama prompt has been working as expected. That changed as soon as I added OpenAI as another provider in my LibreChat environment, even though RAG is configured as follows:
RAG_API_URL=http://host.docker.internal:8000
EMBEDDINGS_PROVIDER=ollama
OLLAMA_BASE_URL=http://ollama:11434
EMBEDDINGS_MODEL=nomic-embed-text
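These are set in my .env. For reference, a minimal sketch of how the same values could be pinned directly on the RAG container via docker-compose.override.yml, assuming the stock LibreChat compose layout where the RAG service is named rag_api (adjust the service name if yours differs):

# docker-compose.override.yml (sketch)
services:
  rag_api:
    environment:
      # keep embeddings on Ollama regardless of which chat providers are enabled
      - EMBEDDINGS_PROVIDER=ollama
      - EMBEDDINGS_MODEL=nomic-embed-text
      - OLLAMA_BASE_URL=http://ollama:11434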
As soon as OpenAI was enabled, file uploads to Ollama stopped working. Here's the message in the logs:
2025-01-10 14:15:24 error: [OpenAIClient.chatCompletion] Unhandled error type "nomic-embed-text:latest" does not support chat
2025-01-10 14:15:24 error: [handleAbortError] AI response error; aborting request: "nomic-embed-text:latest" does not support chat
It appears that as soon as OpenAI was added, EMBEDDINGS_PROVIDER and the other fields are being overridden, though clearly not EMBEDDINGS_MODEL.
Does anyone have any thoughts on where this override is configured? I'd very much like to keep LibreChat as the primary AI UI, use the local Ollama, and reach OpenAI and xAI as needed, but I want file uploads processed with Ollama.
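For completeness, the Ollama chat endpoint itself is wired up as a custom endpoint in librechat.yaml along these lines (a rough sketch in the documented custom-endpoint format; the model name is a placeholder and the schema version should match your install):

# librechat.yaml (sketch; placeholders noted in comments)
version: 1.1.4        # adjust to the config schema version your install expects
cache: true
endpoints:
  custom:
    - name: "Ollama"
      apiKey: "ollama"                      # Ollama accepts any non-empty key
      baseURL: "http://ollama:11434/v1/"    # same container as OLLAMA_BASE_URL above
      models:
        default: ["llama3"]                 # placeholder; fetch pulls the real list
        fetch: true
      titleConvo: true
      titleModel: "current_model"
      modelDisplayLabel: "Ollama"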