Ollama not working with librechat #3030
When sending a message to Ollama through LibreChat, I get this back:

but when running

everything runs as expected, and the command-line Ollama is running fine.

My librechat.yaml file:

What I tried:

Additional information:
Replies: 2 comments 2 replies
-
Depending on your graphics card, you can add the resources as shown here:

```yaml
# docker-compose.override.yml
version: '3.4'

services:
  # USE LIBRECHAT CONFIG FILE
  api:
    volumes:
      - type: bind
        source: ./librechat.yaml
        target: /app/librechat.yaml

  # ADD OLLAMA
  ollama:
    image: ollama/ollama:latest
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              capabilities: [compute, utility]
    ports:
      - "11434:11434"
    volumes:
      - ./ollama:/root/.ollama
```
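If the GPU is still not picked up with the reservation above, the Compose specification also supports pinning a device count; a minimal sketch of that variant (assuming the NVIDIA Container Toolkit is installed on the host):

```yaml
# Variant of the ollama service's device reservation (assumption: the
# NVIDIA Container Toolkit is installed; `count` may also be a number).
deploy:
  resources:
    reservations:
      devices:
        - driver: nvidia
          count: all
          capabilities: [gpu]
```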
-
The LibreChat service is trying to access Ollama using a host or IP address relative to the Docker container it's running in. In this case, localhost is the container itself, not your host machine, so you need to reference the host machine instead of localhost. In this case, avoid installing Ollama in a container and make use of the actual host for acceleration.
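For reference, a librechat.yaml custom endpoint for Ollama along these lines (a sketch based on the LibreChat Ollama guide; the model name is a placeholder for whatever you have pulled):

```yaml
# librechat.yaml — hypothetical Ollama endpoint entry; adjust baseURL and models.
endpoints:
  custom:
    - name: "Ollama"
      apiKey: "ollama"        # Ollama ignores the key, but LibreChat requires one
      # From inside the LibreChat container, host.docker.internal reaches the host;
      # if Ollama runs in the same Compose network, use its service name instead.
      baseURL: "http://host.docker.internal:11434/v1/"
      models:
        default: ["llama3"]   # placeholder model name
        fetch: true           # let LibreChat list models from the server
      titleConvo: true
      titleModel: "current_model"
      modelDisplayLabel: "Ollama"
```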
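The container-vs-host distinction above can be sketched in a few lines. `host.docker.internal` is Docker's alias for the host machine from inside a container (on plain Linux you may need to map it with `extra_hosts: ["host.docker.internal:host-gateway"]`); the helper name here is illustrative, not part of LibreChat's code:

```python
# Hypothetical helper showing which base URL a client should use for
# Ollama, depending on where the caller runs.

def ollama_base_url(in_container: bool, port: int = 11434) -> str:
    # Inside a container, "localhost" is the container itself, so the
    # request must target the Docker host instead.
    host = "host.docker.internal" if in_container else "localhost"
    return f"http://{host}:{port}"

print(ollama_base_url(True))   # from the LibreChat container
print(ollama_base_url(False))  # from the host machine
```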
Depending on your graphics card, you can add the resources as shown here:
https://www.librechat.ai/blog/2024-03-02_ollama