When Ollama is installed using Docker, it works fine with LibreChat, but without GPU acceleration.

Depending on your graphics card, you can add the GPU device reservation to your Compose override as shown here:
https://www.librechat.ai/blog/2024-03-02_ollama

# docker-compose.override.yml
version: '3.4'
 
services:
# USE LIBRECHAT CONFIG FILE
  api:
    volumes:
    - type: bind
      source: ./librechat.yaml
      target: /app/librechat.yaml
 
# ADD OLLAMA
  ollama:
    image: ollama/ollama:latest
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              capabilities: [compute, utility]
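              # NOTE: the nvidia driver reservation assumes the NVIDIA Container Toolkit is installed on the Docker host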
    ports:
      - "11434:11434"
    volumes:
      - ./ollama:/root/.ollama
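
The override above only mounts librechat.yaml; LibreChat still needs a custom endpoint entry that points at the Ollama service. Here is a minimal sketch, assuming the api and ollama services share the default Compose network and that the model name (llama3, used here as a placeholder) has already been pulled into Ollama; see the linked blog post for the full schema:

# librechat.yaml — example Ollama endpoint (sketch)
version: 1.0.5
cache: true
endpoints:
  custom:
    - name: "Ollama"
      apiKey: "ollama"                      # Ollama ignores the key; any non-empty value works
      baseURL: "http://ollama:11434/v1/"    # "ollama" resolves on the shared Compose network
      models:
        default: ["llama3"]                 # placeholder; use a model you have pulled with `ollama pull`
        fetch: true                         # also list models reported by the Ollama server
      titleConvo: true
      modelDisplayLabel: "Ollama"

After editing both files, recreate the stack (e.g. docker compose up -d) so the new service and the mount take effect; Compose merges docker-compose.override.yml automatically.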
