Finally, I found the solution myself.

Troubleshooting Log: Connecting Local Ollama to LibreChat (v0.7.7) on Docker

Goal

Configure a Dockerized LibreChat (v0.7.7) instance to use models hosted on a local Ollama server.

Initial State

  • LibreChat (v0.7.7) running via Docker Desktop on Windows.
  • Local Ollama server installed on the host, with the necessary models (e.g., llama3:8b-instruct-q4_K_M, mistral:7b) already pulled.
  • By default, the LibreChat UI shows only external API endpoints such as OpenAI.
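Before changing anything in LibreChat, it's worth confirming that the Ollama server is actually reachable. A minimal sketch (assumptions: Ollama is on its default port 11434, and from inside a container on Docker Desktop for Windows the host is addressed as host.docker.internal):

```python
import json
import urllib.request

# Ollama's default address on the host; from inside the LibreChat container
# on Docker Desktop (Windows), use http://host.docker.internal:11434 instead.
OLLAMA_URL = "http://localhost:11434"

# GET /api/tags lists the models the local Ollama server has pulled.
with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags") as resp:
    tags = json.load(resp)

for model in tags.get("models", []):
    print(model["name"])  # expect llama3:8b-instruct-q4_K_M, mistral:7b, ...
```

If this prints the expected model names, the server side is fine and any remaining problem is on the LibreChat configuration side.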

Troubleshooting Process

1. Issue: Ollama Endpoint Configuration Not Applied

  • Symptom: Modifying the .env file to add Ollama settings or changing existing endpoint variables has no effect on the UI; only the default endpoints (e.g., OpenAI) remain visible. (A working configuration sketch follows below.)
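The root cause is that in LibreChat v0.7.x, custom endpoints such as Ollama are configured in librechat.yaml, not in .env, so .env edits alone never surface a new endpoint in the UI. A minimal sketch of a working librechat.yaml (the endpoint name, model list, and host.docker.internal base URL are assumptions for a Docker Desktop on Windows setup; the config schema version should match the one documented for your release):

```yaml
# librechat.yaml — placed in the project root, next to docker-compose.yml
version: 1.0.5  # config schema version; check the docs for your release
endpoints:
  custom:
    - name: "Ollama"
      apiKey: "ollama"  # Ollama needs no key; any non-empty placeholder works
      baseURL: "http://host.docker.internal:11434/v1/"
      models:
        default: ["llama3:8b-instruct-q4_K_M", "mistral:7b"]
        fetch: true  # also fetch the live model list from the server
```

The api container also has to see this file. One way, assuming the stock docker-compose setup where the service is named api, is a bind mount in docker-compose.override.yml:

```yaml
# docker-compose.override.yml
services:
  api:
    volumes:
      - type: bind
        source: ./librechat.yaml
        target: /app/librechat.yaml
```

After recreating the container (docker compose up -d), the Ollama endpoint should appear in the model selector alongside the default ones.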
