Ollama Endpoint Not Showing in LibreChat UI Despite Correct Setup (Windows + Docker) #6779
Goal:
Environment:
Problem:
Troubleshooting Steps Performed:
Current Status:
Request:
Replies: 3 comments 6 replies
I solved it following these steps:
Chiming in: experiencing the same issue on Garuda Linux (Arch-based), with LibreChat v0.7.9 running in Docker and native Ollama 0.2.9.

What I've Tried
1) .env configuration:
   Also tried:
2) Verified connectivity:
   From host:
3) docker-compose.override.yml attempts (see the sketch below):

Result
Additional Context
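For reference, a minimal sketch of what items 2) and 3) above typically look like; the service name `api`, the default Ollama port 11434, and the host-gateway mapping are assumptions for a stock Linux Docker setup:

```bash
# 2) Connectivity checks (assumes Ollama on its default port 11434)
# From the host: is Ollama up, and which models does it serve?
curl http://localhost:11434/api/tags

# From inside the LibreChat container (service "api" in the default compose file),
# if curl is available in the image:
docker compose exec api curl http://host.docker.internal:11434/api/tags
```

```yaml
# 3) docker-compose.override.yml (sketch)
# On Linux, host.docker.internal is not defined by default,
# so map it to the Docker host gateway for the api service.
services:
  api:
    extra_hosts:
      - "host.docker.internal:host-gateway"
```

One more Linux-specific gotcha worth checking: a native Ollama install usually listens only on 127.0.0.1, so the container may not reach it until OLLAMA_HOST=0.0.0.0 (or the Docker bridge address) is set for the Ollama service.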
Finally, I found the solution myself.
Troubleshooting Log: Connecting Local Ollama to LibreChat (v0.7.7) on Docker
Goal
Configure a Dockerized LibreChat (v0.7.7) instance to use models hosted on a local Ollama server.
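For orientation before the log: the configuration this goal ultimately turns on is a custom endpoint entry in librechat.yaml. A minimal sketch, assuming the container reaches the host's Ollama at host.docker.internal:11434 and using the model names from Initial State below; the endpoint name, schema version, and title options are placeholders to adjust:

```yaml
# librechat.yaml (sketch) -- adjust names, version, and models to your setup
version: 1.2.1   # config schema version; use the one your LibreChat release expects
endpoints:
  custom:
    - name: "Ollama"
      apiKey: "ollama"   # any non-empty placeholder; Ollama itself needs no key
      # Ollama's OpenAI-compatible API, as seen from inside the container
      baseURL: "http://host.docker.internal:11434/v1/"
      models:
        default:
          - "llama3:8b-instruct-q4_K_M"
          - "mistral:7b"
        fetch: true      # also list whatever `ollama list` reports
      titleConvo: true
      titleModel: "current_model"
```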
Initial State
Local Ollama server running with models (llama3:8b-instruct-q4_K_M, mistral:7b) downloaded.

Troubleshooting Process
1. Issue: Ollama Endpoint Configuration Not Applied
Editing the .env file to add Ollama settings or changing existing endpoint variables has no effect on the UI. Only defau…
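If this matches the usual pattern, the reason .env edits change nothing is that custom endpoints such as Ollama live in librechat.yaml, and the stock docker-compose.yml may not mount that file into the container at all. A sketch of an override that mounts it, assuming the compose service is named api and librechat.yaml sits next to docker-compose.yml:

```yaml
# docker-compose.override.yml (sketch) -- paths and service name are assumptions
services:
  api:
    volumes:
      - type: bind
        source: ./librechat.yaml
        target: /app/librechat.yaml
```

After adding the mount, recreate the container (e.g. docker compose up -d --force-recreate) so the file is actually picked up.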