fix: add OLLAMA_BASE_URL support to backend config #605
base: main
Conversation
Someone is attempting to deploy a commit to Rohan Verma's projects Team on Vercel. A member of the Team first needs to authorize it.
Review by RecurseML
🔍 Review performed on 48ea41a..3772901
✨ No bugs found, your code is sparkling clean
✅ Files analyzed, no issues (2)
• surfsense_backend/.env.example
• surfsense_backend/app/config/__init__.py
@AnishSarkar22 Can you test if it fixes the original issue? IMO we should be storing this in the DB and passing this var at runtime. Can you try that? @shamil2 Can you raise this PR against the 'dev' branch?
Sure, I will test it and confirm.
@MODSetter Tested this locally, but the solution by @shamil2 does not fix the original issue. Chonkie doesn't have native Ollama support, so it still crashes. Yes, we need a DB-based solution similar to how the LLM configs work. Should I implement an Embedding Configuration UI/table that stores the OLLAMA_BASE_URL?
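A rough sketch of the DB-based approach being discussed, assuming the backend stores settings with SQLAlchemy the way LLM configs are stored; the table and column names below are hypothetical, not from the actual codebase:

```python
from sqlalchemy import Column, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class EmbeddingConfig(Base):
    """Hypothetical per-user embedding configuration, mirroring the LLM config pattern."""
    __tablename__ = "embedding_configs"

    id = Column(Integer, primary_key=True)
    user_id = Column(Integer, nullable=False)
    # Embedding model identifier, e.g. an Ollama model name.
    model_name = Column(String, nullable=False)
    # Base URL of the Ollama (or other) embedding endpoint, read at runtime
    # instead of from a static environment variable.
    base_url = Column(String, nullable=True)
```

With something like this, the base URL could be edited from a settings UI and passed to the embedding client per request rather than being fixed at container start.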
Fixes #587. Adds support for the OLLAMA_BASE_URL environment variable to allow connecting to external Ollama instances for embeddings, which fixes connection refused errors during document and YouTube video uploads in Docker.
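A minimal sketch of what the config change might look like, assuming `surfsense_backend/app/config/__init__.py` reads its settings from environment variables; the attribute name and default shown here are illustrative, not the exact code in this PR:

```python
import os

# Base URL of the Ollama instance used for embeddings.
# Defaults to the standard local endpoint; override it (e.g. with
# http://host.docker.internal:11434) when Ollama runs outside the container.
OLLAMA_BASE_URL = os.getenv("OLLAMA_BASE_URL", "http://localhost:11434")
```

The companion change to `.env.example` would then document the new variable alongside the existing embedding settings.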
High-level PR Summary
This PR adds support for the `OLLAMA_BASE_URL` environment variable to enable connections to external Ollama instances for embeddings. This resolves connection refused errors that occur during document and YouTube video uploads when running in Docker environments, by allowing users to specify a custom Ollama endpoint (e.g., `http://host.docker.internal:11434`).
⏱️ Estimated Review Time: 5-15 minutes
💡 Review Order Suggestion
1. surfsense_backend/.env.example
2. surfsense_backend/app/config/__init__.py
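As a quick way to confirm the configured endpoint is actually reachable from inside the container, one could hit Ollama's model-listing endpoint (`/api/tags`); this is only a hypothetical smoke test, not part of the PR:

```python
import os
import urllib.request

# Hypothetical smoke test: list the models available on the configured Ollama instance.
base_url = os.getenv("OLLAMA_BASE_URL", "http://localhost:11434")
with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
    print(resp.status, resp.read()[:200])
```

A connection refused error here would indicate the same networking problem described in #587, independent of the embedding code path.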