Feature Request: Enable Ollama Provider in All Deployment Modes #1494
fabioteixei started this conversation in Feature request
Replies: 2 comments 1 reply
Hey @fabioteixei. Thank you for this feature request. I moved it from Issues to Discussions/feature-requests as it's more appropriate here. I also added a comment on this one asking for the same provider: #973 (comment)
1 reply
How did you get "The workaround of setting MANIFEST_MODE=local + OLLAMA_HOST=<url> technically works"?
Feature Request: Enable Ollama Provider in All Deployment Modes
Summary
The Ollama provider in the Connect providers dialog is currently restricted to MANIFEST_MODE=local, showing "Only available on Manifest Local" for all other deployment modes. This unnecessarily prevents Docker/Kubernetes users, as well as Ollama Cloud users, from connecting to Ollama without switching the entire application to local mode.
Motivation
Ollama is no longer exclusively a local/desktop tool. There are two distinct use cases that are
currently blocked:
- Remote Ollama instances: users running Manifest in Docker or in a Kubernetes cluster want to point it at a remote Ollama instance (e.g. http://ollama.internal:11434) without enabling local mode.
- Ollama Cloud: Ollama now offers a hosted service with API keys, similar to OpenAI or OpenRouter. Users should be able to add their Ollama Cloud API key through the same "Connect providers" UI that already supports those providers.
In both cases, MANIFEST_MODE=local has broader implications (database dialect, URL validation bypass, etc.) that are undesirable in production environments.
Current Behaviour
In any deployment mode other than local, the Ollama card shows "Only available on Manifest Local" and no toggle, while every other provider (OpenAI, OpenRouter, Google, xAI, etc.) has an enable/disable toggle.
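As a hypothetical illustration (not Manifest's actual source; every name below is invented), a mode gate of this kind is typically a per-provider flag checked against the deployment mode:

```typescript
// Hypothetical sketch of a provider availability gate. The names
// ProviderDefinition, requiresLocalMode, and isProviderAvailable are
// invented for illustration and do not come from the Manifest codebase.
interface ProviderDefinition {
  id: string;
  requiresLocalMode: boolean;
}

const OLLAMA_PROVIDER: ProviderDefinition = {
  id: "ollama",
  requiresLocalMode: true, // the flag this feature request asks to drop
};

function isProviderAvailable(
  provider: ProviderDefinition,
  manifestMode: string,
): boolean {
  // Local-only providers are hidden unless MANIFEST_MODE=local.
  return !provider.requiresLocalMode || manifestMode === "local";
}
```

Under this sketch, relaxing the restriction amounts to either flipping the flag for Ollama or replacing it with a finer-grained check.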
The restriction is implemented in packages/backend/src/common/constants/ollama.ts.
Expected Behaviour
- Ollama appears in the Connect providers dialog in all deployment modes, with a toggle and an API key field, like OpenAI and OpenRouter. Users with an Ollama Cloud subscription can then enable it without any special deployment mode.
- OLLAMA_HOST becomes a documented, first-class environment variable usable in any deployment mode to point Manifest at a remote Ollama instance.
- The MANIFEST_MODE=local restriction is removed (or replaced with logic that allows the above two paths).
- There is a straightforward way to verify the configuration.
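To make the OLLAMA_HOST point concrete, here is a minimal sketch of resolving the host from the environment with a local default, assuming a Node.js backend (the helper name resolveOllamaHost is invented, not part of Manifest):

```typescript
// Minimal sketch (invented helper, not Manifest's actual code):
// read OLLAMA_HOST from the environment, falling back to the default
// local endpoint, independent of MANIFEST_MODE.
const DEFAULT_OLLAMA_HOST = "http://localhost:11434";

function resolveOllamaHost(
  env: Record<string, string | undefined> = process.env,
): string {
  const host = env.OLLAMA_HOST?.trim();
  // Empty or whitespace-only values fall back to the default.
  return host ? host : DEFAULT_OLLAMA_HOST;
}
```

With such a helper in place, both Docker/Kubernetes deployments and local installs go through the same code path, differing only in configuration.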
Proposed Change
- Add an OLLAMA_HOST / base URL field to the Ollama provider card (defaulting to http://localhost:11434).
- Document OLLAMA_HOST in the self-hosting / Docker guide as a supported environment variable.
- Remove the hard restriction to MANIFEST_MODE=local.
Environment
- Image: manifestdotbuild/manifest:latest
- Deployment: Kubernetes, with Ollama running as a separate in-cluster service (http://ollama.ai.svc.cluster.local:11434)
Additional Context
This was discovered while deploying Manifest as a standalone service in a Kubernetes cluster where
Ollama runs as a separate pod. The workaround of setting MANIFEST_MODE=local + OLLAMA_HOST=<url> technically works, but it forces an unsuitable application mode. Support for Ollama Cloud API keys would bring Ollama to parity with all other providers already available in the "Connect providers" dialog.
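For what it's worth, one way to sanity-check that a remote Ollama instance is reachable from the cluster is to hit Ollama's unauthenticated GET /api/tags endpoint (the helper names below are invented; this is a sketch, not Manifest code):

```typescript
// Build the URL for Ollama's model-listing endpoint from a host/base URL.
function ollamaTagsUrl(host: string): string {
  return new URL("/api/tags", host).toString();
}

// Returns true if the host answers /api/tags with a 2xx status,
// i.e. a running Ollama server is reachable at that address.
async function checkOllamaReachable(host: string): Promise<boolean> {
  try {
    const res = await fetch(ollamaTagsUrl(host));
    return res.ok;
  } catch {
    return false; // DNS failure, refused connection, etc.
  }
}
```

This is the same check a "verify configuration" button in the Connect providers dialog could perform.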