# Feature Request: Enable Ollama Provider in All Deployment Modes
## Summary
The Ollama provider in the "Connect providers" dialog is currently restricted to
`MANIFEST_MODE=local`, showing "Only available on Manifest Local" for all other deployment modes.
This unnecessarily prevents Docker/Kubernetes users — and users of Ollama Cloud — from
connecting to Ollama without switching the entire application to local mode.
## Motivation
Ollama is no longer exclusively a local/desktop tool. There are two distinct use cases that are
currently blocked:
- Self-hosted Ollama on a remote host — users running Manifest as a Docker container or in a
Kubernetes cluster want to point it at a remote Ollama instance (e.g.
`http://ollama.internal:11434`) without enabling local mode.
- Ollama Cloud — Ollama now offers a hosted cloud service with an API key, similar to OpenAI
or OpenRouter. Users should be able to add their Ollama Cloud API key through the same
"Connect providers" UI that already supports those providers.
In both cases, `MANIFEST_MODE=local` has broader implications (database dialect, URL validation
bypass, etc.) that are undesirable in production environments.
## Current Behaviour
- The "Connect providers" dialog shows Ollama with the label "Only available on Manifest Local"
and no toggle, while every other provider (OpenAI, OpenRouter, Google, xAI, etc.) has an
enable/disable toggle.
- The Ollama host is resolved in `packages/backend/src/common/constants/ollama.ts` as:

```ts
export const OLLAMA_HOST = process.env['OLLAMA_HOST'] || 'http://localhost:11434';
```
- There is no documented way to change this endpoint or supply an API key outside of local mode.
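For illustration, the resolution above could be extended to read an optional API key alongside the host. This is a minimal sketch only: `resolveOllamaConfig` and `OLLAMA_API_KEY` are hypothetical names, not part of Manifest's current code; only `OLLAMA_HOST` and its default come from the constant shown above.

```typescript
// Hypothetical sketch: resolveOllamaConfig and OLLAMA_API_KEY are assumed
// names for illustration; OLLAMA_HOST and its default mirror
// packages/backend/src/common/constants/ollama.ts.
type Env = Record<string, string | undefined>;

export const OLLAMA_DEFAULT_HOST = 'http://localhost:11434';

export function resolveOllamaConfig(env: Env): { host: string; apiKey?: string } {
  return {
    // Honour OLLAMA_HOST in any deployment mode, falling back to the local default.
    host: env['OLLAMA_HOST'] || OLLAMA_DEFAULT_HOST,
    // An API key is only relevant for Ollama Cloud; self-hosted instances
    // typically run unauthenticated.
    apiKey: env['OLLAMA_API_KEY'],
  };
}
```

Keeping the key optional lets the same code path serve both the self-hosted and the Ollama Cloud case.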
## Expected Behaviour
- Ollama Cloud via API key — Ollama should appear in the "Connect providers → API Keys" tab
with a toggle and an API key field, like OpenAI and OpenRouter. Users with an Ollama Cloud
subscription can then enable it without any special deployment mode.
- Self-hosted via `OLLAMA_HOST` — `OLLAMA_HOST` should be a documented, first-class
environment variable usable in any deployment mode to point Manifest at a remote Ollama
instance.
- The "Only available on Manifest Local" restriction should be lifted (or at minimum, relaxed to
allow the above two paths).
- Optionally, a startup log line confirming the resolved Ollama endpoint would help operators
verify the configuration.
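The optional startup log line could look something like the sketch below; the function name and log format are assumptions for illustration, not an existing Manifest API.

```typescript
// Hypothetical sketch: format a one-line startup log so operators can
// verify which Ollama endpoint was resolved. Name and format are assumed.
export function formatOllamaStartupLog(host: string, hasApiKey: boolean): string {
  return `[ollama] resolved endpoint=${host} auth=${hasApiKey ? 'api-key' : 'none'}`;
}
```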
## Proposed Change
- Add an Ollama entry in the API Keys provider list with an optional API key field and an
`OLLAMA_HOST` / base URL field (defaulting to `http://localhost:11434`).
- Document `OLLAMA_HOST` in the self-hosting / Docker guide as a supported environment variable.
- Decouple Ollama connectivity from `MANIFEST_MODE=local`.
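As a rough sketch of the first bullet, the provider entry might carry an optional-key flag and a base-URL field. The `ProviderEntry` shape and its field names below are hypothetical, not Manifest's actual provider schema; only `OLLAMA_HOST` and its default are taken from this issue.

```typescript
// Hypothetical provider descriptor: the ProviderEntry interface and its
// field names are assumptions for illustration.
interface ProviderEntry {
  id: string;
  label: string;
  apiKeyOptional: boolean; // Ollama Cloud needs a key; self-hosted does not
  baseUrl?: { envVar: string; defaultValue: string };
}

export const ollamaEntry: ProviderEntry = {
  id: 'ollama',
  label: 'Ollama',
  apiKeyOptional: true,
  baseUrl: { envVar: 'OLLAMA_HOST', defaultValue: 'http://localhost:11434' },
};
```

Making the key optional, rather than required as for OpenAI or OpenRouter, is what distinguishes Ollama from the existing API-key providers.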
## Environment
- Deployment: Docker / Kubernetes
- Image: `manifestdotbuild/manifest:latest`
- Ollama: self-hosted on a separate host (`http://ollama.ai.svc.cluster.local:11434`)
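For context, wiring this up in the Kubernetes deployment above would be a one-line `env` entry on the Manifest container spec (illustrative fragment only, using the cluster URL from this report):

```yaml
# Illustrative fragment: set on the Manifest container spec,
# without MANIFEST_MODE=local.
env:
  - name: OLLAMA_HOST
    value: "http://ollama.ai.svc.cluster.local:11434"
```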
## Additional Context
This was discovered while deploying Manifest as a standalone service in a Kubernetes cluster where
Ollama runs as a separate pod. The workaround of setting `MANIFEST_MODE=local` +
`OLLAMA_HOST=<url>` technically works but forces an unsuitable application mode.
Support for Ollama Cloud API keys would bring Ollama to parity with all other providers already
available in the "Connect providers" dialog.