Feature Request: Enable Ollama Provider in All Deployment Modes #1488

@fabioteixei

Description

Summary

The Ollama provider in the Connect providers dialog is currently restricted to
MANIFEST_MODE=local, showing "Only available on Manifest Local" for all other deployment modes.
This unnecessarily prevents Docker/Kubernetes users — and users of Ollama Cloud — from
connecting to Ollama without switching the entire application to local mode.

Motivation

Ollama is no longer exclusively a local/desktop tool. There are two distinct use cases that are
currently blocked:

  1. Self-hosted Ollama on a remote host — users running Manifest as a Docker container or in a
    Kubernetes cluster want to point it at a remote Ollama instance (e.g.
    http://ollama.internal:11434) without enabling local mode.
  2. Ollama Cloud — Ollama now offers a hosted cloud service with an API key, similar to OpenAI
    or OpenRouter. Users should be able to add their Ollama Cloud API key through the same
    "Connect providers" UI that already supports those providers.

In both cases, MANIFEST_MODE=local has broader implications (database dialect, URL validation
bypass, etc.) that are undesirable in production environments.

Current Behaviour

  • The "Connect providers" dialog shows Ollama with the label "Only available on Manifest Local"
    and no toggle, while every other provider (OpenAI, OpenRouter, Google, xAI, etc.) has an
    enable/disable toggle.
  • The Ollama host is resolved in packages/backend/src/common/constants/ollama.ts as:
    export const OLLAMA_HOST = process.env['OLLAMA_HOST'] || 'http://localhost:11434';
  • There is no documented way to change this endpoint or supply an API key outside of local mode.

Expected Behaviour

  1. Ollama Cloud via API key — Ollama should appear in the "Connect providers → API Keys" tab
    with a toggle and an API key field, like OpenAI and OpenRouter. Users with an Ollama Cloud
    subscription can then enable it without any special deployment mode.
  2. Self-hosted via OLLAMA_HOST — OLLAMA_HOST should be a documented, first-class
    environment variable usable in any deployment mode to point Manifest at a remote Ollama
    instance.
  3. The "Only available on Manifest Local" restriction should be lifted (or at minimum, relaxed to
    allow the above two paths).
  4. Optionally, a startup log line confirming the resolved Ollama endpoint would help operators
    verify the configuration.
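
To illustrate items 2 and 4 together, here is a minimal sketch (not Manifest's actual code) of how the endpoint could be resolved the same way in every deployment mode, with a startup log line so operators can verify the configuration. The function name and log format are illustrative assumptions:

```typescript
// Sketch only: the function name and log format are assumptions,
// not Manifest's actual API.
type Env = Record<string, string | undefined>;

function resolveOllamaHost(env: Env = process.env): string {
  // Same default the current constant in ollama.ts uses.
  return env['OLLAMA_HOST'] || 'http://localhost:11434';
}

// Hypothetical startup log so operators can confirm the resolved endpoint.
console.log(`[startup] Ollama endpoint: ${resolveOllamaHost()}`);
```

With this in place, `OLLAMA_HOST=http://ollama.internal:11434` would take effect regardless of MANIFEST_MODE, and the log line would surface the value actually in use.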

Proposed Change

  • Add an Ollama entry in the API Keys provider list with an optional API key field and an
    OLLAMA_HOST / base URL field (defaulting to http://localhost:11434).
  • Document OLLAMA_HOST in the self-hosting / Docker guide as a supported environment variable.
  • Decouple Ollama connectivity from MANIFEST_MODE=local.
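
As a sketch of the first bullet, an Ollama entry in the API Keys provider list could look something like the following. Every field name here is an assumption mirroring what OpenAI/OpenRouter entries would plausibly need, not Manifest's actual schema:

```typescript
// Hypothetical provider-list entry; field names are assumptions,
// not Manifest's actual configuration schema.
interface ProviderEntry {
  id: string;
  label: string;
  apiKeyOptional: boolean;    // optional: Ollama Cloud key, or none when self-hosted
  baseUrlEnvVar?: string;     // environment variable that overrides the base URL
  defaultBaseUrl?: string;
  requiresLocalMode: boolean; // the flag this request proposes flipping to false
}

const ollamaProvider: ProviderEntry = {
  id: 'ollama',
  label: 'Ollama',
  apiKeyOptional: true,
  baseUrlEnvVar: 'OLLAMA_HOST',
  defaultBaseUrl: 'http://localhost:11434',
  requiresLocalMode: false,
};
```

The key design point is that the API key is optional: self-hosted users supply only a base URL via OLLAMA_HOST, while Ollama Cloud users supply a key, and neither path requires MANIFEST_MODE=local.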

Environment

  • Deployment: Docker / Kubernetes
  • Image: manifestdotbuild/manifest:latest
  • Ollama: self-hosted on a separate host (http://ollama.ai.svc.cluster.local:11434)

Additional Context

This was discovered while deploying Manifest as a standalone service in a Kubernetes cluster where
Ollama runs as a separate pod. The workaround of setting MANIFEST_MODE=local +
OLLAMA_HOST=<url> technically works but forces an unsuitable application mode.
Support for Ollama Cloud API keys would bring Ollama to parity with all other providers already
available in the "Connect providers" dialog.
