
# Using Ollama


## Simple Setup

The easiest way to use Ollama out of the box with coco is to add or update the `service` option in your coco config, setting it to `ollama`.
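A minimal sketch of the shorthand form is shown below; the `defaultBranch` and `mode` keys are carried over from the full example later on this page and are not required for the shorthand itself:

```json
{
  "defaultBranch": "main",
  "mode": "interactive",
  "service": "ollama"
}
```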

## Full Customization

If you want to go deeper than the shorthand aliases `openai` and `ollama`, you can tinker with the underlying model, temperature, endpoint, or other settings available in the `OpenAILLMService` or `OllamaLLMService` interfaces by providing a full service object instead.

### Example of a Service Object

```json
{
  "defaultBranch": "main",
  "mode": "interactive",
  "service": {
    "provider": "ollama",
    "model": "qwen2.5-coder:latest",
    "endpoint": "http://localhost:11434",
    "maxConcurrent": 1,
    "tokenLimit": 2024,
    "temperature": 0.4,
    "maxParsingAttempts": 3,
    "authentication": {
      "type": "None"
    }
  }
}
```
