Implement Ollama provider for LLM abstraction #209

@bhouston

Description

Implement Ollama Provider

We need to add support for the Ollama API to the MyCoder LLM abstraction. Ollama is an open-source framework for running large language models locally.

Requirements:

  • Create an Ollama provider implementation in packages/agent/src/core/llm/providers/ollama.ts
  • Support configurable endpoint URL (default to http://localhost:11434)
  • Support both chat and completion API endpoints
  • Handle tool/function calling
  • Register the provider in the LLM provider registry
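To make the requirements concrete, here is a minimal sketch of what the provider skeleton in `ollama.ts` could look like. The interface names, the `ChatMessage` shape, and the default model are assumptions for illustration, not the actual MyCoder types; only the Ollama endpoint paths (`/api/chat`, `/api/generate`) and default port 11434 come from the Ollama API itself.

```typescript
// Hypothetical sketch; ChatMessage/OllamaOptions are assumed shapes,
// not the real MyCoder LLMProvider types.
interface ChatMessage {
  role: "system" | "user" | "assistant" | "tool";
  content: string;
}

interface OllamaOptions {
  baseUrl?: string; // explicit endpoint override
  model?: string;
}

const DEFAULT_BASE_URL = "http://localhost:11434";

class OllamaProvider {
  readonly baseUrl: string;
  readonly model: string;

  constructor(options: OllamaOptions = {}) {
    // Resolution order: explicit option, then OLLAMA_BASE_URL, then the default.
    const envUrl: string | undefined =
      (globalThis as any).process?.env?.OLLAMA_BASE_URL;
    this.baseUrl = (options.baseUrl ?? envUrl ?? DEFAULT_BASE_URL).replace(/\/+$/, "");
    this.model = options.model ?? "llama3.2"; // assumed default model
  }

  // Ollama serves chat at /api/chat and completion-style requests at /api/generate.
  chatUrl(): string {
    return `${this.baseUrl}/api/chat`;
  }

  generateUrl(): string {
    return `${this.baseUrl}/api/generate`;
  }

  async chat(messages: ChatMessage[]): Promise<string> {
    const res = await (globalThis as any).fetch(this.chatUrl(), {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      // stream: false asks Ollama for one JSON object instead of NDJSON chunks
      body: JSON.stringify({ model: this.model, messages, stream: false }),
    });
    if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
    const data = await res.json();
    return data.message?.content ?? "";
  }
}
```

Tool/function calling would layer on top of this by passing a `tools` array in the request body and inspecting `message.tool_calls` in the response, but the exact mapping depends on how the shared `LLMProvider` interface models tools.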

Implementation Details:

  • The provider should implement the LLMProvider interface
  • Support environment variable OLLAMA_BASE_URL for configuration
  • Follow similar patterns to existing providers like Anthropic
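For the registry step, something along these lines could work. This is a sketch under the assumption that the registry is a name-to-factory map; the real registry API in `packages/agent/src/core/llm` may differ, and `registerProvider`/`createProvider` are illustrative names, not confirmed MyCoder functions.

```typescript
// Hypothetical registry sketch; the actual MyCoder registry API is assumed.
type ProviderFactory = (options?: Record<string, unknown>) => unknown;

const providerRegistry = new Map<string, ProviderFactory>();

function registerProvider(name: string, factory: ProviderFactory): void {
  providerRegistry.set(name, factory);
}

function createProvider(name: string, options?: Record<string, unknown>): unknown {
  const factory = providerRegistry.get(name);
  if (!factory) throw new Error(`Unknown LLM provider: ${name}`);
  return factory(options);
}

// The Ollama provider would register itself alongside existing providers
// such as "anthropic"; a placeholder object stands in for the real class here.
registerProvider("ollama", (options) => ({ name: "ollama", options }));
```

With this shape, callers resolve a provider by name (`createProvider("ollama", { model: "..." })`) without importing the concrete class, which mirrors how the Anthropic provider is presumably wired up.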
