Add Ollama support to the LLM abstraction #147

Description

@bhouston

After implementing the new LLM abstraction with Anthropic support, we need to add support for Ollama models.

Tasks

  1. Research the best way to interact with Ollama directly (without Vercel AI SDK)
  2. Implement Ollama adapter for our LLM abstraction
  3. Convert between our message format and Ollama's format
  4. Handle tool calls and tool results properly
  5. Update configuration to support Ollama models
  6. Add tests for Ollama integration
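
As a rough sketch of tasks 2–4, the adapter could talk to Ollama's REST `/api/chat` endpoint directly, which takes `{ role, content }` messages and returns tool calls under `message.tool_calls` with already-parsed argument objects. The `InternalMessage` shape below is hypothetical (the issue doesn't define our message format), and the role mapping is one possible convention, not a settled design:

```typescript
// Hypothetical internal message shape -- an assumption for illustration;
// the real abstraction's types may differ.
type InternalMessage =
  | { role: "user"; content: string }
  | { role: "assistant"; content: string }
  | { role: "tool_result"; toolName: string; content: string };

// Ollama /api/chat message format: plain role + content strings.
interface OllamaMessage {
  role: "system" | "user" | "assistant" | "tool";
  content: string;
}

// Task 3: convert our messages into Ollama's format.
function toOllamaMessages(messages: InternalMessage[]): OllamaMessage[] {
  return messages.map((m): OllamaMessage => {
    switch (m.role) {
      case "user":
        return { role: "user", content: m.content };
      case "assistant":
        return { role: "assistant", content: m.content };
      case "tool_result":
        // Tool output goes back to Ollama as a "tool" role message.
        return { role: "tool", content: m.content };
    }
  });
}

// Task 4: pull tool calls out of an Ollama chat response. Ollama returns
// tool-call arguments as an object rather than a JSON string.
interface OllamaToolCall {
  function: { name: string; arguments: Record<string, unknown> };
}

function extractToolCalls(response: {
  message: { content: string; tool_calls?: OllamaToolCall[] };
}): { name: string; args: Record<string, unknown> }[] {
  return (response.message.tool_calls ?? []).map((tc) => ({
    name: tc.function.name,
    args: tc.function.arguments,
  }));
}
```

Keeping the conversion functions pure (no HTTP inside them) would make task 6 straightforward, since they can be unit-tested without a running Ollama server.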

Benefits

  • Restore support for Ollama models that was removed when reverting Vercel AI SDK
  • Enable local model usage for users who prefer not to use cloud APIs
  • Implement a cleaner, more reliable integration than the previous version
