Conversation

bhouston (Member)

Implement Ollama Provider

This PR adds support for the Ollama API to the MyCoder LLM abstraction. Ollama is an open-source framework that allows running large language models locally.

Implementation Details:

  • Created an Ollama provider implementation in packages/agent/src/core/llm/providers/ollama.ts
  • Supports a configurable endpoint URL (defaults to http://localhost:11434)
  • Supports the chat API endpoint with proper message formatting (a sketch follows this list)
  • Handles tool/function calling
  • Registered the provider in the LLM provider registry
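
As a rough sketch (not the PR's actual code) of how such a provider can map generateText onto Ollama's chat endpoint: the class and interface names below are hypothetical, while the request/response shape (model, messages, stream flag, options.temperature, and the message field on the reply) follows Ollama's documented /api/chat API.

interface Message {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

interface GenerateOptions {
  messages: Message[];
  temperature?: number;
}

// Hypothetical provider shape; the real MyCoder interfaces may differ.
class OllamaProvider {
  constructor(
    private model: string,
    private baseUrl: string = process.env.OLLAMA_BASE_URL ??
      'http://localhost:11434',
  ) {}

  async generateText(options: GenerateOptions): Promise<{ text: string }> {
    // Ollama's chat endpoint takes the model name, a message list, and
    // sampling options; stream: false requests a single JSON response.
    const res = await fetch(`${this.baseUrl}/api/chat`, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        model: this.model,
        messages: options.messages,
        stream: false,
        options: { temperature: options.temperature },
      }),
    });
    if (!res.ok) {
      throw new Error(`Ollama request failed: ${res.status} ${res.statusText}`);
    }
    const data = await res.json();
    // Non-streaming responses carry the assistant reply under `message`.
    return { text: data.message?.content ?? '' };
  }
}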

Configuration:

  • Default endpoint: http://localhost:11434
  • Can be configured via (see the sketch after this list):
    • the baseUrl option when creating the provider
    • the OLLAMA_BASE_URL environment variable
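
The resolution order below is an assumption based on the list above (an explicit option typically overrides an environment variable, which overrides the default), and the helper name is hypothetical:

// Hypothetical helper; assumes an explicit baseUrl option takes precedence
// over the OLLAMA_BASE_URL environment variable, then the documented default.
function resolveBaseUrl(baseUrl?: string): string {
  return baseUrl ?? process.env.OLLAMA_BASE_URL ?? 'http://localhost:11434';
}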

Usage Example:

import { createProvider } from './core/llm/provider.js';

const provider = createProvider('ollama', 'llama3.2', {
  baseUrl: 'http://localhost:11434',
});

const response = await provider.generateText({
  messages: [
    { role: 'user', content: 'Hello, how are you?' }
  ],
  temperature: 0.7,
});
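
Because the provider also handles tool/function calling, a call with tools might look like the sketch below. The tools option name and schema here are assumptions about MyCoder's abstraction rather than its confirmed API; Ollama's own chat endpoint accepts OpenAI-style function definitions and reports any calls the model makes under message.tool_calls.

// Hypothetical tool-calling usage; the option name and schema are assumptions.
const toolResponse = await provider.generateText({
  messages: [{ role: 'user', content: 'What is the weather in Paris?' }],
  tools: [
    {
      type: 'function',
      function: {
        name: 'get_weather', // hypothetical tool
        description: 'Look up the current weather for a city',
        parameters: {
          type: 'object',
          properties: { city: { type: 'string' } },
          required: ['city'],
        },
      },
    },
  ],
});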

Fixes #209

@bhouston bhouston closed this Mar 12, 2025
@bhouston bhouston deleted the feature/ollama-provider branch March 12, 2025 15:01