Closed
Implement Ollama Provider
We need to add support for the Ollama API to the MyCoder LLM abstraction. Ollama is an open-source framework that allows running large language models locally.
Requirements:
- Create an Ollama provider implementation in `packages/agent/src/core/llm/providers/ollama.ts`
- Support a configurable endpoint URL (default: `http://localhost:11434`)
- Support both the chat and completion API endpoints
- Handle tool/function calling
- Register the provider in the LLM provider registry
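A minimal sketch of what the provider could look like. The `Message`/`ChatRequest` shapes and method names below are assumptions (the actual MyCoder `LLMProvider` interface is not shown here); the `/api/chat` endpoint, `stream` flag, and default port come from Ollama's public API.

```typescript
// Hypothetical message shapes; the real MyCoder abstraction may differ.
interface Message {
  role: "system" | "user" | "assistant" | "tool";
  content: string;
}

interface ChatRequest {
  model: string;
  messages: Message[];
  tools?: unknown[]; // tool/function definitions, passed through to Ollama
}

class OllamaProvider {
  readonly baseUrl: string;

  constructor(baseUrl?: string) {
    // Precedence: explicit option, then OLLAMA_BASE_URL, then Ollama's default port.
    this.baseUrl =
      baseUrl ?? process.env.OLLAMA_BASE_URL ?? "http://localhost:11434";
  }

  // Build the JSON body for Ollama's /api/chat endpoint
  // (streaming disabled here to keep the sketch simple).
  buildChatBody(req: ChatRequest): Record<string, unknown> {
    return {
      model: req.model,
      messages: req.messages,
      tools: req.tools,
      stream: false,
    };
  }

  async chat(req: ChatRequest): Promise<unknown> {
    const res = await fetch(`${this.baseUrl}/api/chat`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(this.buildChatBody(req)),
    });
    if (!res.ok) {
      throw new Error(`Ollama request failed: ${res.status}`);
    }
    return res.json();
  }
}
```

A completion-style method would look the same but target `/api/generate` with a `prompt` field instead of `messages`.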
Implementation Details:
- The provider should implement the `LLMProvider` interface
- Support the `OLLAMA_BASE_URL` environment variable for configuration
- Follow patterns similar to existing providers such as Anthropic
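Registration in the provider registry might then look something like the sketch below. The registry API (`registerProvider`, a name-to-factory map) is a hypothetical stand-in, since MyCoder's actual registry is not shown in this issue; the env-var fallback mirrors the `OLLAMA_BASE_URL` requirement above.

```typescript
// Hypothetical registry: maps a provider name to a factory function.
interface OllamaProviderLike {
  baseUrl: string;
}

type ProviderFactory = (options?: { baseUrl?: string }) => OllamaProviderLike;

const providerRegistry = new Map<string, ProviderFactory>();

function registerProvider(name: string, factory: ProviderFactory): void {
  providerRegistry.set(name, factory);
}

// Stand-in provider; the real one would implement the full LLMProvider interface.
class OllamaProvider implements OllamaProviderLike {
  readonly baseUrl: string;
  constructor(options?: { baseUrl?: string }) {
    this.baseUrl =
      options?.baseUrl ?? process.env.OLLAMA_BASE_URL ?? "http://localhost:11434";
  }
}

// Register under the "ollama" key, alongside existing providers like Anthropic.
registerProvider("ollama", (options) => new OllamaProvider(options));
```

Keeping construction behind a factory lets callers resolve a provider by name at runtime while configuration (explicit option vs. environment variable) stays inside the provider itself.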