Conversation

bhouston
Member

Add Ollama Support to LLM Abstraction

This PR implements Ollama support for the MyCoder LLM abstraction, addressing issue #147.

Changes

  • Created a new Ollama provider implementation in packages/agent/src/core/llm/providers/ollama.ts
  • Registered the provider in the LLM abstraction
  • Added configuration options for Ollama:
    • Added ollamaBaseUrl to CLI config (default: http://localhost:11434)
    • Added CLI option for specifying Ollama endpoint
    • Updated the provider selection to include Ollama
  • Updated the toolAgent config to support Ollama models
  • Fixed type errors and added tests
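The provider described above can be sketched as follows. This is a minimal illustration, not MyCoder's actual code: the `Message`/`OllamaProvider` shapes and the `buildChatRequest` helper are assumptions; only the Ollama `/api/chat` endpoint and its request/response fields come from Ollama's REST API.

```typescript
// Hypothetical sketch of an Ollama provider; MyCoder's real interfaces differ.
interface Message {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

interface OllamaChatRequest {
  model: string;
  messages: Message[];
  stream: boolean;
}

// Build the body for a POST to Ollama's /api/chat endpoint.
function buildChatRequest(model: string, messages: Message[]): OllamaChatRequest {
  return { model, messages, stream: false };
}

class OllamaProvider {
  constructor(
    private model: string,
    private baseUrl: string = 'http://localhost:11434', // the default from the CLI config
  ) {}

  async generateText(messages: Message[]): Promise<string> {
    const res = await fetch(`${this.baseUrl}/api/chat`, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(buildChatRequest(this.model, messages)),
    });
    if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
    const data = await res.json();
    // Non-streaming /api/chat responses carry { message: { role, content } }.
    return data.message.content;
  }
}
```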

Testing

  • Added tests for Ollama provider
  • Ensured existing tests pass
  • Manually tested with local Ollama server
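A provider like this can be unit-tested without a live Ollama server by stubbing the global `fetch`. The sketch below is an assumption about the testing approach (MyCoder's actual test suite may use a framework and different helpers); it only illustrates the pattern.

```typescript
// Minimal provider-style call, duplicated here so the sketch is self-contained.
async function generateText(baseUrl: string, model: string, prompt: string): Promise<string> {
  const res = await fetch(`${baseUrl}/api/chat`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model,
      messages: [{ role: 'user', content: prompt }],
      stream: false,
    }),
  });
  const data = await res.json();
  return data.message.content;
}

// Stub fetch with a canned /api/chat response so no server is needed.
(globalThis as any).fetch = async () =>
  new Response(
    JSON.stringify({ message: { role: 'assistant', content: 'stubbed reply' } }),
  );
```

With the stub installed, `generateText('http://localhost:11434', 'llama3', 'hello')` resolves to the canned content, so assertions can run offline.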

Usage

Users can now use Ollama models with MyCoder:

# Set Ollama as the provider
mycoder config set provider ollama

# Set the model name
mycoder config set model llama3

# Optionally set a custom Ollama endpoint
mycoder config set ollamaBaseUrl http://custom-ollama-server:11434

# Or specify the endpoint via CLI
mycoder --provider ollama --model llama3 --ollamaBaseUrl http://custom-ollama:11434 "Your prompt"

Closes #147

@bhouston bhouston merged commit 999a649 into main Mar 12, 2025
1 check passed

🎉 This PR is included in version mycoder-agent-v1.1.0 🎉

The release is available on:

Your semantic-release bot 📦🚀


🎉 This PR is included in version mycoder-v1.1.0 🎉

The release is available on:

Your semantic-release bot 📦🚀


sentry-io bot commented Mar 13, 2025

Suspect Issues

This pull request was deployed and Sentry observed the following issues:

  • ‼️ ResponseError: json: cannot unmarshal string into Go struct field ChatRequest.messages.tool_calls.function.arguments of type api.ToolCallFunctionArguments, raised in OllamaProvider.generateText (ollama.ts)
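The Go-side unmarshal error suggests Ollama's `/api/chat` expects `tool_calls[].function.arguments` as a JSON object, while some OpenAI-style responses carry the arguments as a serialized JSON string. A hedged sketch of a guard (the `ToolCall` shape and `normalizeToolCall` name are hypothetical, not MyCoder's fix):

```typescript
// tool_calls[].function.arguments may arrive as a JSON string from
// OpenAI-compatible providers, but Ollama's Go server unmarshals it
// into a map, so a string payload triggers the ResponseError above.
interface ToolCall {
  function: {
    name: string;
    arguments: string | Record<string, unknown>;
  };
}

function normalizeToolCall(call: ToolCall): ToolCall {
  const args = call.function.arguments;
  return {
    function: {
      name: call.function.name,
      // Parse string arguments into an object before sending to Ollama.
      arguments: typeof args === 'string' ? JSON.parse(args) : args,
    },
  };
}
```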

