
Commit 26af9ed

Add RubyLLM provider for unified multi-provider LLM access
Implements a new provider backed by the ruby_llm gem, giving ActiveAgent access to 15+ LLM providers (OpenAI, Anthropic, Gemini, Bedrock, Azure, Ollama, etc.) through a single config. Uses RubyLLM's provider-level API (provider.complete()) to avoid conflicts with ActiveAgent's own conversation management and tool execution loop. Includes support for prompts, streaming, tool calling, embeddings, and ToolChoiceClearing. 44 unit tests, 5 integration tests (excluded from default rake).
1 parent f763f68 commit 26af9ed

30 files changed: +2352 −10 lines
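The commit message notes that the provider calls RubyLLM at the provider level (`provider.complete()`) rather than through RubyLLM's stateful chat object, so ActiveAgent keeps ownership of the conversation and tool loop. A minimal sketch of that division of labor, using illustrative stubs (`FakeProvider` and `SketchedRubyLlmProvider` are hypothetical names, not code from this commit):

```ruby
# Stubbed sketch: the framework owns the message history and the
# tool-execution loop; each turn is one stateless provider-level call.
class FakeProvider
  # Stands in for a RubyLLM provider; a real one would call the vendor API.
  def complete(messages)
    { role: "assistant", content: "echo: #{messages.last[:content]}" }
  end
end

class SketchedRubyLlmProvider
  def initialize(provider)
    @provider = provider
  end

  # The caller passes the full conversation each turn and appends the
  # reply itself, so no state lives inside the provider adapter.
  def generate(messages)
    @provider.complete(messages)
  end
end

provider = SketchedRubyLlmProvider.new(FakeProvider.new)
reply = provider.generate([{ role: "user", content: "hi" }])
puts reply[:content]  # echo: hi
```

Keeping the adapter stateless is what avoids the conflict the message describes: two layers both trying to manage one conversation.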

AGENTS.md

Lines changed: 7 additions & 1 deletion
@@ -224,6 +224,12 @@ bin/test test/integration/open_ai/
 - Access 200+ models through single API
 - Provider preferences via `provider: { order: [...] }`

+### RubyLLM
+- Uses `ruby_llm` gem for unified access to 15+ providers
+- RubyLLM manages its own API keys via `RubyLLM.configure`
+- Model ID determines which provider is used automatically
+- Supports prompts, embeddings, tool calling, and streaming
+
 ## Common Gotchas

 1. **Generation is lazy** - Nothing happens until `generate_now` or `prompt_later`

@@ -253,7 +259,7 @@ bin/rubocop

 - Ruby 3.1+
 - Rails 7.2+ / 8.0+ / 8.1+
-- Provider gems (optional): `openai`, `anthropic`
+- Provider gems (optional): `openai`, `anthropic`, `ruby_llm`

 ## Links

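The "manages its own API keys" bullet in the AGENTS.md diff refers to RubyLLM's global configure block, which would look roughly like this (setter names follow the ruby_llm gem's documentation; treat the exact keys and the initializer path as assumptions):

```ruby
# config/initializers/ruby_llm.rb (hypothetical location)
RubyLLM.configure do |config|
  # Keys are read by RubyLLM itself, not by ActiveAgent's YAML config.
  config.openai_api_key    = ENV["OPENAI_API_KEY"]
  config.anthropic_api_key = ENV["ANTHROPIC_API_KEY"]
end
```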
README.md

Lines changed: 7 additions & 1 deletion
@@ -39,6 +39,9 @@ bundle add openai

 # OpenRouter (uses OpenAI-compatible API)
 bundle add openai
+
+# RubyLLM (unified API for 15+ providers)
+bundle add ruby_llm
 ```

 ### Setup

@@ -119,12 +122,15 @@ development:
   ollama:
     service: "Ollama"
     model: "llama3.2"
+
+  ruby_llm:
+    service: "RubyLLM"
 ```

 ## Features

 - **Agent-Oriented Programming**: Build AI applications using familiar Rails patterns
-- **Multiple Provider Support**: Works with OpenAI, Anthropic, Ollama, and more
+- **Multiple Provider Support**: Works with OpenAI, Anthropic, Ollama, RubyLLM, and more
 - **Action-Based Design**: Define agent capabilities through actions
 - **View Templates**: Use ERB templates for prompts (text, JSON, HTML)
 - **Streaming Support**: Real-time response streaming with ActionCable

Rakefile

Lines changed: 3 additions & 1 deletion
@@ -4,7 +4,9 @@ require "rake/testtask"

 Rake::TestTask.new(:test) do |t|
   t.libs << "test"
-  t.pattern = "test/**/*_test.rb"
+  t.test_files = FileList["test/**/*_test.rb"]
+    .exclude("test/**/integration_test.rb")
+    .exclude("test/dummy/tmp/**/*")
   t.verbose = true
 end

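The Rakefile change swaps a single `t.pattern` glob for an explicit `FileList` with exclusions, which is how the integration tests stay out of the default rake run. A self-contained demonstration of how those exclusions behave, run against a throwaway directory tree (the file names below are invented for the demo):

```ruby
require "rake"
require "tmpdir"
require "fileutils"

# Build a throwaway tree, then resolve the same include/exclude patterns
# the Rakefile uses. Only the plain unit test should survive.
kept = Dir.mktmpdir do |root|
  FileUtils.mkdir_p(File.join(root, "test/unit"))
  FileUtils.mkdir_p(File.join(root, "test/integration"))
  FileUtils.mkdir_p(File.join(root, "test/dummy/tmp/cache"))
  File.write(File.join(root, "test/unit/widget_test.rb"), "")
  File.write(File.join(root, "test/integration/integration_test.rb"), "")
  File.write(File.join(root, "test/dummy/tmp/cache/junk_test.rb"), "")

  Dir.chdir(root) do
    # FileList resolves lazily, so patterns are expanded here, inside chdir.
    Rake::FileList["test/**/*_test.rb"]
      .exclude("test/**/integration_test.rb")
      .exclude("test/dummy/tmp/**/*")
      .to_a.sort
  end
end

puts kept.inspect  # ["test/unit/widget_test.rb"]
```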

activeagent.gemspec

Lines changed: 1 addition & 0 deletions
@@ -29,6 +29,7 @@ Gem::Specification.new do |spec|

   spec.add_development_dependency "anthropic", "~> 1.12"
   spec.add_development_dependency "openai", "~> 0.34"
+  spec.add_development_dependency "ruby_llm", ">= 1.0"

   spec.add_development_dependency "capybara", "~> 3.40"
   spec.add_development_dependency "cuprite", "~> 0.15"

docs/actions/embeddings.md

Lines changed: 1 addition & 0 deletions
@@ -84,3 +84,4 @@ Different models produce different embedding dimensions:
 - [Testing](/framework/testing) - Test embedding functionality
 - [OpenAI Provider](/providers/open_ai) - OpenAI embedding models
 - [Ollama Provider](/providers/ollama) - Local embedding generation
+- [RubyLLM Provider](/providers/ruby_llm) - Embeddings via RubyLLM's unified API

docs/actions/mcps.md

Lines changed: 1 addition & 0 deletions
@@ -18,6 +18,7 @@ Connect agents to external services via [Model Context Protocol](https://modelco
 | **Anthropic** | ⚠️ | Beta |
 | **OpenRouter** | 🚧 | In development |
 | **Ollama** | ❌ | Not supported |
+| **RubyLLM** | ❌ | Not supported (use provider-specific integration) |
 | **Mock** | ❌ | Not supported |

 ## MCP Format

docs/actions/structured_output.md

Lines changed: 1 addition & 0 deletions
@@ -23,6 +23,7 @@ Two JSON response formats:
 | **Anthropic** | 🟦 | 🟦 | Emulated via prompt engineering technique |
 | **OpenRouter** | 🟩 | 🟩 | Native support, depends on underlying model |
 | **Ollama** | 🟨 | 🟨 | Model-dependent, support varies by model |
+| **RubyLLM** | 🟨 | 🟨 | Depends on underlying provider/model |
 | **Mock** | 🟩 | 🟩 | Accepted but not validated or enforced |

 ## JSON Object Mode

docs/actions/tools.md

Lines changed: 1 addition & 0 deletions
@@ -23,6 +23,7 @@ The LLM calls `get_weather` automatically when it needs weather data, and uses t
 | **Anthropic** | 🟩 | 🟩 | Full support for built-in tools |
 | **OpenRouter** | 🟩 | 🟨 | Model-dependent capabilities |
 | **Ollama** | 🟩 | 🟨 | Model-dependent capabilities |
+| **RubyLLM** | 🟩 | 🟨 | Depends on underlying provider/model |
 | **Mock** | 🟦 | 🟦 | Accepted but not enforced |

 For **MCP (Model Context Protocol)** support, see the [MCP documentation](/actions/mcps).

docs/framework.md

Lines changed: 2 additions & 2 deletions
@@ -113,7 +113,7 @@ Actions call `prompt()` or `embed()` to configure requests. Callbacks manage con

 ActiveAgent integrates with Rails features and AI capabilities:

-- **[Providers](/providers)** - Swap AI services (OpenAI, Anthropic, Ollama, OpenRouter)
+- **[Providers](/providers)** - Swap AI services (OpenAI, Anthropic, Ollama, OpenRouter, RubyLLM)
 - **[Instructions](/agents/instructions)** - System prompts from templates or strings
 - **[Callbacks](/agents/callbacks)** - Lifecycle hooks for context and logging
 - **[Tools](/actions/tools)** - Agent methods as AI-callable functions

@@ -133,7 +133,7 @@ ActiveAgent integrates with Rails features and AI capabilities:
 - [Generation](/agents/generation) - Synchronous and asynchronous execution
 - [Instructions](/agents/instructions) - System prompts and behavior guidance
 - [Messages](/actions/messages) - Conversation context with multimodal support
-- [Providers](/providers) - OpenAI, Anthropic, Ollama, OpenRouter configuration
+- [Providers](/providers) - OpenAI, Anthropic, Ollama, OpenRouter, RubyLLM configuration

 **Advanced:**
 - [Tools](/actions/tools) - AI-callable Ruby methods and MCP integration

docs/framework/configuration.md

Lines changed: 2 additions & 1 deletion
@@ -206,7 +206,7 @@ Common settings available across all providers:

 | Setting | Type | Required | Description |
 |---------|------|----------|-------------|
-| `service` | String | Yes | Provider class name (OpenAI, Anthropic, OpenRouter, Ollama, Mock) |
+| `service` | String | Yes | Provider class name (OpenAI, Anthropic, OpenRouter, Ollama, RubyLLM, Mock) |
 | `access_token` / `api_key` | String | Yes* | API authentication key |
 | `model` | String | Yes* | Model identifier for the LLM to use |
 | `temperature` | Float | No | Randomness control (0.0-2.0, default varies by provider) |

@@ -220,6 +220,7 @@ Common settings available across all providers:
 - **[Ollama Provider](/providers/ollama)** - Host configuration for local instances
 - **[OpenAI Provider](/providers/open_ai)** - Organization ID, request timeout, admin token, etc.
 - **[OpenRouter Provider](/providers/open_router)** - App name, site URL, provider preferences, etc.
+- **[RubyLLM Provider](/providers/ruby_llm)** - Unified API for 15+ providers via RubyLLM
 - **[Mock Provider](/providers/mock)** - Testing-specific options

 ### Using Configured Providers
