This library began as a prototype to explore the concept quickly. It has since been superseded by a Rust-based library available at https://github.com/lexlapax/rs-llmspell. Going forward, this library will not be actively developed beyond occasional quick fixes where they are useful to someone.
Build powerful AI applications with a clean, unified interface to multiple LLM providers. Go-LLMs provides everything you need: from simple text generation to complex multi-agent workflows with built-in tools and structured outputs.
- One Library, All Providers - Switch between OpenAI, Anthropic, Google, Ollama, and more with the same code (see the sketch after this list)
- Production Readiness Mindset - Built-in error handling, rate limiting, retries, and monitoring
- Rich Tooling - 33+ built-in tools for web, files, calculations, and data processing
- Smart Agents - Create conversational AI that can use tools and coordinate with other agents
- Structured Data - Get reliable, validated JSON/XML output instead of unpredictable text
- Go Native - Designed for Go developers with clean APIs and minimal dependencies
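To make the first point concrete, here is a minimal sketch of switching providers without touching the agent code. It assumes that `provider.NewAnthropicProvider` follows the same `(apiKey, model)` constructor pattern as `provider.NewOpenAIProvider` used in the quickstart below, and that both satisfy a shared `domain.Provider` interface accepted by `core.LLMDeps`; check the provider guides for the exact signatures.

```go
// Sketch only: choose a provider at startup, keep the agent code identical.
// Assumed (not verified here): provider.NewAnthropicProvider takes (apiKey, model)
// like provider.NewOpenAIProvider, and domain.Provider is the shared interface
// accepted by core.LLMDeps.
var p domain.Provider
model := "gpt-4"
if key := os.Getenv("ANTHROPIC_API_KEY"); key != "" {
	p = provider.NewAnthropicProvider(key, "claude-3-sonnet-20240229")
	model = "claude-3-sonnet-20240229"
} else {
	p = provider.NewOpenAIProvider(os.Getenv("OPENAI_API_KEY"), "gpt-4")
}

// Everything below this point is provider-agnostic.
agent := core.NewLLMAgent("assistant", model, core.LLMDeps{Provider: p})
```

Everything after the provider selection, such as system prompts, state handling, and tools, stays the same regardless of which backend you picked.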
- Unified Provider API - OpenAI, Anthropic, Google Gemini, Vertex AI, Ollama, OpenRouter
- Built-in Tool System - Web search, file operations, calculations, APIs, and more
- Intelligent Agents - Conversational AI with memory, tools, and workflow orchestration
- Structured Outputs - JSON schema validation with automatic error recovery
- Tool Discovery - Dynamic tool exploration perfect for scripting engines
- Multimodal Content - Text, images, audio, video, and file support
- Performance Optimized - Concurrent execution, streaming, caching
```bash
go get github.com/lexlapax/go-llms
```

Get started in 5 minutes with our interactive quickstart guide.

```go
package main

import (
	"context"
	"fmt"
	"os"

	"github.com/lexlapax/go-llms/pkg/agent/core"
	"github.com/lexlapax/go-llms/pkg/llm/domain"
	"github.com/lexlapax/go-llms/pkg/llm/provider"
)

func main() {
	// Create provider
	p := provider.NewOpenAIProvider(os.Getenv("OPENAI_API_KEY"), "gpt-4")

	// Create agent
	agent := core.NewLLMAgent("assistant", "gpt-4", core.LLMDeps{Provider: p})
	agent.SetSystemPrompt("You are a helpful assistant.")

	// Chat
	state := domain.NewState()
	state.Set("user_input", "Explain quantum computing in simple terms")
	result, _ := agent.Run(context.Background(), state)
	fmt.Println(result.Get("response"))
}
```

```go
// Create smart agent with built-in tools
agent := core.NewLLMAgent("smart-assistant", "gpt-4", core.LLMDeps{Provider: p})
// Add powerful tools
agent.AddTool(web.NewWebSearchTool(webAPIKey))
agent.AddTool(file.NewFileReadTool())
agent.AddTool(calculator.NewCalculatorTool())
// Agent can now search web, read files, and calculate
state := domain.NewState()
state.Set("user_input", "Search for recent Go releases and calculate days since Go 1.21")
result, _ := agent.Run(context.Background(), state)
```

```go
// Define what you want
schema := &schema.Schema{
	Type: "object",
	Properties: map[string]*schema.Schema{
		"name":       {Type: "string"},
		"sentiment":  {Type: "string", Enum: []interface{}{"positive", "negative", "neutral"}},
		"confidence": {Type: "number"},
	},
	Required: []string{"name", "sentiment"},
}
// Get reliable structured output
agent.SetSchema(schema)
state.Set("user_input", "Analyze this review: 'Amazing product, works perfectly!'")
result, _ := agent.Run(context.Background(), state)
// Guaranteed to match your schema
data := result.Get("structured_output")
```
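If you want the validated output as a typed value, you can round-trip it through `encoding/json`. This is a sketch under the assumption that `result.Get("structured_output")` returns a JSON-compatible value (such as a decoded map); the `reviewAnalysis` struct is illustrative only.

```go
// Sketch: map the schema-validated output onto a typed struct.
// Assumes result.Get("structured_output") returns a JSON-compatible value
// (e.g. a decoded map). Requires the standard library "encoding/json" package.
type reviewAnalysis struct {
	Name       string  `json:"name"`
	Sentiment  string  `json:"sentiment"`
	Confidence float64 `json:"confidence"`
}

raw, err := json.Marshal(result.Get("structured_output"))
if err != nil {
	panic(err)
}
var analysis reviewAnalysis
if err := json.Unmarshal(raw, &analysis); err != nil {
	panic(err)
}
fmt.Printf("%s review of %s (confidence %.2f)\n", analysis.Sentiment, analysis.Name, analysis.Confidence)
```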
```go
// Create specialized agents
extractor := core.NewLLMAgent("extractor", "gpt-4", core.LLMDeps{Provider: p})
analyzer := core.NewLLMAgent("analyzer", "claude-3-sonnet-20240229", core.LLMDeps{Provider: claude})
// Orchestrate with workflows
workflow := workflow.NewSequentialAgent("data-pipeline", []domain.BaseAgent{
	extractor, // First: extract data from text
	analyzer,  // Second: analyze extracted data
})
// Process data through the pipeline
state.Set("document", "Large document content...")
result, _ := workflow.Run(context.Background(), state)
```

- Complete Documentation Hub - Start here for everything
- 5-Minute Quickstart - Get running immediately
- User Guide - Task-oriented guides with 5 learning paths
- Technical Documentation - Architecture and implementation details
- Beginner Path - 5 simple projects to get started
- Developer Path - Build production applications
- Architect Path - Design robust systems
- Production Path - Deploy and scale
- 80+ Working Examples - Provider, agent, tool, and workflow examples
- Built-in Tools Reference - Complete guide to 33+ built-in tools
- Tool Usage Examples - Practical patterns and integration examples
- Chat Applications Guide - Build conversational AI
- Data Extraction Guide - Reliable data processing
- Agent Communication - Multi-agent coordination
OpenAI • Anthropic • Google Gemini • Google Vertex AI • Ollama • OpenRouter
| Provider | Best For | Models | Setup |
|---|---|---|---|
| OpenAI | General use, reliability | GPT-4o, GPT-4 Turbo, GPT-4o-mini | Guide |
| Anthropic | Analysis, reasoning | Claude 3.5 Sonnet, Claude 3 Opus | Guide |
| Google Gemini | Multimodal, speed | Gemini 2.0 Flash Lite, Gemini Pro | Guide ** |
| Vertex AI ** | Enterprise, compliance | Gemini + partner models | Guide |
| Ollama | Local hosting, privacy | Llama, Mistral, CodeLlama | Guide |
| OpenRouter | Model variety, cost | 400+ models (68 free) | Guide |
** Untested integration
See our provider comparison guide for detailed feature matrices.
Comprehensive documentation and quality assurance improvements:
- Complete Godoc Coverage - Enhanced 300+ Go files with comprehensive documentation
- Documentation Automation - Automated validation, link fixing, and content consistency
- Structural Reorganization - 95+ documentation files with proper navigation and learning paths
- Enhanced User Experience - Complete user guide with visual diagrams and task-oriented guides
- API Consistency - Updated 750+ code examples to current v0.3.5+ standards
- Technical Reference - Advanced tool patterns, integration guides, and architectural documentation
Comprehensive bridge integration for go-llmspell and other scripting engines:
- Schema Repositories with versioning and persistence
- Enhanced Error Handling with serializable errors and recovery strategies
- Event System with serialization, filtering, and replay capabilities
- Tool Discovery with metadata-first exploration (33+ built-in tools)
- Workflow Serialization with templates and script-based execution
- Testing Infrastructure with mocks, fixtures, and comprehensive scenarios
Full release history in CHANGELOG.md.
Go-LLMs uses a clean, modular architecture designed for reliability and extensibility:
```
pkg/
├── llm/         # Provider implementations and domain types
├── agent/       # Intelligent agents, tools, workflows, events
├── schema/      # JSON Schema validation and type conversion
├── structured/  # Output parsing with error recovery
├── errors/      # Serializable error system with recovery
└── testutils/   # Comprehensive testing infrastructure
```
Design Principles:
- Unified Interfaces - Same API across all providers and components
- Fail-Safe Defaults - Graceful degradation and automatic error recovery
- Type Safety - Strong typing with schema validation throughout
- Performance First - Concurrent execution, streaming, and efficient state management
- Bridge Friendly - JSON-serializable types for scripting engine integration (see the sketch below)
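As a small illustration of the bridge-friendly principle, the schema type from the structured-output example can be serialized to JSON, handed to a scripting engine, and rebuilt on the other side. This is a sketch that assumes `schema.Schema` marshals cleanly with `encoding/json`; the exact JSON field names depend on the library's struct tags.

```go
// Sketch: round-trip a schema definition as JSON, e.g. to hand it to a
// scripting engine over a bridge. Uses only the schema.Schema fields shown
// in the structured-output example; exact JSON field names depend on the
// library's struct tags. Requires "encoding/json".
s := &schema.Schema{
	Type: "object",
	Properties: map[string]*schema.Schema{
		"name": {Type: "string"},
	},
	Required: []string{"name"},
}

encoded, _ := json.Marshal(s) // serialize for the bridge
fmt.Println(string(encoded))

var decoded schema.Schema
_ = json.Unmarshal(encoded, &decoded) // rebuild on the other side
```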
```bash
# Install
go get github.com/lexlapax/go-llms

# Try the quickstart
export OPENAI_API_KEY="your-key-here"
go run docs/user-guide/getting-started/quickstart.go
```

Choose your path:
- New to AI? β 5-Minute Quickstart
- Build apps? β Chat Application Guide
- Production use? β Enterprise Deployment
- Contributing? β Technical Documentation
- GitHub Issues - Bug reports and feature requests
- Discussions - Questions and community
- Contributing Guide - Development and contribution guidelines
- Documentation Style Guide - Standards for code documentation
- Changelog - Complete version history
- Actively Maintained - Regular updates and improvements
- Comprehensive Testing - 280+ tests with >85% coverage
- Complete Documentation - User guides, API docs, examples
- Bridge Compatible - Ready for scripting engine integration
License: MIT - see LICENSE for details