
Add multi-provider AI support (local servers, OpenRouter)#2

Open
JasonDoug wants to merge 3 commits into benjaminshoemaker:main from JasonDoug:main

Conversation

@JasonDoug

Summary

Adds support for two additional AI providers alongside the existing OpenAI integration, with no breaking changes to existing configuration.

  • Local servers (Ollama, LM Studio, any OpenAI-compatible /v1 endpoint) via AI_PROVIDER=local
  • OpenRouter via AI_PROVIDER=openrouter, supporting hundreds of hosted models with a single API key
  • A shared app/utils/ai-provider.ts utility (getModel / getOptionsModel) replaces direct openai() calls in all three API routes, so future provider additions touch only one file
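
The selection logic behind such a factory can be sketched as follows. This is a hypothetical model of the env-to-endpoint mapping only, not the PR's actual code: the real app/utils/ai-provider.ts returns AI SDK model instances, and the default model name and OpenRouter base URL shown here are assumptions.

```typescript
// Sketch: map AI_PROVIDER to an endpoint and model (assumed defaults).
type Env = Record<string, string | undefined>;

interface ProviderConfig {
  provider: "openai" | "local" | "openrouter";
  baseURL: string;
  model: string;
}

function resolveProviderConfig(env: Env): ProviderConfig {
  const provider = env.AI_PROVIDER ?? "openai";
  const model = env.OPENAI_MODEL ?? "gpt-4o-mini"; // assumed default

  switch (provider) {
    case "local":
      // Ollama, LM Studio, or any OpenAI-compatible /v1 endpoint
      return {
        provider,
        baseURL: env.LOCAL_AI_BASE_URL ?? "http://localhost:11434/v1",
        model,
      };
    case "openrouter":
      return { provider, baseURL: "https://openrouter.ai/api/v1", model };
    default:
      return { provider: "openai", baseURL: "https://api.openai.com/v1", model };
  }
}
```

Because the routes only ever ask this one module for a model, adding a fourth provider means adding one more case here rather than touching each API route.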

Options generation (structured output)

The multiple-choice options feature uses streamObject() with a JSON schema, which requires structured output support that not all models provide. A separate OPTIONS_PROVIDER env var lets users route options generation to a different provider than chat — e.g. use a local model for chat while keeping OpenAI for reliable structured output.

Per-provider options model vars: OPENAI_OPTIONS_MODEL, LOCAL_OPTIONS_MODEL, OPENROUTER_OPTIONS_MODEL.
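
The resolution order can be sketched like this; it is a hypothetical reading of the behaviour described above, and the fallback model names are illustrative assumptions, not the defaults the PR actually ships.

```typescript
// Sketch: OPTIONS_PROVIDER overrides AI_PROVIDER for structured output,
// and each provider reads only its own *_OPTIONS_MODEL variable.
type Env = Record<string, string | undefined>;

function resolveOptionsModel(env: Env): { provider: string; model: string } {
  const provider = env.OPTIONS_PROVIDER ?? env.AI_PROVIDER ?? "openai";
  switch (provider) {
    case "local":
      return { provider, model: env.LOCAL_OPTIONS_MODEL ?? "llama3.2:latest" };
    case "openrouter":
      return { provider, model: env.OPENROUTER_OPTIONS_MODEL ?? "openai/gpt-4o-mini" };
    default:
      // Unknown values warn and fall back to OpenAI in the real utility
      return { provider: "openai", model: env.OPENAI_OPTIONS_MODEL ?? "gpt-4o-mini" };
  }
}
```

With AI_PROVIDER=local and OPTIONS_PROVIDER=openai, chat resolves to the local endpoint while options resolve to OPENAI_OPTIONS_MODEL, which is the mixed configuration shown below.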

Robustness

  • Throws immediately with a clear message when OPENROUTER_API_KEY is missing, rather than passing "" to the SDK and failing at call time
  • Warns on unrecognized AI_PROVIDER / OPTIONS_PROVIDER values (typo detection) instead of silently falling back to OpenAI
  • Each provider's options model falls back only to its own provider-appropriate default — no cross-provider model name leakage
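
The first two checks can be sketched as a small validation helper; checkProviderEnv is an assumed name for illustration, not the PR's actual helper.

```typescript
// Sketch: warn on unrecognized provider values, fail fast on a missing key.
type ProviderEnv = Record<string, string | undefined>;

const KNOWN_PROVIDERS = new Set(["openai", "local", "openrouter"]);

function checkProviderEnv(env: ProviderEnv): string[] {
  const warnings: string[] = [];

  for (const name of ["AI_PROVIDER", "OPTIONS_PROVIDER"]) {
    const value = env[name];
    if (value !== undefined && !KNOWN_PROVIDERS.has(value)) {
      // Typo detection: surface the bad value instead of silently using OpenAI
      warnings.push(`Unrecognized ${name} "${value}"; falling back to openai`);
    }
  }

  const usesOpenRouter =
    env.AI_PROVIDER === "openrouter" || env.OPTIONS_PROVIDER === "openrouter";
  if (usesOpenRouter && !env.OPENROUTER_API_KEY) {
    // Fail fast rather than passing "" to the SDK and erroring at call time
    throw new Error("OPENROUTER_API_KEY is required when using openrouter");
  }

  return warnings;
}
```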

Files changed

  • app/utils/ai-provider.ts: new shared provider factory with getModel and getOptionsModel
  • app/api/chat/route.ts: use getModel() instead of openai()
  • app/api/generate-doc/route.ts: use getModel() instead of openai()
  • app/api/generate-options/route.ts: use getOptionsModel() instead of openai()
  • .env.example: full provider documentation with recommended models and examples
  • CLAUDE.md: added (project guidance for Claude Code)

Configuration examples

Local (Ollama):

AI_PROVIDER=local
LOCAL_AI_BASE_URL=http://localhost:11434/v1
OPENAI_MODEL=llama3.2:latest

OpenRouter:

AI_PROVIDER=openrouter
OPENROUTER_API_KEY=sk-or-...
OPENAI_MODEL=anthropic/claude-3-5-sonnet

Mixed — local chat, OpenAI options:

AI_PROVIDER=local
OPTIONS_PROVIDER=openai
OPENAI_API_KEY=sk-...
LOCAL_AI_BASE_URL=http://localhost:11434/v1
OPENAI_MODEL=llama3.2:latest

Test plan

  • AI_PROVIDER=openai — existing behaviour unchanged, all tests pass
  • AI_PROVIDER=local with Ollama — chat and document generation stream correctly
  • AI_PROVIDER=openrouter — chat and document generation stream correctly
  • Missing OPENROUTER_API_KEY throws a clear error at request time
  • OPTIONS_PROVIDER=openai while AI_PROVIDER=local — options use OpenAI, chat uses local
  • Unrecognised provider value logs a warning and falls back to OpenAI
  • npm test passes

🤖 Generated with Claude Code

JasonDoug and others added 3 commits on February 20, 2026 at 02:09
Introduces a shared ai-provider utility with getModel() and getOptionsModel()
so chat, generation, and options routes can use any OpenAI-compatible provider.

- AI_PROVIDER selects the backend: openai (default), local, or openrouter
- Local provider supports Ollama, LM Studio, and any /v1-compatible server
- OPTIONS_PROVIDER lets options generation use a different provider than chat,
  since structured output (JSON schema) requires capable model support
- Per-provider options model vars: LOCAL_OPTIONS_MODEL, OPENROUTER_OPTIONS_MODEL
- .env.example fully documented with recommended models and examples
- CLAUDE.md updated to reflect new provider architecture

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
…ning

- Extract buildLocalClient/buildOpenRouterClient private helpers to eliminate
  duplicated createOpenAI setup between getModel and getOptionsModel
- Throw early with a clear message when OPENROUTER_API_KEY is missing instead
  of passing an empty string and deferring failure to the API call
- Remove cross-provider OPENAI_MODEL fallback from LOCAL_OPTIONS_MODEL and
  OPENROUTER_OPTIONS_MODEL; each provider now falls back to its own default
- Warn on unrecognized AI_PROVIDER / OPTIONS_PROVIDER values instead of
  silently falling back to OpenAI
- Update CLAUDE.md mock pattern to target @/app/utils/ai-provider since routes
  no longer call @ai-sdk/openai directly

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Add multi-provider AI support (local, OpenRouter, OpenAI)
@vercel

vercel bot commented Feb 20, 2026

@JasonDoug is attempting to deploy a commit to the benshoemakerxyz-3472's projects Team on Vercel.

A member of the Team first needs to authorize it.

