Add multi-provider AI support (local servers, OpenRouter)#2
Open
JasonDoug wants to merge 3 commits into benjaminshoemaker:main from
Conversation
Introduces a shared ai-provider utility with getModel() and getOptionsModel() so chat, generation, and options routes can use any OpenAI-compatible provider.

- AI_PROVIDER selects the backend: openai (default), local, or openrouter
- Local provider supports Ollama, LM Studio, and any /v1-compatible server
- OPTIONS_PROVIDER lets options generation use a different provider than chat, since structured output (JSON schema) requires capable model support
- Per-provider options model vars: LOCAL_OPTIONS_MODEL, OPENROUTER_OPTIONS_MODEL
- .env.example fully documented with recommended models and examples
- CLAUDE.md updated to reflect new provider architecture

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
…ning

- Extract buildLocalClient/buildOpenRouterClient private helpers to eliminate duplicated createOpenAI setup between getModel and getOptionsModel
- Throw early with a clear message when OPENROUTER_API_KEY is missing instead of passing an empty string and deferring failure to the API call
- Remove cross-provider OPENAI_MODEL fallback from LOCAL_OPTIONS_MODEL and OPENROUTER_OPTIONS_MODEL; each provider now falls back to its own default
- Warn on unrecognized AI_PROVIDER / OPTIONS_PROVIDER values instead of silently falling back to OpenAI
- Update CLAUDE.md mock pattern to target @/app/utils/ai-provider since routes no longer call @ai-sdk/openai directly

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Add multi-provider AI support (local, OpenRouter, OpenAI)
@JasonDoug is attempting to deploy a commit to the benshoemakerxyz-3472's projects Team on Vercel. A member of the Team first needs to authorize it.
Summary
Adds support for two additional AI providers alongside the existing OpenAI integration, with no breaking changes to existing configuration.
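As a rough illustration of the provider-selection and warn-on-typo behaviour this PR describes, the resolution step might look something like the following. This is a self-contained sketch, not the PR's actual code; the function name `resolveProvider` and the case-insensitive handling are assumptions.

```typescript
// Hypothetical sketch of AI_PROVIDER resolution (illustrative names, not the PR's code).
type Provider = "openai" | "local" | "openrouter";

const VALID: ReadonlySet<string> = new Set(["openai", "local", "openrouter"]);

function resolveProvider(raw: string | undefined): Provider {
  // Default to OpenAI when the variable is unset (the PR's documented default).
  if (raw === undefined || raw === "") return "openai";
  const value = raw.trim().toLowerCase();
  if (VALID.has(value)) return value as Provider;
  // Unrecognized values produce a warning (typo detection)
  // instead of silently falling back to OpenAI.
  console.warn(`Unrecognized AI_PROVIDER value "${raw}"; falling back to "openai".`);
  return "openai";
}
```

The same helper could back both `AI_PROVIDER` and `OPTIONS_PROVIDER`, since the PR gives them identical validation behaviour.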
- Local servers (Ollama, LM Studio, or anything exposing an OpenAI-compatible `/v1` endpoint) via `AI_PROVIDER=local`
- OpenRouter via `AI_PROVIDER=openrouter`, supporting hundreds of hosted models with a single API key
- A new `app/utils/ai-provider.ts` utility (`getModel` / `getOptionsModel`) replaces direct `openai()` calls in all three API routes, so future provider additions touch only one file

Options generation (structured output)
The multiple-choice options feature uses `streamObject()` with a JSON schema, which requires structured output support that not all models provide. A separate `OPTIONS_PROVIDER` env var lets users route options generation to a different provider than chat; for example, use a local model for chat while keeping OpenAI for reliable structured output.

Per-provider options model vars: `OPENAI_OPTIONS_MODEL`, `LOCAL_OPTIONS_MODEL`, `OPENROUTER_OPTIONS_MODEL`.

Robustness
- Throws early with a clear message when `OPENROUTER_API_KEY` is missing, rather than passing `""` to the SDK and failing at call time
- Warns on unrecognized `AI_PROVIDER` / `OPTIONS_PROVIDER` values (typo detection) instead of silently falling back to OpenAI

Files changed
- `app/utils/ai-provider.ts`: new utility exporting `getModel` and `getOptionsModel`
- `app/api/chat/route.ts`: uses `getModel()` instead of `openai()`
- `app/api/generate-doc/route.ts`: uses `getModel()` instead of `openai()`
- `app/api/generate-options/route.ts`: uses `getOptionsModel()` instead of `openai()`
- `.env.example`: fully documented with recommended models and examples
- `CLAUDE.md`: updated to reflect the new provider architecture

Configuration examples
Local (Ollama):
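The example block was lost in extraction; a plausible `.env` fragment for this setup follows. `LOCAL_BASE_URL`, `LOCAL_MODEL`, and the model name are illustrative assumptions; only `AI_PROVIDER` and `LOCAL_OPTIONS_MODEL` are variable names confirmed by the PR.

```shell
# Hypothetical .env settings for a local Ollama server
AI_PROVIDER=local
LOCAL_BASE_URL=http://localhost:11434/v1   # Ollama's OpenAI-compatible endpoint (assumed var name)
LOCAL_MODEL=llama3.1                       # assumed var name; any pulled model works
LOCAL_OPTIONS_MODEL=llama3.1               # options model (var name from the PR)
```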
OpenRouter:
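The original example block is missing here as well; a sketch under assumptions. `OPENROUTER_MODEL` and the model identifier are illustrative; `AI_PROVIDER`, `OPENROUTER_API_KEY`, and `OPENROUTER_OPTIONS_MODEL` are confirmed by the PR.

```shell
# Hypothetical .env settings for OpenRouter
AI_PROVIDER=openrouter
OPENROUTER_API_KEY=sk-or-...               # required; a missing key now throws early
OPENROUTER_MODEL=openai/gpt-4o-mini        # assumed var name; illustrative model id
OPENROUTER_OPTIONS_MODEL=openai/gpt-4o-mini  # options model (var name from the PR)
```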
Mixed — local chat, OpenAI options:
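A plausible fragment for the mixed setup; the `OPENAI_API_KEY` line and the model value are illustrative assumptions, while `AI_PROVIDER`, `OPTIONS_PROVIDER`, and `OPENAI_OPTIONS_MODEL` come from the PR.

```shell
# Hypothetical mixed setup: local model for chat, OpenAI for structured options
AI_PROVIDER=local
OPTIONS_PROVIDER=openai
OPENAI_API_KEY=sk-...
OPENAI_OPTIONS_MODEL=gpt-4o-mini           # illustrative model value
```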
Test plan
- `AI_PROVIDER=openai`: existing behaviour unchanged, all tests pass
- `AI_PROVIDER=local` with Ollama: chat and document generation stream correctly
- `AI_PROVIDER=openrouter`: chat and document generation stream correctly
- Missing `OPENROUTER_API_KEY` throws a clear error at request time
- `OPTIONS_PROVIDER=openai` while `AI_PROVIDER=local`: options use OpenAI, chat uses local
- `npm test` passes

🤖 Generated with Claude Code