feat: Add LLM API (llmapi.ai) as model provider #3519
aleksieienko wants to merge 1 commit into chatboxai:main
Conversation
LLM API is an OpenAI-compatible gateway providing unified access to 120+ models across OpenAI, Anthropic, Google, xAI, Alibaba, Moonshot, and more. Adds provider enum, model class, provider definition, and test entry. Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Walkthrough: The changes add support for a new LLM API provider by introducing a provider definition, a model class extending OpenAICompatible, registering the provider in the module index, adding an enum entry, and including test configuration.
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~20 minutes
🚥 Pre-merge checks: ✅ 3 passed
@coderabbitai review
✅ Actions performed: Review triggered.
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/shared/providers/definitions/llmapi.ts`:
- Around line 12-14: defaultSettings.apiHost and the LLMApi class are
inconsistent and user overrides are ignored: update createModel to pass
config.formattedApiHost into the LLMApi constructor (or adjust the LLMApi
constructor to accept apiHost), and align defaultSettings.apiHost with the class
(include /v1) or remove it if the endpoint must be locked; specifically, modify
createModel to forward config.formattedApiHost to new LLMApi(...) and update the
LLMApi constructor signature (or its internal apiHost usage) so apiHost is not
omitted, ensuring user-provided apiHost is honored.
    defaultSettings: {
      apiHost: 'https://api.llmapi.ai',
      models: [],
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# First, let's look at the llmapi provider definition file
echo "=== llmapi.ts provider definition ==="
cat -n src/shared/providers/definitions/llmapi.ts | head -30
Repository: chatboxai/chatbox
Length of output: 1245
🏁 Script executed:
#!/bin/bash
# Find the LLMApi model class to check hardcoded apiHost
echo "=== Searching for LLMApi model class ==="
# Note: fd takes --full-path (not -path) to match against the whole path
fd -e ts --full-path 'llmapi' | grep -v definitions | grep -v node_modules
Repository: chatboxai/chatbox
Length of output: 287
🏁 Script executed:
#!/bin/bash
# Check how other providers handle apiHost in createModel
echo "=== Checking other provider definitions for apiHost/formattedApiHost usage ==="
for f in $(fd -e ts . src/shared/providers/definitions/ --max-depth 1 | sort); do
filename=$(basename "$f")
if [ "$filename" != "llmapi.ts" ]; then
echo "=== $filename ==="
rg -n 'formattedApiHost|apiHost|createModel' "$f" | head -20
fi
done
Repository: chatboxai/chatbox
Length of output: 2015
🏁 Script executed:
#!/bin/bash
# Search for LLMApi model files
echo "=== Finding LLMApi model file ==="
find src -name "*llmapi*" -type f | grep -v node_modules
Repository: chatboxai/chatbox
Length of output: 187
🏁 Script executed:
#!/bin/bash
# Check the LLMApi model class constructor and apiHost
echo "=== LLMApi model class ==="
cat -n src/shared/providers/definitions/models/llmapi.ts | head -50
Repository: chatboxai/chatbox
Length of output: 1092
🏁 Script executed:
#!/bin/bash
# Verify what formattedApiHost is and how it's used by checking index or types
echo "=== Check how formattedApiHost is created/defined ==="
rg -n 'formattedApiHost' src/shared/providers --max-count 5 -B 2 -A 2
Repository: chatboxai/chatbox
Length of output: 6403
defaultSettings.apiHost is inconsistent with the hardcoded apiHost in the model class, and user overrides are silently ignored.
The provider definition sets apiHost: 'https://api.llmapi.ai' (no /v1), but the LLMApi model class hardcodes apiHost to 'https://api.llmapi.ai/v1'. More importantly, createModel does not pass config.formattedApiHost to LLMApi—unlike every other provider (openai, claude, lmstudio, ollama, gemini, etc.). The constructor explicitly omits apiHost from its options, so any user customization is impossible.
To align with other providers and allow user configuration, either:
- Pass config.formattedApiHost to the model and update the constructor to accept it, or
- Remove apiHost from defaultSettings if the endpoint is intentionally locked, or
- Update defaultSettings.apiHost to include /v1 if it should match the hardcoded value.
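A minimal sketch of the first suggested fix, forwarding the user's formattedApiHost into the model constructor with the documented endpoint as a fallback. The type and class names here (ProviderConfig, LLMApi, createModel) are simplified stand-ins for the project's actual shapes, not the real chatbox code:

```typescript
// Hypothetical simplified shapes; the real ones live in the chatbox codebase.
interface ProviderConfig {
  apiKey: string
  formattedApiHost?: string
}

const DEFAULT_API_HOST = 'https://api.llmapi.ai/v1'

class LLMApi {
  readonly apiHost: string
  constructor(options: { apiKey: string; apiHost?: string }) {
    // Honor a user-supplied host; otherwise fall back to the default (with /v1).
    this.apiHost = options.apiHost ?? DEFAULT_API_HOST
  }
}

function createModel(config: ProviderConfig): LLMApi {
  // Forward formattedApiHost instead of silently dropping it.
  return new LLMApi({ apiKey: config.apiKey, apiHost: config.formattedApiHost })
}

// A user override is now honored instead of being ignored.
const custom = createModel({ apiKey: 'k', formattedApiHost: 'https://proxy.example.com/v1' })
console.log(custom.apiHost) // https://proxy.example.com/v1
const fallback = createModel({ apiKey: 'k' })
console.log(fallback.apiHost) // https://api.llmapi.ai/v1
```

This mirrors the pattern the reviewer notes in the other providers (openai, claude, lmstudio, etc.), where createModel always threads formattedApiHost through to the model.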
Summary
Adds LLM API (llmapi.ai), an OpenAI-compatible gateway at https://api.llmapi.ai/v1, as a model provider.
Changes
- Add LLMApi = 'llmapi' to ModelProviderEnum in src/shared/types/provider.ts
- Add model class src/shared/providers/definitions/models/llmapi.ts (extends OpenAICompatible)
- Add provider definition src/shared/providers/definitions/llmapi.ts
- Register the provider in src/shared/providers/index.ts
- Add a test entry in test/integration/model-provider/model-provider.test.ts
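The wiring the changes list describes can be sketched as follows. The stubs below (ModelProviderEnum with only two members, a bare OpenAICompatible base class) are illustrative assumptions; the real enum and base class in the chatbox codebase carry many more members and options:

```typescript
// Illustrative stubs; the real definitions live in src/shared/types/provider.ts
// and the providers module.
enum ModelProviderEnum {
  OpenAI = 'openai',
  LLMApi = 'llmapi', // the new enum entry added by this PR
}

class OpenAICompatible {
  constructor(public options: { apiKey: string; apiHost: string; model: string }) {}
}

// The new model class reuses the OpenAI-compatible transport,
// pointed at the LLM API gateway endpoint.
class LLMApi extends OpenAICompatible {
  constructor(options: { apiKey: string; model: string }) {
    super({ ...options, apiHost: 'https://api.llmapi.ai/v1' })
  }
}

const m = new LLMApi({ apiKey: 'test-key', model: 'gpt-4o' })
console.log(ModelProviderEnum.LLMApi) // llmapi
console.log(m.options.apiHost) // https://api.llmapi.ai/v1
```

Because the gateway speaks the OpenAI wire protocol, extending OpenAICompatible keeps the new provider to a few lines of configuration rather than a new transport implementation.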
Test plan
- Run npm run test:model-provider with TEST_LLMAPI_API_KEY set
🤖 Generated with Claude Code