
feat: Add LLM API (llmapi.ai) as model provider#3519

Open
aleksieienko wants to merge 1 commit into chatboxai:main from aleksieienko:feat/llmapi-provider

Conversation


@aleksieienko aleksieienko commented Feb 21, 2026

Summary

  • Adds LLM API (llmapi.ai) as a native model provider
  • LLM API is a unified API gateway providing access to 120+ models across OpenAI, Anthropic, Google, xAI, Alibaba, Moonshot, and more
  • Uses OpenAI-compatible protocol at https://api.llmapi.ai/v1
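
Since the gateway speaks the OpenAI-compatible protocol, a request against it can be sketched as follows. This is a minimal illustration, not code from this PR: the base URL is taken from the summary above, while the model id and API key are placeholders.

```typescript
// Minimal sketch of an OpenAI-compatible chat request to the gateway.
// The base URL comes from the PR description; model id and key are placeholders.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant'
  content: string
}

function buildChatRequest(apiKey: string, model: string, messages: ChatMessage[]) {
  return {
    url: 'https://api.llmapi.ai/v1/chat/completions',
    init: {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${apiKey}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({ model, messages, stream: true }),
    },
  }
}

// Usage (not executed here): fetch(req.url, req.init)
const req = buildChatRequest('sk-placeholder', 'gpt-4o', [{ role: 'user', content: 'Hello' }])
console.log(req.url)
```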

Changes

  • Added LLMApi = 'llmapi' to ModelProviderEnum in src/shared/types/provider.ts
  • Created model class src/shared/providers/definitions/models/llmapi.ts (extends OpenAICompatible)
  • Created provider definition src/shared/providers/definitions/llmapi.ts
  • Added import in src/shared/providers/index.ts
  • Added test entry in test/integration/model-provider/model-provider.test.ts
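
The shape of these additions can be sketched roughly as below. `OpenAICompatible` here is a simplified stand-in for the real base class, and any option names beyond those quoted above are assumptions, not the actual chatbox source.

```typescript
// Simplified sketch of the PR's additions; not the actual chatbox internals.
enum ModelProviderEnum {
  OpenAI = 'openai',
  LLMApi = 'llmapi', // new enum member added by this PR
}

// Stand-in for the real OpenAICompatible base class.
class OpenAICompatible {
  constructor(public options: { apiKey: string; apiHost: string; model: string }) {}
}

// Model class pinning the gateway's /v1 endpoint, as described above.
class LLMApi extends OpenAICompatible {
  constructor(options: { apiKey: string; model: string }) {
    super({ ...options, apiHost: 'https://api.llmapi.ai/v1' })
  }
}

const model = new LLMApi({ apiKey: 'sk-placeholder', model: 'gpt-4o' })
console.log(ModelProviderEnum.LLMApi, model.options.apiHost)
```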

Test plan

  • Verify LLM API appears in the provider selection list
  • Configure an API key and verify model listing works
  • Send a chat message and verify response streaming works
  • Run npm run test:model-provider with TEST_LLMAPI_API_KEY set

🤖 Generated with Claude Code

Summary by CodeRabbit

  • New Features
    • LLM API is now available as a model provider option with OpenAI-compatible configuration support.
    • Users can configure LLM API with API key, temperature, top-p, max output tokens, and streaming settings.

LLM API is an OpenAI-compatible gateway providing unified access to 120+
models across OpenAI, Anthropic, Google, xAI, Alibaba, Moonshot, and more.
Adds provider enum, model class, provider definition, and test entry.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

coderabbitai bot commented Feb 21, 2026

Walkthrough

The changes add support for a new LLM API provider to the system by introducing a provider definition, a model class extending OpenAICompatible, registering the provider in the module index, adding an enum entry, and including test configuration.

Changes

Cohort / File(s) Summary
Provider Definition and Model Implementation
src/shared/providers/definitions/llmapi.ts, src/shared/providers/definitions/models/llmapi.ts
New provider definition for LLM API with OpenAI-compatible type, default API host configuration, and model creation logic. New LLMApi class extending OpenAICompatible that maps configuration parameters and stores composed options.
Provider Registry and Type Updates
src/shared/providers/index.ts, src/shared/types/provider.ts
Registers llmapi provider module import and adds LLMApi enum member to ModelProviderEnum for type recognition.
Test Configuration
test/integration/model-provider/model-provider.test.ts
Adds LLMApi provider entry to test matrix with empty model array for test coverage.
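
A test-matrix entry of the kind described above, gated on the env var named in the PR's test plan, might look like this. The field names are assumptions for illustration, not the real test file.

```typescript
// Hypothetical shape of a provider test-matrix entry; field names are assumptions.
interface ProviderTestEntry {
  provider: string
  apiKeyEnv: string
  models: string[] // empty: models are discovered from the provider at test time
}

const llmapiEntry: ProviderTestEntry = {
  provider: 'llmapi',
  apiKeyEnv: 'TEST_LLMAPI_API_KEY', // from the PR's test plan
  models: [],
}

// Integration tests for a provider are skipped unless its API key is set.
function shouldRunProvider(entry: ProviderTestEntry, env: Record<string, string | undefined>): boolean {
  return Boolean(env[entry.apiKeyEnv])
}

console.log(shouldRunProvider(llmapiEntry, { TEST_LLMAPI_API_KEY: 'sk-placeholder' }))
```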

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

Poem

🐰 A new provider hops in with flair,
LLM API through the air!
OpenAI compatible, clean and bright,
Configuration set just right,
From enum to test, the pattern's complete! 🌟

🚥 Pre-merge checks | ✅ 3 passed

  • Description check: ✅ Passed. Skipped because CodeRabbit's high-level summary is enabled.
  • Title check: ✅ Passed. The title accurately and clearly describes the main change: adding LLM API as a new model provider.
  • Docstring coverage: ✅ Passed. No functions were found in the changed files, so docstring coverage was skipped.


@aleksieienko

@coderabbitai review


coderabbitai bot commented Feb 21, 2026

✅ Actions performed

Review triggered.

Note: CodeRabbit is an incremental review system and does not re-review already reviewed commits. This command is applicable only when automatic reviews are paused.

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@src/shared/providers/definitions/llmapi.ts`:
- Around line 12-14: defaultSettings.apiHost and the LLMApi class are
inconsistent and user overrides are ignored: update createModel to pass
config.formattedApiHost into the LLMApi constructor (or adjust the LLMApi
constructor to accept apiHost), and align defaultSettings.apiHost with the class
(include /v1) or remove it if the endpoint must be locked; specifically, modify
createModel to forward config.formattedApiHost to new LLMApi(...) and update the
LLMApi constructor signature (or its internal apiHost usage) so apiHost is not
omitted, ensuring user-provided apiHost is honored.

Comment on lines +12 to +14
defaultSettings: {
apiHost: 'https://api.llmapi.ai',
models: [],

⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# First, let's look at the llmapi provider definition file
echo "=== llmapi.ts provider definition ==="
cat -n src/shared/providers/definitions/llmapi.ts | head -30

Repository: chatboxai/chatbox

Length of output: 1245


🏁 Script executed:

#!/bin/bash
# Find the LLMApi model class to check hardcoded apiHost
echo "=== Searching for LLMApi model class ==="
fd -e ts 'llmapi' | grep -v definitions | grep -v node_modules

Repository: chatboxai/chatbox

Length of output: 287


🏁 Script executed:

#!/bin/bash
# Check how other providers handle apiHost in createModel
echo "=== Checking other provider definitions for apiHost/formattedApiHost usage ==="
for f in $(fd -e ts . src/shared/providers/definitions/ --max-depth 1 | sort); do
  filename=$(basename "$f")
  if [ "$filename" != "llmapi.ts" ]; then
    echo "=== $filename ==="
    rg -n 'formattedApiHost|apiHost|createModel' "$f" | head -20
  fi
done

Repository: chatboxai/chatbox

Length of output: 2015


🏁 Script executed:

#!/bin/bash
# Search for LLMApi model files
echo "=== Finding LLMApi model file ==="
find src -name "*llmapi*" -type f | grep -v node_modules

Repository: chatboxai/chatbox

Length of output: 187


🏁 Script executed:

#!/bin/bash
# Check the LLMApi model class constructor and apiHost
echo "=== LLMApi model class ==="
cat -n src/shared/providers/definitions/models/llmapi.ts | head -50

Repository: chatboxai/chatbox

Length of output: 1092


🏁 Script executed:

#!/bin/bash
# Verify what formattedApiHost is and how it's used by checking index or types
echo "=== Check how formattedApiHost is created/defined ==="
rg -n 'formattedApiHost' src/shared/providers --max-count 5 -B 2 -A 2

Repository: chatboxai/chatbox

Length of output: 6403


defaultSettings.apiHost is inconsistent with the hardcoded apiHost in the model class, and user overrides are silently ignored.

The provider definition sets apiHost: 'https://api.llmapi.ai' (no /v1), but the LLMApi model class hardcodes apiHost to 'https://api.llmapi.ai/v1'. More importantly, createModel does not pass config.formattedApiHost to LLMApi—unlike every other provider (openai, claude, lmstudio, ollama, gemini, etc.). The constructor explicitly omits apiHost from its options, so any user customization is impossible.

To align with other providers and allow user configuration, either:

  • Pass config.formattedApiHost to the model and update the constructor to accept it, or
  • Remove apiHost from defaultSettings if the endpoint is intentionally locked, or
  • Update defaultSettings.apiHost to include /v1 if it should match the hardcoded value.
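
The first option could look roughly like this. It is a sketch against a simplified `OpenAICompatible` stand-in; the real constructor signature and config type in chatbox may differ.

```typescript
// Sketch of forwarding the user-configurable host, per the first option above.
// OpenAICompatible and the config field names are simplified stand-ins.
interface ProviderConfig {
  apiKey: string
  formattedApiHost: string // normalized user setting
  model: string
}

class OpenAICompatible {
  constructor(public options: { apiKey: string; apiHost: string; model: string }) {}
}

class LLMApi extends OpenAICompatible {
  constructor(options: { apiKey: string; apiHost?: string; model: string }) {
    super({
      ...options,
      // Honor a user-provided host; fall back to the documented default.
      apiHost: options.apiHost ?? 'https://api.llmapi.ai/v1',
    })
  }
}

function createModel(config: ProviderConfig): LLMApi {
  return new LLMApi({
    apiKey: config.apiKey,
    apiHost: config.formattedApiHost, // previously omitted, so overrides were ignored
    model: config.model,
  })
}

const m = createModel({ apiKey: 'k', formattedApiHost: 'https://proxy.example/v1', model: 'gpt-4o' })
console.log(m.options.apiHost)
```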
