32 changes: 32 additions & 0 deletions src/shared/providers/definitions/llmapi.ts
@@ -0,0 +1,32 @@
import { ModelProviderEnum, ModelProviderType } from '../../types'
import { defineProvider } from '../registry'
import LLMApi from './models/llmapi'

export const llmapiProvider = defineProvider({
  id: ModelProviderEnum.LLMApi,
  name: 'LLM API',
  type: ModelProviderType.OpenAI,
  urls: {
    website: 'https://llmapi.ai/',
  },
  defaultSettings: {
    apiHost: 'https://api.llmapi.ai',
    models: [],
Comment on lines +12 to +14
⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# First, let's look at the llmapi provider definition file
echo "=== llmapi.ts provider definition ==="
cat -n src/shared/providers/definitions/llmapi.ts | head -30

Repository: chatboxai/chatbox
🏁 Script executed:

#!/bin/bash
# Find the LLMApi model class to check hardcoded apiHost
echo "=== Searching for LLMApi model class ==="
fd -e ts 'llmapi' | grep -v definitions | grep -v node_modules

🏁 Script executed:

#!/bin/bash
# Check how other providers handle apiHost in createModel
echo "=== Checking other provider definitions for apiHost/formattedApiHost usage ==="
for f in $(fd -e ts . src/shared/providers/definitions/ --max-depth 1 | sort); do
  filename=$(basename "$f")
  if [ "$filename" != "llmapi.ts" ]; then
    echo "=== $filename ==="
    rg -n 'formattedApiHost|apiHost|createModel' "$f" | head -20
  fi
done

🏁 Script executed:

#!/bin/bash
# Search for LLMApi model files
echo "=== Finding LLMApi model file ==="
find src -name "*llmapi*" -type f | grep -v node_modules

🏁 Script executed:

#!/bin/bash
# Check the LLMApi model class constructor and apiHost
echo "=== LLMApi model class ==="
cat -n src/shared/providers/definitions/models/llmapi.ts | head -50

🏁 Script executed:

#!/bin/bash
# Verify what formattedApiHost is and how it's used by checking index or types
echo "=== Check how formattedApiHost is created/defined ==="
rg -n 'formattedApiHost' src/shared/providers --max-count 5 -B 2 -A 2

defaultSettings.apiHost is inconsistent with the hardcoded apiHost in the model class, and user overrides are silently ignored.

The provider definition sets apiHost: 'https://api.llmapi.ai' (no /v1), but the LLMApi model class hardcodes apiHost to 'https://api.llmapi.ai/v1'. More importantly, createModel does not pass config.formattedApiHost to LLMApi—unlike every other provider (openai, claude, lmstudio, ollama, gemini, etc.). The constructor explicitly omits apiHost from its options, so any user customization is impossible.

To align with other providers and allow user configuration, either:

  • Pass config.formattedApiHost to the model and update the constructor to accept it, or
  • Remove apiHost from defaultSettings if the endpoint is intentionally locked, or
  • Update defaultSettings.apiHost to include /v1 if it should match the hardcoded value.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/shared/providers/definitions/llmapi.ts` around lines 12 - 14,
defaultSettings.apiHost and the LLMApi class are inconsistent and user overrides
are ignored: update createModel to pass config.formattedApiHost into the LLMApi
constructor (or adjust the LLMApi constructor to accept apiHost), and align
defaultSettings.apiHost with the class (include /v1) or remove it if the
endpoint must be locked; specifically, modify createModel to forward
config.formattedApiHost to new LLMApi(...) and update the LLMApi constructor
signature (or its internal apiHost usage) so apiHost is not omitted, ensuring
user-provided apiHost is honored.
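The first option above can be sketched as follows. This is a minimal illustration, not the project's actual code: `OpenAICompatibleStub` is a hypothetical stand-in for the real `OpenAICompatible` base class, and only the `apiHost` handling is modeled.

```typescript
// Sketch of option 1: accept an optional apiHost and fall back to the
// current hardcoded default. OpenAICompatibleStub is a hypothetical
// stand-in for the project's OpenAICompatible base class.
interface Options {
  apiKey: string
  model: string
  apiHost?: string
}

class OpenAICompatibleStub {
  constructor(public settings: Options & { apiHost: string }) {}
}

class LLMApi extends OpenAICompatibleStub {
  public name = 'LLM API'
  constructor(options: Options) {
    // Honor a caller-supplied apiHost (e.g. config.formattedApiHost
    // forwarded by createModel), falling back to the default endpoint.
    super({ ...options, apiHost: options.apiHost ?? 'https://api.llmapi.ai/v1' })
  }
}

const byDefault = new LLMApi({ apiKey: 'k', model: 'm' })
const overridden = new LLMApi({ apiKey: 'k', model: 'm', apiHost: 'https://proxy.example/v1' })
console.log(byDefault.settings.apiHost)  // https://api.llmapi.ai/v1
console.log(overridden.settings.apiHost) // https://proxy.example/v1
```

`createModel` would then forward `config.formattedApiHost` in the options object, matching what the other provider definitions already do.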

  },
  createModel: (config) => {
    return new LLMApi(
      {
        apiKey: config.providerSetting.apiKey || '',
        model: config.model,
        temperature: config.settings.temperature,
        topP: config.settings.topP,
        maxOutputTokens: config.settings.maxTokens,
        stream: config.settings.stream,
      },
      config.dependencies
    )
  },
  getDisplayName: (modelId, providerSettings) => {
    return `LLM API (${providerSettings?.models?.find((m) => m.modelId === modelId)?.nickname || modelId})`
  },
})
28 changes: 28 additions & 0 deletions src/shared/providers/definitions/models/llmapi.ts
@@ -0,0 +1,28 @@
import OpenAICompatible, { type OpenAICompatibleSettings } from '../../../models/openai-compatible'
import type { ModelDependencies } from '../../../types/adapters'

interface Options extends OpenAICompatibleSettings {}

export default class LLMApi extends OpenAICompatible {
  public name = 'LLM API'
  public options: Options
  constructor(options: Omit<Options, 'apiHost'>, dependencies: ModelDependencies) {
    const apiHost = 'https://api.llmapi.ai/v1'
    super(
      {
        apiKey: options.apiKey,
        apiHost,
        model: options.model,
        temperature: options.temperature,
        topP: options.topP,
        maxOutputTokens: options.maxOutputTokens,
        stream: options.stream,
      },
      dependencies
    )
    this.options = {
      ...options,
      apiHost,
    }
  }
}
1 change: 1 addition & 0 deletions src/shared/providers/index.ts
@@ -16,6 +16,7 @@ import './definitions/lmstudio'
import './definitions/azure'
import './definitions/groq'
import './definitions/xai'
import './definitions/llmapi'
import './definitions/mistral-ai'
import './definitions/perplexity'
import './definitions/volcengine'
1 change: 1 addition & 0 deletions src/shared/types/provider.ts
@@ -18,6 +18,7 @@ export enum ModelProviderEnum {
  LMStudio = 'lm-studio',
  Perplexity = 'perplexity',
  XAI = 'xAI',
  LLMApi = 'llmapi',
  OpenRouter = 'openrouter',
  Custom = 'custom',
}
1 change: 1 addition & 0 deletions test/integration/model-provider/model-provider.test.ts
@@ -70,6 +70,7 @@ const PROVIDER_TEST_MODELS: Record<ModelProvider, ProviderModelInfo[]> = {
    { modelId: 'anthropic/claude-haiku-4.5', capabilities: ['tool_use'] },
    { modelId: 'deepseek/deepseek-v3.2', capabilities: ['tool_use', 'reasoning'] },
  ],
  [ModelProviderEnum.LLMApi]: [],
  [ModelProviderEnum.Perplexity]: [],
  [ModelProviderEnum.Custom]: [],
}