Conversation

@hakancog

No description provided.

@lodos2005

I added support for LiteLLM / local LLM endpoints as an additional provider.

The main motivation is to enable privacy-friendly local inference (and flexibility for self-hosted setups) without changing the rest of the app's usage patterns.

I pushed this quickly; when you have a moment, I'd appreciate a review of the approach and any edge cases I might have missed.

Copilot AI left a comment

Pull request overview

This pull request adds multi-provider support to allow users to choose between OpenRouter and Local AI/LiteLLM providers for API calls. The changes enable configuration of self-hosted LLMs like Ollama or LiteLLM proxies as alternatives to OpenRouter.

  • Added new ApiProvider type and LocalAiConfig interface to support multiple API providers
  • Updated service layer to handle different API endpoints and authentication based on provider selection
  • Enhanced UI with provider selection and provider-specific configuration fields
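
For context, a rough sketch of what the new types could look like is below. The field layout is an assumption for illustration; only the 'openrouter' and 'localai' provider values are confirmed by the diff, and the PR's actual definitions may differ.

export type ApiProvider = 'openrouter' | 'localai';

export interface LocalAiConfig {
  baseUrl: string;  // e.g. a LiteLLM proxy or Ollama's OpenAI-compatible endpoint
  model: string;    // model name as exposed by the local server
  apiKey?: string;  // optional; many local setups don't require one
}

// ApiOptions gains the provider selection and, for Local AI, an overridable base URL.
export interface ApiOptions {
  provider: ApiProvider;
  model: string;
  apiKey?: string;
  baseUrl?: string; // used only when provider === 'localai'
}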

Reviewed changes

Copilot reviewed 7 out of 7 changed files in this pull request and generated 5 comments.

Summary per file:

  • types.ts: Adds ApiProvider type, LocalAiConfig interface, and extends ApiOptions with provider and baseUrl fields
  • services/Service.ts: Implements provider-specific API URL resolution, conditional authentication, and multi-provider support in API calls
  • hooks/useApiOptions.ts: Updates the hook to determine API readiness and build options based on the selected provider
  • contexts/SettingsProvider.tsx: Adds state management for API provider selection and Local AI configuration with localStorage persistence
  • constants.ts: Defines default Local AI settings and provider labels for UI display
  • components/SettingsModal.tsx: Implements provider selection UI with conditional rendering of provider-specific settings forms
  • dockerizer.sh: Improves Docker Compose command detection to support both docker compose and docker-compose
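
The provider-specific URL resolution and conditional authentication described for services/Service.ts might look roughly like the sketch below. The function names and constant values are assumptions, not the PR's code (DEFAULT_LOCAL_AI_URL mirrors the constant referenced in constants.ts; the OpenRouter endpoint shown is not taken from the diff).

// Sketch only, not the PR's implementation.
const DEFAULT_LOCAL_AI_URL = 'http://localhost:4000/v1/chat/completions';
const OPENROUTER_API_URL = 'https://openrouter.ai/api/v1/chat/completions';

function resolveApiUrl(options: ApiOptions): string {
  // Local AI uses the configured base URL (falling back to the default);
  // OpenRouter always uses its hosted endpoint.
  return options.provider === 'localai'
    ? (options.baseUrl || DEFAULT_LOCAL_AI_URL)
    : OPENROUTER_API_URL;
}

function buildHeaders(options: ApiOptions): Record<string, string> {
  const headers: Record<string, string> = { 'Content-Type': 'application/json' };
  // The Authorization header is sent only when a key is configured,
  // which lets keyless local endpoints work unchanged.
  if (options.apiKey) {
    headers['Authorization'] = 'Bearer ' + options.apiKey;
  }
  return headers;
}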

const [localApiProvider, setLocalApiProvider] = useState<ApiProvider>(globalApiProvider);
const [localApiKeys, setLocalApiKeys] = useState<ApiKeys>(globalApiKeys);
const [localOpenRouterModel, setLocalOpenRouterModel] = useState(globalOpenRouterModel);
const [localLocalAiConfig, setLocalLocalAiConfig] = useState<LocalAiConfig>(globalLocalAiConfig);

Copilot AI Dec 28, 2025

The variable name 'localLocalAiConfig' is redundant and confusing - 'local' appears twice. Consider renaming to 'localAiConfigState' or simply 'aiConfig' to improve readability.

}

if (provider !== 'localai' && !apiKey) {
return { success: false, error: 'API Key is required.' };

Copilot AI Dec 28, 2025

The error message 'API Key is required.' is generic and doesn't specify which provider requires it. When this error is shown, users might not understand the context. Consider making it more specific, such as 'API Key is required for OpenRouter provider.'

Suggested change
return { success: false, error: 'API Key is required.' };
return { success: false, error: `API Key is required for ${provider} provider.` };

Comment on lines +454 to 456
if (provider === 'openrouter' && apiKey && !apiKey.startsWith('sk-or-')) {
return { success: false, error: 'Invalid OpenRouter API key format. It should start with "sk-or-".' };
}

Copilot AI Dec 28, 2025

The validation logic checks if the API key starts with 'sk-or-' only when an apiKey is provided, but this validation is now conditional on the provider being 'openrouter'. However, if a user switches from LocalAI to OpenRouter with an empty key, then enters an invalid key format, the validation won't trigger until they click "Test API Connection". Consider moving this validation to run whenever the openrouter key changes, not just during API testing.
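
A minimal sketch of that suggestion, assuming the modal's existing localApiKeys, setIsKeyValidated, and setTestResult state and React's useState in scope (the handler and keyFormatError names are hypothetical):

// Validate the key format as it is typed, instead of only on "Test API Connection".
const [keyFormatError, setKeyFormatError] = useState<string | null>(null);

const handleOpenRouterKeyChange = (value: string) => {
  setLocalApiKeys(prev => ({ ...prev, openrouter: value }));
  setIsKeyValidated(false);
  setTestResult(null);
  setKeyFormatError(
    value && !value.startsWith('sk-or-')
      ? 'Invalid OpenRouter API key format. It should start with "sk-or-".'
      : null
  );
};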

setIsKeyValidated(false);
setTestResult(null);
}}
placeholder={DEFAULT_LOCAL_AI_URL}

Copilot AI Dec 28, 2025

The placeholder text shows 'http://localhost:4000/v1/chat/completions' as an example but doesn't clarify that this is for LiteLLM specifically. Ollama typically uses port 11434 (as shown in the help text below). Consider making the placeholder more generic or updating it to reflect the most common use case.

Suggested change
placeholder={DEFAULT_LOCAL_AI_URL}
placeholder="http://localhost:11434/v1/chat/completions"

placeholder="Leave empty if not required"
className="w-full pl-4 pr-12 py-2 bg-control-bg border-2 border-control-border rounded-lg text-text-primary focus:ring-2 focus:ring-purple-500 focus:border-purple-500 focus:outline-none transition-colors"
/>
{isKeyValidated && (

Copilot AI Dec 28, 2025

The conditional check on line 203 requires both 'isKeyValidated' and 'localApiKeys.openrouter', but the check on line 296 for the localai provider only requires 'isKeyValidated'. This inconsistency means the validation checkmark appears for LocalAI even when no API key is provided (which is valid for LocalAI) and may confuse users. Consider either hiding the checkmark for LocalAI when no key is provided, or making the display logic consistent between providers.

Suggested change
{isKeyValidated && (
{isKeyValidated && localApiKeys.localai && (

@Acorzo1983 (Contributor) left a comment

Thanks @lodos2005! This looks good. Could you rename localLocalAiConfig to aiConfigState for better readability? Otherwise ready to merge.
