
Commit 9f0b5f9

Merge pull request #54 from editor-code-assistant/feature/support-openai-chat-api

Add `openai-chat` api for custom providers

2 parents: 5c28411 + 19147da

File tree: 6 files changed, +490 −7 lines

CHANGELOG.md

Lines changed: 3 additions & 0 deletions

```diff
@@ -4,6 +4,9 @@
 - Change api for custom providers to support `openai-responses` instead of just `openai`, still supporting `openai` only.
 - Add limit to repoMap with default of 800 total entries and 50 per dir. #35
+- Add support for OpenAI Chat Completions API for broad third-party model support.
+  - A new `openai-chat` custom provider `api` type was added to support any provider using the standard OpenAI `/v1/chat/completions` endpoint.
+  - This enables easy integration with services like OpenRouter, Groq, DeepSeek, Together AI, and local LiteLLM instances.
 
 ## 0.27.0
```

docs/configuration.md

Lines changed: 3 additions & 3 deletions

```diff
@@ -167,10 +167,10 @@ interface Config {
   mcpServers: {[key: string]: {
     command: string;
     args?: string[];
-    disabled?: boolean;
+    disabled?: boolean;
   }};
   customProviders: {[key: string]: {
-    api: 'openai' | 'anthropic';
+    api: 'openai-responses' | 'openai-chat' | 'anthropic';
     models: string[];
     defaultModel?: string;
     url?: string;
@@ -214,7 +214,7 @@ interface Config {
   "rules" : [],
   "commands" : [],
   "nativeTools": {"filesystem": {"enabled": true},
-                  "shell": {"enabled": true,
+                  "shell": {"enabled": true,
                   "excludeCommands": []}},
   "disabledTools": [],
   "toolCall": {
```
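The widened `api` union in the `Config` interface can be exercised with a minimal config fragment like the following; the provider name, URL, and model are illustrative, not part of this commit:

```json
{
  "customProviders": {
    "groq": {
      "api": "openai-chat",
      "url": "https://api.groq.com/openai/v1",
      "keyEnv": "GROQ_API_KEY",
      "models": ["llama-3.1-70b-versatile"]
    }
  }
}
```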

docs/models.md

Lines changed: 56 additions & 4 deletions

````diff
@@ -98,9 +98,25 @@ ECA allows you to configure custom LLM providers that follow API schemas similar
 - Custom company LLM endpoints
 - Additional cloud providers not natively supported
 
+### API Types for Custom Providers
+
+When configuring custom providers, choose the appropriate API type:
+
+- **`openai-responses`**: OpenAI's new responses API endpoint (`/v1/responses`). Best for OpenAI models with enhanced features like reasoning and web search.
+- **`openai-chat`**: Standard OpenAI Chat Completions API (`/v1/chat/completions`). Use this for most third-party providers:
+  - OpenRouter
+  - DeepSeek
+  - Together AI
+  - Groq
+  - Local LiteLLM servers
+  - Any OpenAI-compatible provider
+- **`anthropic`**: Anthropic's native API for Claude models.
+
+Most third-party providers use the `openai-chat` API for compatibility with existing tools and libraries.
+
 ### Setting up a custom provider
 
-It's possible to configure ECA to be aware of custom LLM providers if they follow a API schema similar to currently supported ones (openai, anthropic), example for a custom hosted litellm server:
+It's possible to configure ECA to be aware of custom LLM providers if they follow a API schema similar to currently supported ones (openai-responses, openai-chat or anthropic), example for a custom hosted litellm server:
 
 Example:
 
@@ -109,7 +125,7 @@ Example:
 {
   "customProviders": {
     "my-company": {
-      "api": "openai-responses",
+      "api": "openai-chat",
       "urlEnv": "MY_COMPANY_API_URL", // or "url"
       "keyEnv": "MY_COMPANY_API_KEY", // or "key"
       "models": ["gpt-5", "deepseek-r1"],
@@ -123,7 +139,7 @@ Example:
 
 | Option | Type | Description | Required |
 |--------|------|-------------|----------|
-| `api` | string | The API schema to use (`"openai-responses"` or `"anthropic"`) | Yes |
+| `api` | string | The API schema to use (`"openai-responses"`, `"openai-chat"`, or `"anthropic"`) | Yes |
 | `urlEnv` | string | Environment variable name containing the API URL | Yes* |
 | `url` | string | Direct API URL (use instead of `urlEnv`) | Yes* |
 | `keyEnv` | string | Environment variable name containing the API key | Yes* |
@@ -166,4 +182,40 @@ _* Either the `url` or `urlEnv` option is required, and either the `key` or `key
 }
 ```
 
-After configuring custom providers, the models will be available as `provider/model` (e.g., `litellm/gpt-5`, `enterprise/claude-3-opus-20240229`).
+### Example: OpenRouter
+
+[OpenRouter](https://openrouter.ai) provides access to many models through a unified API:
+
+```json
+{
+  "customProviders": {
+    "openrouter": {
+      "api": "openai-chat",
+      "url": "https://openrouter.ai/api/v1",
+      "keyEnv": "OPENROUTER_API_KEY",
+      "models": ["anthropic/claude-3.5-sonnet", "openai/gpt-4-turbo", "meta-llama/llama-3.1-405b"],
+      "defaultModel": "anthropic/claude-3.5-sonnet"
+    }
+  }
+}
+```
+
+### Example: DeepSeek
+
+[DeepSeek](https://deepseek.com) offers powerful reasoning and coding models:
+
+```json
+{
+  "customProviders": {
+    "deepseek": {
+      "api": "openai-chat",
+      "url": "https://api.deepseek.com",
+      "keyEnv": "DEEPSEEK_API_KEY",
+      "models": ["deepseek-chat", "deepseek-coder", "deepseek-reasoner"],
+      "defaultModel": "deepseek-chat"
+    }
+  }
+}
+```
+
+After configuring custom providers, the models will be available as `provider/model` (e.g., `openrouter/anthropic/claude-3.5-sonnet`, `deepseek/deepseek-chat`).
````
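Behind an `openai-chat` provider, ECA talks to the standard Chat Completions endpoint. A minimal sketch of what such a request looks like on the wire, using only the Python standard library; the base URL, key, and model below are illustrative, and the helper name is not part of ECA:

```python
import json

def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Build URL, headers, and JSON body for a Chat Completions call."""
    # The provider's configured url is the API base; the chat endpoint
    # is appended, matching the standard OpenAI-compatible path layout.
    url = f"{base_url.rstrip('/')}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return url, headers, body

url, headers, body = build_chat_request(
    "https://openrouter.ai/api/v1", "sk-...", "deepseek/deepseek-chat", "hello")
print(url)  # https://openrouter.ai/api/v1/chat/completions
```

Any provider that accepts this request shape (OpenRouter, Groq, DeepSeek, a local LiteLLM server) can be plugged in via the `openai-chat` api type.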

src/eca/llm_api.clj

Lines changed: 2 additions & 0 deletions

```diff
@@ -6,6 +6,7 @@
    [eca.llm-providers.anthropic :as llm-providers.anthropic]
    [eca.llm-providers.ollama :as llm-providers.ollama]
    [eca.llm-providers.openai :as llm-providers.openai]
+   [eca.llm-providers.openai-chat :as llm-providers.openai-chat]
    [eca.logger :as logger]))
 
 (set! *warn-on-reflection* true)
@@ -182,6 +183,7 @@
         ("openai-responses"
          "openai") llm-providers.openai/completion!
         "anthropic" llm-providers.anthropic/completion!
+        "openai-chat" llm-providers.openai-chat/completion!
         (on-error-wrapper {:message (format "Unknown custom model %s for provider %s" (:api provider-config) provider)}))
       url (or (:url provider-config) (config/get-env (:urlEnv provider-config)))
       key (or (:key provider-config) (config/get-env (:keyEnv provider-config)))
```
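The Clojure change above does two things: it dispatches on the provider's `api` value to pick a completion function (with `"openai"` kept as a legacy alias of `"openai-responses"`), and it resolves `url`/`key` by preferring a direct value over the named environment variable. A Python sketch of that same resolution logic, with illustrative names that are not ECA's actual API:

```python
import os

def resolve_provider(provider_config: dict) -> dict:
    """Mirror the api dispatch and url/key env-var fallback from llm_api.clj."""
    handlers = {
        "openai-responses": "openai/completion!",
        "openai": "openai/completion!",  # legacy alias, still accepted
        "anthropic": "anthropic/completion!",
        "openai-chat": "openai-chat/completion!",
    }
    api = provider_config.get("api")
    if api not in handlers:
        raise ValueError(f"Unknown custom model api {api!r}")
    # Direct value wins; otherwise fall back to the configured env var name.
    url = provider_config.get("url") or os.environ.get(provider_config.get("urlEnv", ""))
    key = provider_config.get("key") or os.environ.get(provider_config.get("keyEnv", ""))
    return {"handler": handlers[api], "url": url, "key": key}

os.environ["MY_COMPANY_API_KEY"] = "secret"
cfg = {"api": "openai-chat", "url": "https://api.deepseek.com",
       "keyEnv": "MY_COMPANY_API_KEY"}
print(resolve_provider(cfg)["handler"])  # openai-chat/completion!
```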
