docs/models.md (56 additions, 4 deletions)
ECA allows you to configure custom LLM providers that follow API schemas similar to the supported ones, for example:

- Custom company LLM endpoints
- Additional cloud providers not natively supported

### API Types for Custom Providers

When configuring custom providers, choose the appropriate API type:

- **`openai-responses`**: OpenAI's Responses API endpoint (`/v1/responses`). Best for OpenAI models with enhanced features like reasoning and web search.
- **`openai-chat`**: Standard OpenAI Chat Completions API (`/v1/chat/completions`). Use this for most third-party providers:
  - OpenRouter
  - DeepSeek
  - Together AI
  - Groq
  - Local LiteLLM servers
  - Any OpenAI-compatible provider
- **`anthropic`**: Anthropic's native API for Claude models.

Most third-party providers use the `openai-chat` API for compatibility with existing tools and libraries.
### Setting up a custom provider

It's possible to configure ECA to be aware of custom LLM providers if they follow an API schema similar to the currently supported ones (`openai-responses`, `openai-chat`, or `anthropic`), for example a custom hosted LiteLLM server.

Example:

```json
{
  "customProviders": {
    "my-company": {
      "api": "openai-chat",
      "urlEnv": "MY_COMPANY_API_URL", // or "url"
      "keyEnv": "MY_COMPANY_API_KEY", // or "key"
      "models": ["gpt-5", "deepseek-r1"],
      ...
```

| Option | Type | Description | Required |
|--------|------|-------------|----------|
| `api` | string | The API schema to use (`"openai-responses"`, `"openai-chat"`, or `"anthropic"`) | Yes |
| `urlEnv` | string | Environment variable name containing the API URL | Yes* |
| `url` | string | Direct API URL (use instead of `urlEnv`) | Yes* |
| `keyEnv` | string | Environment variable name containing the API key | Yes* |

_* Either the `url` or `urlEnv` option is required, and either the `key` or `keyEnv` option is required._
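Since `urlEnv` and `keyEnv` name environment variables rather than hold values, those variables must be set in the environment that launches ECA. A minimal sketch, assuming the variable names from the example above; the URL and key values here are placeholders, not real endpoints:

```shell
# Export the variables referenced by urlEnv/keyEnv before starting ECA.
# Both values below are illustrative placeholders.
export MY_COMPANY_API_URL="https://llm.internal.example.com/v1"
export MY_COMPANY_API_KEY="example-key"

# Confirm the variables are visible to child processes (e.g. the editor running ECA).
printenv MY_COMPANY_API_URL
```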

### Example: OpenRouter

[OpenRouter](https://openrouter.ai) provides access to many models through a unified API:
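The concrete configuration was truncated here; the following is a minimal sketch that follows the `customProviders` schema above, assuming OpenRouter's standard base URL (`https://openrouter.ai/api/v1`) and an `OPENROUTER_API_KEY` environment variable — adjust both to your setup:

```json
{
  "customProviders": {
    "openrouter": {
      "api": "openai-chat",
      "url": "https://openrouter.ai/api/v1",
      "keyEnv": "OPENROUTER_API_KEY",
      "models": ["anthropic/claude-3.5-sonnet", "deepseek/deepseek-chat"]
    }
  }
}
```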
After configuring custom providers, the models will be available as `provider/model` (e.g., `openrouter/anthropic/claude-3.5-sonnet`, `deepseek/deepseek-chat`).