|`api`| string | The API schema to use (`"openai-responses"`, `"openai-chat"`, or `"anthropic"`) | Yes |
|`urlEnv`| string | Environment variable name containing the API URL | Yes* |
|`url`| string | Direct API URL (use instead of `urlEnv`) | Yes* |
|`keyEnv`| string | Environment variable name containing the API key | Yes* |
|`key`| string | Direct API key (use instead of `keyEnv`) | Yes* |
|`models`| map | Map of model names to per-model config | Yes |
|`models <model> extraPayload`| map | Extra payload merged into the request body sent to the LLM | No |

_* Either the `url` or `urlEnv` option is required, and either the `key` or `keyEnv` option is required._

Example:
`~/.config/eca/config.json`
```javascript
{
  "providers": {
    "openai": {
      "key": "your-openai-api-key-here",
      "models": {
        "gpt-5": {
          "extraPayload": {
            "temperature": 0.7,
            "reasoning_effort": "high",
            "max_tokens": 4000
          }
        }
      }
    },
    "my-company": {
      "api": "openai-chat",
      "urlEnv": "MY_COMPANY_API_URL", // or "url"
      "keyEnv": "MY_COMPANY_API_KEY", // or "key"
      "models": {
        "gpt-5": {},
        "deepseek-r1": {}
      }
    }
  }
}
```
This config will be merged with the current defaults used by ECA.
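
Because of this merge, your config only needs the fields you want to change. As a minimal sketch (it assumes `anthropic` is one of the provider entries already defined in ECA's defaults, and the env var name is illustrative):

```javascript
{
  "providers": {
    "anthropic": {
      // override only where the key is read from;
      // api, url and models keep their default values
      "keyEnv": "MY_ANTHROPIC_API_KEY"
    }
  }
}
```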

Beyond the built-in providers, ECA also lets you configure custom LLM providers, like the `my-company` entry above, as long as they follow an API schema similar to OpenAI's or Anthropic's. This is useful when you want to use:

- Self-hosted LLM servers (like LiteLLM)
- Custom company LLM endpoints
- Additional cloud providers not natively supported

### API Types
When configuring custom providers, choose the appropriate API type:

- `openai-responses`: OpenAI's Responses API schema
- `openai-chat`: the OpenAI Chat Completions schema, which most OpenAI-compatible servers implement
- `anthropic`: Anthropic's Messages API schema

Most third-party providers use the `openai-chat` API for compatibility with existing tools and libraries.
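
For example, a self-hosted LiteLLM gateway exposes an OpenAI-compatible chat endpoint, so it can be registered with the `openai-chat` API type. The provider name, URL, key, and model name below are placeholders, not values shipped with ECA:

```javascript
{
  "providers": {
    "litellm": {
      "api": "openai-chat",
      "url": "http://localhost:4000", // or "urlEnv" pointing to an env var
      "key": "local-dev-key",         // or "keyEnv"
      "models": {
        "llama-3.1-70b": {}
      }
    }
  }
}
```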

After configuring custom providers, their models become available as `provider/model` (e.g., `openrouter/anthropic/claude-3.5-sonnet`, `deepseek/deepseek-chat`); the `my-company` example above would expose `my-company/gpt-5` and `my-company/deepseek-r1`.

## Providers
=== "Github Copilot"