docs/models.md: 11 additions & 9 deletions (+11 −9)
@@ -44,7 +44,7 @@ To start using ECA, you need to configure at least one model with your API key.
 
 Create a configuration file at `.eca/config.json` in your project root or at `~/.config/eca/config.json` globally:
 
-```json
+```javascript
 {
   "openaiApiKey":"your-openai-api-key-here",
   "anthropicApiKey":"your-anthropic-api-key-here"
@@ -59,7 +59,7 @@ Create a configuration file at `.eca/config.json` in your project root or at `~/
 
 You can add new models or override existing ones in your configuration:
 
-```json
+```javascript
 {
   "openaiApiKey":"your-openai-api-key-here",
   "models": {
@@ -73,7 +73,7 @@ You can add new models or override existing ones in your configuration:
 
 You can customize model parameters like temperature, reasoning effort, etc.:
 
-```json
+```javascript
 {
   "openaiApiKey":"your-openai-api-key-here",
   "models": {
@@ -104,12 +104,14 @@ When configuring custom providers, choose the appropriate API type:
 
 - **`openai-responses`**: OpenAI's new responses API endpoint (`/v1/responses`). Best for OpenAI models with enhanced features like reasoning and web search.
 - **`openai-chat`**: Standard OpenAI Chat Completions API (`/v1/chat/completions`). Use this for most third-party providers:
+
 - OpenRouter
 - DeepSeek
 - Together AI
 - Groq
 - Local LiteLLM servers
 - Any OpenAI-compatible provider
+
 - **`anthropic`**: Anthropic's native API for Claude models.
 
 Most third-party providers use the `openai-chat` API for compatibility with existing tools and libraries.
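To illustrate the choice of API type described above, a custom provider pointed at an OpenAI-compatible service would typically select the chat API. The `api` field name and the entry shape below are assumptions inferred from the surrounding hunks, not confirmed schema:

```javascript
{
  "customProviders": {
    "example-provider": {
      // "openai-chat" suits most third-party OpenAI-compatible services;
      // the "api" field name itself is an assumption.
      "api": "openai-chat",
      "url": "https://api.example-provider.com/v1",
      "key": "your-provider-api-key"
    }
  }
}
```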
@@ -120,8 +122,8 @@ It's possible to configure ECA to be aware of custom LLM providers if they follo
 
 Example:
 
-`~/.config/eca/config.json`
-```json
+`~/.config/eca/config.javascript`
+```javascript
 {
   "customProviders": {
     "my-company": {
@@ -152,7 +154,7 @@ _* Either the `url` or `urlEnv` option is required, and either the `key` or `key
 
 ### Example: Custom LiteLLM server
 
-```json
+```javascript
 {
   "customProviders": {
     "litellm": {
@@ -168,7 +170,7 @@ _* Either the `url` or `urlEnv` option is required, and either the `key` or `key
 
 ### Example: Using environment variables
 
-```json
+```javascript
 {
   "customProviders": {
     "enterprise": {
@@ -186,7 +188,7 @@ _* Either the `url` or `urlEnv` option is required, and either the `key` or `key
 
 [OpenRouter](https://openrouter.ai) provides access to many models through a unified API:
 
-```json
+```javascript
 {
   "customProviders": {
     "openrouter": {
@@ -204,7 +206,7 @@ _* Either the `url` or `urlEnv` option is required, and either the `key` or `key
 
 [DeepSeek](https://deepseek.com) offers powerful reasoning and coding models: