To use ECA, you need to configure at least one model with your API key. See the [Models documentation](./models#adding-and-configuring-models) for detailed instructions on:
- Setting up API keys for OpenAI, Anthropic, or Ollama
- Adding and customizing models
- Configuring custom providers

**Quick start**: Create a `.eca/config.json` file in your project root with your API key:
```json
{
  "openaiApiKey": "your-openai-api-key-here",
  "anthropicApiKey": "your-anthropic-api-key-here"
}
```
**Note**: For other providers or custom models, see the [custom providers documentation](./models#setting-up-a-custom-provider).
### 3. Start chatting
Once your model is configured, you can start using ECA's chat interface in your editor to ask questions, review code, and work together on your project.
## How it works
Editors spawn the server via `eca server` and communicate via stdin/stdout, similar to LSPs. Supported editors already download the latest server on start and require no extra configuration.

`docs/models.md`

The models' capabilities and configurations are retrieved from [models.dev](https://models.dev). Just configure the model in your eca `models` config; for more details, check its [configuration](./configuration.md#adding-models).
## Adding and Configuring Models
### Setting up your first model
To start using ECA, you need to configure at least one model with your API key. Here's how to set up a model:
1. **Choose your model**: Pick from [OpenAI](#openai), [Anthropic](#anthropic), or [Ollama](#ollama) models
2. **Set your API key**: Create a configuration file with your credentials
3. **Start using ECA**: The model will be available in your editor
### Setting up API keys
Create a configuration file at `.eca/config.json` in your project root or at `~/.config/eca/config.json` globally:
```json
{
  "openaiApiKey": "your-openai-api-key-here",
  "anthropicApiKey": "your-anthropic-api-key-here"
}
```
**Environment Variables**: You can also set API keys using environment variables:
- `OPENAI_API_KEY` for OpenAI
- `ANTHROPIC_API_KEY` for Anthropic
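
If you rely on these environment variables, the config file only needs to list the models you want. This is a minimal sketch, assuming an exported `OPENAI_API_KEY` is enough for ECA to resolve the key; the model name is just an example:

```json
{
  "models": {
    "gpt-5": {}
  }
}
```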
### Adding new models
You can add new models or override existing ones in your configuration:
```json
{
  "openaiApiKey": "your-openai-api-key-here",
  "models": {
    "gpt-5": {},
    "claude-3-5-sonnet-20241022": {}
  }
}
```
### Customizing model behavior
You can customize model parameters such as temperature, reasoning effort, and max tokens:
```json
{
  "openaiApiKey": "your-openai-api-key-here",
  "models": {
    "gpt-5": {
      "extraPayload": {
        "temperature": 0.7,
        "reasoning_effort": "high",
        "max_tokens": 4000
      }
    }
  }
}
```
## Custom model providers
ECA allows you to configure custom LLM providers that follow API schemas similar to OpenAI or Anthropic. This is useful when you want to use:
- Self-hosted LLM servers (like LiteLLM)
- Custom company LLM endpoints
- Additional cloud providers not natively supported
### Setting up a custom provider
Add a `customProviders` section to your `.eca/config.json` file:
```json
{
  "customProviders": {
    "my-company": {
      "api": "openai",
      "urlEnv": "MY_COMPANY_API_URL",
      "keyEnv": "MY_COMPANY_API_KEY",
      "models": ["gpt-5", "deepseek-r1"],
      "defaultModel": "deepseek-r1"
    }
  }
}
```
120
+
121
+
### Custom provider configuration options
| Option | Type | Description | Required |
|--------|------|-------------|----------|
| `api` | string | The API schema to use (`"openai"` or `"anthropic"`) | Yes |
| `urlEnv` | string | Environment variable name containing the API URL | Yes* |
| `url` | string | Direct API URL (use instead of `urlEnv`) | Yes* |
| `keyEnv` | string | Environment variable name containing the API key | Yes* |
| `key` | string | Direct API key (use instead of `keyEnv`) | Yes* |
| `models` | array | List of available model names | Yes |
| `defaultModel` | string | Default model to use | No |
| `completionUrlRelativePath` | string | Custom endpoint path for completions | No |

_* Either the `url` or `urlEnv` option is required, and either the `key` or `keyEnv` option is required._
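
For illustration, a provider configured with the direct `url`/`key` options and a custom completions path might look like the sketch below. The provider name, endpoint, key, and path are placeholders (for example, a local LiteLLM proxy), not values shipped with ECA:

```json
{
  "customProviders": {
    "local-litellm": {
      "api": "openai",
      "url": "http://localhost:4000",
      "key": "placeholder-api-key",
      "completionUrlRelativePath": "/v1/chat/completions",
      "models": ["gpt-5", "deepseek-r1"],
      "defaultModel": "gpt-5"
    }
  }
}
```

In general, prefer `urlEnv` and `keyEnv` when you don't want credentials written into a file that may be committed to version control.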