Commit 3c833c3

Merge pull request #48 from Naomarik/master
Update docs for adding models, bb doc task
2 parents 9a4884b + cc83cac commit 3c833c3

File tree

6 files changed (+172, -4 lines)

.github/workflows/docs.yml

Lines changed: 1 addition & 1 deletion

@@ -18,7 +18,7 @@ jobs:
         run: |
           cp -rf CHANGELOG.md README.md images docs
           docker login docker.pkg.github.com --username $GITHUB_ACTOR --password ${{ secrets.GITHUB_TOKEN }}
-          docker run --rm -v ${PWD}:/docs docker.pkg.github.com/clojure-lsp/docs-image/docs-image -- build
+          docker run --rm -v ${PWD}:/docs ghcr.io/editor-code-assistant/docs-image/docs-image -- build
       - name: Deploy
         uses: peaceiris/actions-gh-pages@v3
         with:
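A quick way to sanity-check the registry move is to pull the docs image from its new GHCR location. This is a sketch, assuming the package is public or you have already run `docker login ghcr.io`:

```bash
# Sketch: verify the docs image resolves at its new GHCR path (defaults to the :latest tag)
docker pull ghcr.io/editor-code-assistant/docs-image/docs-image
```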

.gitignore

Lines changed: 4 additions & 0 deletions

@@ -29,3 +29,7 @@ result
 
 # emacs
 *~
+
+# docs symlinks
+docs/README.md
+docs/CHANGELOG.md

README.md

Lines changed: 25 additions & 0 deletions

@@ -48,13 +48,38 @@ With the LLMs models race, the differences between them tend to be irrelevant in
 
 ## Getting started
 
+### 1. Install the editor plugin
+
 Install the plugin for your editor and ECA server will be downloaded and started automatically:
 
 - [Emacs](https://github.com/editor-code-assistant/eca-emacs)
 - [VsCode](https://github.com/editor-code-assistant/eca-vscode)
 - [Vim](https://github.com/editor-code-assistant/eca-nvim)
 - Intellij: Planned, help welcome
 
+### 2. Set up your first model
+
+To use ECA, you need to configure at least one model with your API key. See the [Models documentation](./models#adding-and-configuring-models) for detailed instructions on:
+
+- Setting up API keys for OpenAI, Anthropic, or Ollama
+- Adding and customizing models
+- Configuring custom providers
+
+**Quick start**: Create a `.eca/config.json` file in your project root with your API key:
+
+```json
+{
+  "openaiApiKey": "your-openai-api-key-here",
+  "anthropicApiKey": "your-anthropic-api-key-here"
+}
+```
+
+**Note**: For other providers or custom models, see the [custom providers documentation](./models#setting-up-a-custom-provider).
+
+### 3. Start chatting
+
+Once your model is configured, you can start using ECA's chat interface in your editor to ask questions, review code, and work together on your project.
+
 ## How it works
 
 Editors spawn the server via `eca server` and communicate via stdin/stdout, similar to LSPs. Supported editors already download latest server on start and require no extra configuration.
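For the quick-start step added above, a minimal way to create that file from a shell might look like the sketch below; the key name comes from the README snippet and the value is a placeholder you replace with your own key:

```bash
# Sketch: create a project-local ECA config with an OpenAI key (placeholder value)
mkdir -p .eca
cat > .eca/config.json <<'EOF'
{
  "openaiApiKey": "your-openai-api-key-here"
}
EOF
```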

bb.edn

Lines changed: 3 additions & 1 deletion

@@ -16,4 +16,6 @@
 
            tag make/tag
            get-last-changelog-entry make/get-last-changelog-entry
-           integration-test make/integration-test}}
+           integration-test make/integration-test
+
+           docs make/local-webpage}}
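With the task registered, it should be invokable by name through babashka. A minimal sketch, assuming babashka is installed and you run it from the repository root:

```bash
# Sketch: run the new docs task; per scripts/make.clj below it serves the site locally on port 8000
bb docs
```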

docs/models.md

Lines changed: 131 additions & 2 deletions

@@ -34,6 +34,135 @@ The models capabilities and configurations are retrieved from [models.dev](https
 
 Just configure the model in your eca `models` config, for more details check its [configuration](./configuration.md#adding-models).
 
-## Custom providers
+## Adding and Configuring Models
 
-ECA support configure extra LLM providers via `customProviders` config, for more details check [configuration](./configuration.md#custom-llm-providers).
+### Setting up your first model
+
+To start using ECA, you need to configure at least one model with your API key. Here's how to set up a model:
+
+1. **Choose your model**: Pick from [OpenAI](#openai), [Anthropic](#anthropic), or [Ollama](#ollama) models
+2. **Set your API key**: Create a configuration file with your credentials
+3. **Start using ECA**: The model will be available in your editor
+
+### Setting up API keys
+
+Create a configuration file at `.eca/config.json` in your project root or at `~/.config/eca/config.json` globally:
+
+```json
+{
+  "openaiApiKey": "your-openai-api-key-here",
+  "anthropicApiKey": "your-anthropic-api-key-here"
+}
+```
+
+**Environment Variables**: You can also set API keys using environment variables:
+- `OPENAI_API_KEY` for OpenAI
+- `ANTHROPIC_API_KEY` for Anthropic
+
+### Adding new models
+
+You can add new models or override existing ones in your configuration:
+
+```json
+{
+  "openaiApiKey": "your-openai-api-key-here",
+  "models": {
+    "gpt-5": {},
+    "claude-3-5-sonnet-20241022": {}
+  }
+}
+```
+
+### Customizing model behavior
+
+You can customize model parameters like temperature, reasoning effort, etc.:
+
+```json
+{
+  "openaiApiKey": "your-openai-api-key-here",
+  "models": {
+    "gpt-5": {
+      "extraPayload": {
+        "temperature": 0.7,
+        "reasoning_effort": "high",
+        "max_tokens": 4000
+      }
+    }
+  }
+}
+```
+
+## Custom model providers
+
+ECA allows you to configure custom LLM providers that follow API schemas similar to OpenAI or Anthropic. This is useful when you want to use:
+
+- Self-hosted LLM servers (like LiteLLM)
+- Custom company LLM endpoints
+- Additional cloud providers not natively supported
+
+### Setting up a custom provider
+
+Add a `customProviders` section to your `.eca/config.json` file:
+
+```json
+{
+  "customProviders": {
+    "my-company": {
+      "api": "openai",
+      "urlEnv": "MY_COMPANY_API_URL",
+      "keyEnv": "MY_COMPANY_API_KEY",
+      "models": ["gpt-5", "deepseek-r1"],
+      "defaultModel": "deepseek-r1"
+    }
+  }
+}
+```
+
+### Custom provider configuration options
+
+| Option | Type | Description | Required |
+|--------|------|-------------|----------|
+| `api` | string | The API schema to use (`"openai"` or `"anthropic"`) | Yes |
+| `urlEnv` | string | Environment variable name containing the API URL | Yes* |
+| `url` | string | Direct API URL (use instead of `urlEnv`) | Yes* |
+| `keyEnv` | string | Environment variable name containing the API key | Yes* |
+| `key` | string | Direct API key (use instead of `keyEnv`) | Yes* |
+| `models` | array | List of available model names | Yes |
+| `defaultModel` | string | Default model to use | No |
+| `completionUrlRelativePath` | string | Custom endpoint path for completions | No |
+
+_* Either the `url` or `urlEnv` option is required, and either the `key` or `keyEnv` option is required._
+
+### Example: Custom LiteLLM server
+
+```json
+{
+  "customProviders": {
+    "litellm": {
+      "api": "openai",
+      "url": "https://litellm.my-company.com",
+      "key": "your-api-key",
+      "models": ["gpt-5", "claude-3-sonnet-20240229", "llama-3-70b"],
+      "defaultModel": "gpt-5"
+    }
+  }
+}
+```
+
+### Example: Using environment variables
+
+```json
+{
+  "customProviders": {
+    "enterprise": {
+      "api": "anthropic",
+      "urlEnv": "ENTERPRISE_LLM_URL",
+      "keyEnv": "ENTERPRISE_LLM_KEY",
+      "models": ["claude-3-opus-20240229", "claude-3-sonnet-20240229"],
+      "defaultModel": "claude-3-sonnet-20240229"
+    }
+  }
+}
+```
+
+After configuring custom providers, the models will be available as `provider/model` (e.g., `litellm/gpt-5`, `enterprise/claude-3-opus-20240229`).
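Since the added docs reference API keys and custom provider URLs through environment variables, one plausible setup step is exporting those variables in the shell that launches your editor or the ECA server. A sketch, using the variable names from the examples above with placeholder values:

```bash
# Sketch: export the env vars referenced by the config examples above (placeholder values)
export OPENAI_API_KEY="your-openai-api-key-here"
export ENTERPRISE_LLM_URL="https://llm.example.internal"
export ENTERPRISE_LLM_KEY="your-enterprise-api-key-here"
```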

scripts/make.clj

Lines changed: 8 additions & 0 deletions

@@ -106,3 +106,11 @@
     1 (entrypoint/run-all (str (first eca-bins-found)))
     (throw (ex-info "More than one eca executables found. Can only work with one."
                     {:bin-found eca-bins-found})))))
+
+(defn local-webpage []
+  (let [files ["CHANGELOG.md" "README.md"]]
+    (doseq [f files]
+      (fs/copy f "docs" {:replace-existing true}))
+    (fs/copy-tree "images" "docs" {:replace-existing true})
+    (p/shell "docker login docker.pkg.github.com")
+    (p/shell (str "docker run --rm -it -p 8000:8000 -v " (fs/cwd) ":/docs ghcr.io/editor-code-assistant/docs-image/docs-image"))))
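Read as shell, the new `local-webpage` function does roughly the following. This is a sketch of the equivalent commands, mirroring the docs workflow above; it assumes Docker is available and you can authenticate to the registry:

```bash
# Roughly what `bb docs` performs, expressed as shell commands (sketch)
cp -rf CHANGELOG.md README.md images docs
docker login docker.pkg.github.com
docker run --rm -it -p 8000:8000 -v "$PWD":/docs ghcr.io/editor-code-assistant/docs-image/docs-image
```

The `-p 8000:8000` mapping suggests the preview is served at http://localhost:8000.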
