docs/models.md: 1 addition & 3 deletions
```diff
@@ -102,18 +102,16 @@ ECA allows you to configure custom LLM providers that follow API schemas similar
 
 When configuring custom providers, choose the appropriate API type:
 
+- **`anthropic`**: Anthropic's native API for Claude models.
 - **`openai-responses`**: OpenAI's new responses API endpoint (`/v1/responses`). Best for OpenAI models with enhanced features like reasoning and web search.
 - **`openai-chat`**: Standard OpenAI Chat Completions API (`/v1/chat/completions`). Use this for most third-party providers:
-
   - OpenRouter
   - DeepSeek
   - Together AI
   - Groq
   - Local LiteLLM servers
   - Any OpenAI-compatible provider
 
-- **`anthropic`**: Anthropic's native API for Claude models.
-
 Most third-party providers use the `openai-chat` API for compatibility with existing tools and libraries.
```
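Concretely, the three API types differ mainly in endpoint path and request shape. The sketch below is illustrative only (it is not ECA's implementation); the paths and required fields follow the public OpenAI and Anthropic HTTP APIs:

```python
def build_request(api_type: str, model: str, prompt: str) -> dict:
    """Return the endpoint path and JSON body for a single user prompt."""
    if api_type == "openai-chat":
        # Chat Completions: a `messages` array posted to /v1/chat/completions.
        return {
            "path": "/v1/chat/completions",
            "body": {
                "model": model,
                "messages": [{"role": "user", "content": prompt}],
            },
        }
    if api_type == "openai-responses":
        # Responses API: takes `input` instead of `messages`, at /v1/responses.
        return {
            "path": "/v1/responses",
            "body": {"model": model, "input": prompt},
        }
    if api_type == "anthropic":
        # Anthropic Messages API: `max_tokens` is a required field.
        return {
            "path": "/v1/messages",
            "body": {
                "model": model,
                "max_tokens": 1024,
                "messages": [{"role": "user", "content": prompt}],
            },
        }
    raise ValueError(f"unknown api type: {api_type}")


req = build_request("openai-chat", "deepseek-chat", "hello")
print(req["path"])  # -> /v1/chat/completions
```

Because the `openai-chat` shape is the one most third-party providers emulate, only the base URL and API key typically change between them.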