## Recommended models
{{agent-builder}} requires models with strong reasoning and tool-calling capabilities. State-of-the-art models perform significantly better than smaller or older models.
### Recommended model families
The following models are known to work well with {{agent-builder}}:

**OpenAI**: GPT-4.1, GPT-4o

**Anthropic**: Claude Sonnet 4.5, Claude Sonnet 4, Claude Sonnet 3.7

**Google**: Gemini 2.5 Pro
### Why model quality matters
{{agent-builder}} relies on advanced LLM capabilities, including the following (illustrated in the sketch after this list):

**Function calling**: Models must accurately select appropriate tools and construct valid parameters from natural language requests

**Multi-step reasoning**: Agents need to plan, execute, and adapt based on tool results across multiple iterations

**Structured output**: Models must produce properly formatted responses that the agent framework can parse
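
To make these requirements concrete, here is a minimal, illustrative sketch of an OpenAI-style tool call that an agent-capable model has to produce. The tool name and arguments are hypothetical examples, not {{agent-builder}} APIs.

```python
# Illustrative only: an OpenAI-style assistant message that carries a tool call.
# The tool name and arguments are hypothetical, not Agent Builder APIs.
tool_call_message = {
    "role": "assistant",
    "content": None,
    "tool_calls": [
        {
            "id": "call_001",
            "type": "function",
            "function": {
                # The model must choose an appropriate tool...
                "name": "search_index",
                # ...and emit valid JSON arguments derived from the user's request.
                "arguments": '{"index": "products", "query": "waterproof hiking boots"}',
            },
        }
    ],
}

# Weaker models often emit malformed JSON arguments or skip tool_calls entirely,
# which surfaces as errors like the ones shown below.
```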
Smaller or less capable models may produce errors like:

```console-response
Error: Invalid function call syntax
```

```console-response
Error executing agent: No tool calls found in the response.
```
While any chat-completion-compatible connector can technically be configured, we strongly recommend using state-of-the-art models for reliable agent performance.

:::{note}
GPT-4o-mini and similar smaller models are not recommended for {{agent-builder}} as they lack the necessary capabilities for reliable agent workflows.
:::
# Using different models in {{agent-builder}}
{{agent-builder}} uses large language models (LLMs) to power agent reasoning and decision-making. By default, agents use the Elastic Managed LLM, but you can configure other models through Kibana connectors.
## Default model configuration
By default, {{agent-builder}} uses the Elastic Managed LLM connector running on the [Elastic Inference Service](/explore-analyze/elastic-inference/eis.md) {applies_to}`serverless: preview` {applies_to}`ess: preview 9.2`.
This managed service requires zero setup and no additional API key management.
Learn more about the [Elastic Managed LLM connector](kibana://reference/connectors-kibana/elastic-managed-llm.md) and [pricing](https://www.elastic.co/pricing).
## Change the default model
To use a different model, select a configured connector and set it as the default.
### Use a pre-configured connector
1. Search for **GenAI Settings** in the global search field
2. Select your preferred connector from the **Default AI Connector** dropdown
3. Save your changes
### Create a new connector in the UI
1. Find connectors under **Alerts and Insights / Connectors** in the [global search bar](/explore-analyze/find-and-organize/find-apps-and-objects.md)
2. Select **Create Connector** and choose your model provider
3. Configure the connector with your API credentials and preferred model
4. Search for **GenAI Settings** in the global search field
5. Select your new connector from the **Default AI Connector** dropdown under **Custom connectors**
6. Save your changes
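
If you prefer to script connector creation rather than click through the UI, the sketch below creates an OpenAI connector through the Kibana Connectors API. The Kibana URL, API key, and model are placeholders, and the `.gen-ai` type ID and the `config`/`secrets` field names are assumptions to verify against the Connectors API documentation for your version.

```python
# Sketch: create an OpenAI connector programmatically instead of through the UI.
# Endpoint and field names are assumptions based on the Kibana Connectors API;
# verify them against the documentation for your Kibana version.
import requests

KIBANA_URL = "https://my-kibana.example.com"  # placeholder Kibana endpoint
KIBANA_API_KEY = "..."                        # replace with a real API key

response = requests.post(
    f"{KIBANA_URL}/api/actions/connector",
    headers={
        "Authorization": f"ApiKey {KIBANA_API_KEY}",
        "kbn-xsrf": "true",  # required header for Kibana API writes
        "Content-Type": "application/json",
    },
    json={
        "name": "OpenAI GPT-4.1",
        "connector_type_id": ".gen-ai",  # OpenAI connector type
        "config": {
            "apiProvider": "OpenAI",
            "apiUrl": "https://api.openai.com/v1/chat/completions",
            "defaultModel": "gpt-4.1",
        },
        "secrets": {"apiKey": "sk-..."},  # your model provider API key
    },
    timeout=30,
)
response.raise_for_status()
print(response.json()["id"])  # connector ID you can set as the default
```

Once created, the connector can be selected as the default under **GenAI Settings**, as in steps 4 to 6 above.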
For detailed instructions on creating connectors, refer to [Connectors](https://www.elastic.co/docs/deploy-manage/manage-connectors).
Learn more about [preconfigured connectors](https://www.elastic.co/docs/reference/kibana/connectors-kibana/pre-configured-connectors).
#### Connect a local LLM
You can connect a locally hosted LLM to Elastic using the OpenAI connector. This requires your local LLM to be compatible with the OpenAI API format.
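
As an illustration, a connector for a locally hosted, OpenAI-compatible server might use settings along these lines. The endpoint URL, model name, and field names are assumptions made for this sketch; match them to your local server and the OpenAI connector documentation.

```python
# Hypothetical settings for an OpenAI connector that points at a local,
# OpenAI-compatible server. Adjust the URL and model to your own setup.
local_llm_connector = {
    "name": "Local LLM",
    "connector_type_id": ".gen-ai",  # OpenAI connector type (assumed)
    "config": {
        "apiProvider": "Other",  # the OpenAI-compatible provider option (assumed)
        "apiUrl": "http://localhost:11434/v1/chat/completions",  # hypothetical local endpoint
        "defaultModel": "llama3.1:70b",  # whichever model your server exposes
    },
    # Many local servers ignore the key, but the connector still requires a value.
    "secrets": {"apiKey": "placeholder"},
}
```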
Refer to the [OpenAI connector documentation](kibana://reference/connectors-kibana/openai-action-type.md) for detailed setup instructions.
## Connectors API
For programmatic access to connector management, refer to the [Connectors API documentation]({{kib-serverless-apis}}group/endpoint-connectors).
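
As a small example of what that API enables, the sketch below lists the connectors configured in a deployment so you can find the ID of the model connector you want agents to use. The Kibana URL and API key are placeholders, and the exact response fields may vary by version.

```python
# Sketch: list configured connectors through the Kibana Connectors API.
# URL and API key are placeholders; response field names may vary by version.
import requests

KIBANA_URL = "https://my-kibana.example.com"
KIBANA_API_KEY = "..."

response = requests.get(
    f"{KIBANA_URL}/api/actions/connectors",
    headers={"Authorization": f"ApiKey {KIBANA_API_KEY}", "kbn-xsrf": "true"},
    timeout=30,
)
response.raise_for_status()

for connector in response.json():
    # Print each connector's ID, name, and type, for example ".gen-ai" for OpenAI.
    print(connector["id"], connector["name"], connector["connector_type_id"])
```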