Commit 2e29b3b

Link to AI connectors and LLM performance matrix

1 parent: dae99f4

1 file changed

solutions/observability/observability-ai-assistant.md

Lines changed: 14 additions & 11 deletions
@@ -18,25 +18,28 @@ You can [interact with the AI Assistant](#obs-ai-interact) in two ways:
 * **Contextual insights**: Embedded assistance throughout Elastic UIs that explains errors and messages with suggested remediation steps.
 * **Chat interface**: A conversational experience where you can ask questions and receive answers about your data. The assistant uses function calling to request, analyze, and visualize information based on your needs.
 
-The AI Assistant integrates with your large language model (LLM) provider through our supported {{stack}} connectors:
+The AI Assistant integrates with your large language model (LLM) provider through our supported {{stack}} connectors. Refer to the following for more information:
+
+- [Set up the AI Assistant](#obs-ai-set-up) for more on available AI connectors.
+- [{{obs-ai-assistant}} LLM performance matrix](./llm-performance-matrix.md) for supported third-party LLM providers and their ratings for different use cases.
 
 ## Use cases
 
 The {{obs-ai-assistant}} helps you:
 
-* **Decode error messages**: Interpret stack traces and error logs to pinpoint root causes
-* **Identify performance bottlenecks**: Find resource-intensive operations and slow queries in Elasticsearch
+* **Decode error messages**: Interpret stack traces and error logs to pinpoint root causes.
+* **Identify performance bottlenecks**: Find resource-intensive operations and slow queries in {{es}}
 * **Generate reports**: Create alert summaries and incident timelines with key metrics
-* **Build and execute queries**: Build Elasticsearch queries from natural language, convert Query DSL to ES|QL syntax, and execute queries directly from the chat interface
-* **Visualize data**: Create time-series charts and distribution graphs from your Elasticsearch data
+* **Build and execute queries**: Build {{es}} queries from natural language, convert Query DSL to {{esql}} syntax, and execute queries directly from the chat interface
+* **Visualize data**: Create time-series charts and distribution graphs from your {{es}} data
 
 ## Requirements [obs-ai-requirements]
 
 The AI assistant requires the following:
 
 - An **Elastic deployment**:
 
-    - For **Observability**: {{stack}} version **8.9** or later, or an **{{observability}} serverless project**.
+    - For **{{observability}}**: {{stack}} version **8.9** or later, or an **{{observability}} serverless project**.
 
     - For **Search**: {{stack}} version **8.16.0** or later, or **{{serverless-short}} {{es}} project**.
 
@@ -62,9 +65,9 @@ serverless: ga
 
 The [**GenAI settings**](/explore-analyze/manage-access-to-ai-assistant.md) page allows you to:
 
-- Manage which AI connectors are available in your environment. 
+- Manage which AI connectors are available in your environment.
 - Enable or disable AI Assistant and other AI-powered features in your environment.
-- {applies_to}`stack: ga 9.2` {applies_to}`serverless: unavailable` Specify in which Elastic solutions the `AI Assistant for Observability and Search` and the `AI Assistant for Security` appear.
+- {applies_to}`stack: ga 9.2` {applies_to}`serverless: unavailable` Specify in which Elastic solutions the `AI Assistant for {{observability}} and Search` and the `AI Assistant for Security` appear.
 
 ## Your data and the AI Assistant [data-information]
 
@@ -98,11 +101,11 @@ The AI Assistant connects to one of these supported LLM providers:
 
 **Setup steps**:
 
-1. **Create authentication credentials** with your chosen provider using the links above.
+1. **Create authentication credentials** with your chosen provider using the links in the previous table.
 2. **Create an LLM connector** for your chosen provider by going to the **Connectors** management page in the navigation menu or by using the [global search field](/explore-analyze/find-and-organize/find-apps-and-objects.md).
 3. **Authenticate the connection** by entering:
-    - The provider's API endpoint URL
-    - Your authentication key or secret
+    - The provider's API endpoint URL.
+    - Your authentication key or secret.
 
 ::::{admonition} Recommended models
 While the {{obs-ai-assistant}} is compatible with many different models, refer to the [Large language model performance matrix](/solutions/observability/llm-performance-matrix.md) to select models that perform well with your desired use cases.
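
The revised setup steps describe creating an LLM connector and authenticating it with a provider endpoint URL and key through the **Connectors** UI. The sketch below shows one way the same steps could be scripted against the Kibana Connectors API; it is a minimal illustration, not part of this commit. The deployment URL and keys are placeholders, and the `.gen-ai` connector type with its `apiProvider`/`apiUrl` config and `apiKey` secret is assumed for an OpenAI-compatible provider, so verify the exact fields for your provider and {{stack}} version.

```python
import requests

# Placeholders -- substitute your own deployment details (hypothetical values).
KIBANA_URL = "https://my-deployment.kb.example.com"
ELASTIC_API_KEY = "<elastic-api-key-with-connector-privileges>"

# Create an LLM connector via the Kibana Connectors API. The ".gen-ai" connector
# type id and the config/secrets field names below are assumed for an
# OpenAI-compatible provider; check the connector docs for your version.
response = requests.post(
    f"{KIBANA_URL}/api/actions/connector",
    headers={
        "kbn-xsrf": "true",  # required by Kibana HTTP APIs
        "Authorization": f"ApiKey {ELASTIC_API_KEY}",
        "Content-Type": "application/json",
    },
    json={
        "name": "OpenAI connector",
        "connector_type_id": ".gen-ai",
        "config": {
            "apiProvider": "OpenAI",
            # Step 3: the provider's API endpoint URL.
            "apiUrl": "https://api.openai.com/v1/chat/completions",
        },
        # Step 3: your authentication key or secret, stored as a connector secret.
        "secrets": {"apiKey": "<provider-api-key>"},
    },
    timeout=30,
)
response.raise_for_status()
print("Created connector:", response.json()["id"])
```

A connector created this way shows up on the **Connectors** management page alongside connectors created through the UI.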
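
The updated "Build and execute queries" use case mentions converting Query DSL to {{esql}}. Purely as an illustration of that kind of translation (not the assistant's own output), the sketch below asks the same question both ways with the Python {{es}} client, assuming an 8.x deployment, a recent elasticsearch-py client, and a hypothetical `logs-*` data stream with `log.level` and `service.name` fields.

```python
from elasticsearch import Elasticsearch

# Hypothetical connection details -- replace with your own deployment and API key.
client = Elasticsearch("https://my-deployment.es.example.com:443", api_key="<elastic-api-key>")

# Query DSL version: count error log lines per service.
dsl = client.search(
    index="logs-*",
    size=0,
    query={"term": {"log.level": "error"}},
    aggs={"per_service": {"terms": {"field": "service.name", "size": 10}}},
)
for bucket in dsl["aggregations"]["per_service"]["buckets"]:
    print(bucket["key"], bucket["doc_count"])

# ES|QL version of the same question, expressed as a piped query.
esql = client.esql.query(
    query="""
      FROM logs-*
      | WHERE log.level == "error"
      | STATS errors = COUNT(*) BY service.name
      | SORT errors DESC
      | LIMIT 10
    """
)
print([col["name"] for col in esql["columns"]])  # column order of the result rows
for row in esql["values"]:
    print(row)
```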
