4 changes: 4 additions & 0 deletions solutions/_snippets/elastic-llm.md
@@ -0,0 +1,4 @@
A preconfigured LLM connector is enabled by default and ready to use out of the box.
Using the preconfigured LLM lets you use features such as Playground and AI Assistant without creating an account with an LLM provider or setting up your own LLM connector.
The LLM is hosted as a service and may incur additional costs.
For more details, refer to the [pricing page](https://www.elastic.co/pricing).
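
For illustration, here is a minimal sketch (not part of the snippet itself) that calls Kibana's connectors API to confirm the preconfigured connector is available; the `KIBANA_URL` and `KIBANA_API_KEY` environment variables are assumptions:

```python
# List Kibana connectors and print any that are preconfigured.
# Assumes Kibana is reachable at KIBANA_URL and an API key is set.
import os

import requests

KIBANA_URL = os.environ.get("KIBANA_URL", "http://localhost:5601")
API_KEY = os.environ["KIBANA_API_KEY"]

resp = requests.get(
    f"{KIBANA_URL}/api/actions/connectors",
    headers={"Authorization": f"ApiKey {API_KEY}", "kbn-xsrf": "true"},
)
resp.raise_for_status()

for connector in resp.json():
    # Preconfigured connectors are flagged by Kibana.
    if connector.get("is_preconfigured"):
        print(connector["id"], connector["name"], connector["connector_type_id"])
```
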
13 changes: 11 additions & 2 deletions solutions/observability/observability-ai-assistant.md
@@ -20,7 +20,8 @@ The AI Assistant uses generative AI to provide:
:screenshot:
:::

The AI Assistant integrates with your large language model (LLM) provider through our supported {{stack}} connectors:
By default, the AI Assistant uses a preconfigured large language model (LLM) connector that works out of the box.
It also integrates with your own LLM provider through our supported {{stack}} connectors:

* [OpenAI connector](kibana://reference/connectors-kibana/openai-action-type.md) for OpenAI or Azure OpenAI Service.
* [Amazon Bedrock connector](kibana://reference/connectors-kibana/bedrock-action-type.md) for Amazon Bedrock, specifically for the Claude models.
Expand All @@ -37,15 +38,18 @@ Also, the data you provide to the Observability AI assistant is *not* anonymized

::::

## Preconfigured LLM [preconfigured-llm-ai-assistant]

:::{include} ../_snippets/elastic-llm.md
:::

## Requirements [obs-ai-requirements]

The AI assistant requires the following:

* {{stack}} version 8.9 and later.
* A self-deployed connector service if [search connectors](elasticsearch://reference/search-connectors/self-managed-connectors.md) are used to populate external data into the knowledge base.
* An account with a third-party generative AI provider that preferably supports function calling. If your AI provider does not support function calling, you can configure AI Assistant settings under **Stack Management** to simulate function calling, but this might affect performance.
* If you're not using the preconfigured LLM connector, an account with a third-party generative AI provider, preferably one that supports function calling. If your provider does not support function calling, you can configure AI Assistant settings under **Stack Management** to simulate function calling, but this might affect performance.

Refer to the [connector documentation](../../deploy-manage/manage-connectors.md) for your provider to learn about supported and default models.

@@ -75,6 +79,11 @@ Elastic does not control third-party tools, and assumes no responsibility or liability

## Set up the AI Assistant [obs-ai-set-up]

:::{note}
If you use [the preconfigured LLM connector](#preconfigured-llm-ai-assistant), you can skip the following steps; the connector is ready to use.
:::

To set up the AI Assistant:

1. Create an authentication key with your AI provider to authenticate requests from the AI Assistant. You’ll use this in the next step. Refer to your provider’s documentation for information about creating authentication keys:
16 changes: 9 additions & 7 deletions solutions/search/rag/playground.md
@@ -31,8 +31,6 @@ Watch these video tutorials to help you get started:

::::



## How Playground works [playground-how-it-works]

Here’s a simplified overview of how Playground works:
@@ -61,7 +59,10 @@ Here’s a simplified overview of how Playground works:

* Users can also **Download the code** to integrate into their application
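
As a rough illustration, here is a minimal sketch of the retrieve-then-generate flow that Playground automates (and that the downloaded code implements); the index name, field names, and model are placeholder assumptions:

```python
# Sketch of a RAG round trip: retrieve context from Elasticsearch,
# then ask the LLM to answer using only that context.
from elasticsearch import Elasticsearch
from openai import OpenAI

es = Elasticsearch("http://localhost:9200")
llm = OpenAI()  # reads OPENAI_API_KEY from the environment

question = "How do I configure snapshots?"

# 1. Retrieve candidate passages (Playground can also use semantic search).
hits = es.search(index="docs", query={"match": {"body": question}}, size=3)
context = "\n\n".join(h["_source"]["body"] for h in hits["hits"]["hits"])

# 2. Ground the answer in the retrieved passages.
reply = llm.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": f"Answer using only this context:\n{context}"},
        {"role": "user", "content": question},
    ],
)
print(reply.choices[0].message.content)
```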

## Elastic LLM [preconfigured-llm-playground]

:::{include} ../../_snippets/elastic-llm.md
:::

## Availability and prerequisites [playground-availability-prerequisites]

Expand All @@ -76,7 +77,7 @@ To use Playground, you’ll need the following:

* See [ingest data](playground.md#playground-getting-started-ingest) if you’d like to ingest sample data.

3. An account with a **supported LLM provider**. Playground supports the following:
3. If you're not using the preconfigured LLM connector, an account with a **supported LLM provider**. Playground supports the following:

* **Amazon Bedrock**

Expand All @@ -99,7 +100,6 @@ To use Playground, you’ll need the following:
* Google Gemini 1.5 Pro
* Google Gemini 1.5 Flash


::::{tip}
:name: playground-local-llms

Expand All @@ -110,18 +110,20 @@ You can also use locally hosted LLMs that are compatible with the OpenAI SDK. On

::::
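
For example, here is a hypothetical shape of the OpenAI connector settings pointed at a locally hosted, OpenAI-compatible server; Ollama's default port and the exact field names are assumptions, following the connector's config/secrets split:

```python
# Hypothetical settings for an OpenAI connector that targets a local,
# OpenAI-compatible server instead of OpenAI itself.
local_llm_connector = {
    "name": "local-llm",
    "connector_type_id": ".gen-ai",  # the OpenAI connector type
    "config": {
        "apiProvider": "OpenAI",
        # Route requests to the local server's chat completions endpoint.
        "apiUrl": "http://localhost:11434/v1/chat/completions",
    },
    # Local servers often ignore the key, but the field is required.
    "secrets": {"apiKey": "unused"},
}
```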



## Getting started [playground-getting-started]

:::{image} /solutions/images/kibana-get-started.png
:alt: get started
:screenshot:
:::


### Connect to LLM provider [playground-getting-started-connect]

:::{note}
If you use [the preconfigured LLM connector](#preconfigured-llm-playground), you can skip this step; the connector is ready to use.
:::

To get started with Playground, you need to create a [connector](../../../deploy-manage/manage-connectors.md) for your LLM provider. You can also connect to [locally hosted LLMs](playground.md#playground-local-llms) that are compatible with the OpenAI API by using the OpenAI connector.
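
If you prefer to script this instead of using the UI steps below, a connector can also be created through Kibana's connector API; this is a minimal sketch, assuming a `KIBANA_API_KEY` and reusing the settings shape from the local-LLM tip above:

```python
# Create an OpenAI connector via the Kibana API (a sketch, not the
# documented UI flow). Assumes KIBANA_URL, KIBANA_API_KEY, OPENAI_API_KEY.
import os

import requests

KIBANA_URL = os.environ.get("KIBANA_URL", "http://localhost:5601")
API_KEY = os.environ["KIBANA_API_KEY"]

resp = requests.post(
    f"{KIBANA_URL}/api/actions/connector",
    headers={"Authorization": f"ApiKey {API_KEY}", "kbn-xsrf": "true"},
    json={
        "name": "openai-playground",
        "connector_type_id": ".gen-ai",
        "config": {
            "apiProvider": "OpenAI",
            "apiUrl": "https://api.openai.com/v1/chat/completions",
        },
        "secrets": {"apiKey": os.environ["OPENAI_API_KEY"]},
    },
)
resp.raise_for_status()
print("Created connector:", resp.json()["id"])
```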

To connect to an LLM provider, follow these steps on the Playground landing page: