From 2e2f8156aae86209ff749b6ab4a4ac27b908e2a6 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Istv=C3=A1n=20Zolt=C3=A1n=20Szab=C3=B3?= Date: Thu, 27 Mar 2025 14:54:58 +0100 Subject: [PATCH 1/6] [Solutions] Adds info about Elastic LLM. --- solutions/search/rag/playground.md | 18 +++++++++++------- 1 file changed, 11 insertions(+), 7 deletions(-) diff --git a/solutions/search/rag/playground.md b/solutions/search/rag/playground.md index 3f6bf77a83..67ac793ff2 100644 --- a/solutions/search/rag/playground.md +++ b/solutions/search/rag/playground.md @@ -31,8 +31,6 @@ Watch these video tutorials to help you get started: :::: - - ## How Playground works [playground-how-it-works] Here’s a simplified overview of how Playground works: @@ -61,7 +59,12 @@ Here’s a simplified overview of how Playground works: * User can also **Download the code** to integrate into application +## Elastic LLM [elastic-llm] +Playground includes a preconfigured LLM connector that you can use out of the box. +Using the Elastic LLM enables you to use Playground without having an account with an LLM provider or setting up an LLM connector. +Elastic LLM is hosted as a service and may incur additional costs. +For more details, refer to the [pricing page](https://www.elastic.co/pricing). ## Availability and prerequisites [playground-availability-prerequisites] @@ -76,7 +79,7 @@ To use Playground, you’ll need the following: * See [ingest data](playground.md#playground-getting-started-ingest) if you’d like to ingest sample data. -3. An account with a **supported LLM provider**. Playground supports the following: +3. An account with a **supported LLM provider** or you can use the preconfigured Elastic LLM. Playground supports the following: * **Amazon Bedrock** @@ -99,7 +102,6 @@ To use Playground, you’ll need the following: * Google Gemini 1.5 Pro * Google Gemini 1.5 Flash - ::::{tip} :name: playground-local-llms @@ -110,8 +112,6 @@ You can also use locally hosted LLMs that are compatible with the OpenAI SDK.
On :::: - - ## Getting started [playground-getting-started] :::{image} /solutions/images/kibana-get-started.png @@ -119,9 +119,13 @@ You can also use locally hosted LLMs that are compatible with the OpenAI SDK. On :screenshot: ::: - ### Connect to LLM provider [playground-getting-started-connect] +:::{note} +If you use the preconfigured [Elastic LLM](#elastic-llm) connector, you can skip this step. Your LLM connector is ready to use. + +::: + To get started with Playground, you need to create a [connector](../../../deploy-manage/manage-connectors.md) for your LLM provider. You can also connect to [locally hosted LLMs](playground.md#playground-local-llms) which are compatible with the OpenAI API, by using the OpenAI connector. To connect to an LLM provider, follow these steps on the Playground landing page: From 6a61bab9f62b8b3d89734ff7be1aa7256bf65a26 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Istv=C3=A1n=20Zolt=C3=A1n=20Szab=C3=B3?= Date: Thu, 27 Mar 2025 15:46:04 +0100 Subject: [PATCH 2/6] More edits. --- solutions/_snippets/elastic-llm.md | 4 ++++ .../observability/observability-ai-assistant.md | 13 +++++++++++-- solutions/search/rag/playground.md | 10 ++++------ 3 files changed, 19 insertions(+), 8 deletions(-) create mode 100644 solutions/_snippets/elastic-llm.md diff --git a/solutions/_snippets/elastic-llm.md b/solutions/_snippets/elastic-llm.md new file mode 100644 index 0000000000..a92b5fd2bd --- /dev/null +++ b/solutions/_snippets/elastic-llm.md @@ -0,0 +1,4 @@ +Elastic LLM is a preconfigured LLM connector that you can use out of the box. +Using Elastic LLM enables you to use features such as Playground and AI Assistant without having an account with an LLM provider or setting up an LLM connector. +Elastic LLM is hosted as a service and may incur additional costs. +For more details, refer to the [pricing page](https://www.elastic.co/pricing).
diff --git a/solutions/observability/observability-ai-assistant.md b/solutions/observability/observability-ai-assistant.md index ae24157956..c42666edf0 100644 --- a/solutions/observability/observability-ai-assistant.md +++ b/solutions/observability/observability-ai-assistant.md @@ -20,7 +20,8 @@ The AI Assistant uses generative AI to provide: :screenshot: ::: -The AI Assistant integrates with your large language model (LLM) provider through our supported {{stack}} connectors: +The AI Assistant includes Elastic LLM, a preconfigured large language model (LLM) connector. +It also integrates with your LLM provider through our supported {{stack}} connectors: * [OpenAI connector](kibana://reference/connectors-kibana/openai-action-type.md) for OpenAI or Azure OpenAI Service. * [Amazon Bedrock connector](kibana://reference/connectors-kibana/bedrock-action-type.md) for Amazon Bedrock, specifically for the Claude models. @@ -37,7 +38,10 @@ Also, the data you provide to the Observability AI assistant is *not* anonymized :::: +## Elastic LLM [elastic-llm-ai-assistant] +:::{include} ../_snippets/elastic-llm.md +::: ## Requirements [obs-ai-requirements] @@ -45,7 +49,7 @@ The AI assistant requires the following: * {{stack}} version 8.9 and later. * A self-deployed connector service if [search connectors](elasticsearch://reference/search-connectors/self-managed-connectors.md) are used to populate external data into the knowledge base. -* An account with a third-party generative AI provider that preferably supports function calling. If your AI provider does not support function calling, you can configure AI Assistant settings under **Stack Management** to simulate function calling, but this might affect performance. +* An account with a third-party generative AI provider that preferably supports function calling.
If your AI provider does not support function calling, you can configure AI Assistant settings under **Stack Management** to simulate function calling, but this might affect performance. If you use the preconfigured Elastic LLM connector, you don't need an account with a third-party genAI provider. Refer to the [connector documentation](../../deploy-manage/manage-connectors.md) for your provider to learn about supported and default models. @@ -75,6 +79,11 @@ Elastic does not control third-party tools, and assumes no responsibility or lia ## Set up the AI Assistant [obs-ai-set-up] +:::{note} +If you use the preconfigured [Elastic LLM](#elastic-llm-ai-assistant) connector, you can skip this step. Your LLM connector is ready to use. + +::: + To set up the AI Assistant: 1. Create an authentication key with your AI provider to authenticate requests from the AI Assistant. You’ll use this in the next step. Refer to your provider’s documentation for information about creating authentication keys: diff --git a/solutions/search/rag/playground.md b/solutions/search/rag/playground.md index 67ac793ff2..aa6774b082 100644 --- a/solutions/search/rag/playground.md +++ b/solutions/search/rag/playground.md @@ -59,12 +59,10 @@ Here’s a simplified overview of how Playground works: * User can also **Download the code** to integrate into application -## Elastic LLM [elastic-llm] +## Elastic LLM [elastic-llm-playground] -Playground includes a preconfigured LLM connector that you can use out of the box. -Using the Elastic LLM enables you to use Playground without having an account with an LLM provider or setting up an LLM connector. -Elastic LLM is hosted as a service and may incur additional costs. -For more details, refer to the [pricing page](https://www.elastic.co/pricing).
+:::{include} ../../_snippets/elastic-llm.md +::: ## Availability and prerequisites [playground-availability-prerequisites] @@ -122,7 +120,7 @@ You can also use locally hosted LLMs that are compatible with the OpenAI SDK. On ### Connect to LLM provider [playground-getting-started-connect] :::{note} -If you use the preconfigured [Elastic LLM](#elastic-llm) connector, you can skip this step. Your LLM connector is ready to use. +If you use the preconfigured [Elastic LLM](#elastic-llm-playground) connector, you can skip this step. Your LLM connector is ready to use. ::: From a7299b79b2791573994a069d9ae2a769f39386a1 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Istv=C3=A1n=20Zolt=C3=A1n=20Szab=C3=B3?= Date: Thu, 27 Mar 2025 16:32:43 +0100 Subject: [PATCH 3/6] Apply suggestions from code review Co-authored-by: Liam Thompson <32779855+leemthompo@users.noreply.github.com> --- solutions/observability/observability-ai-assistant.md | 4 ++-- solutions/search/rag/playground.md | 2 +- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/solutions/observability/observability-ai-assistant.md b/solutions/observability/observability-ai-assistant.md index c42666edf0..0d5a5a57c6 100644 --- a/solutions/observability/observability-ai-assistant.md +++ b/solutions/observability/observability-ai-assistant.md @@ -20,7 +20,7 @@ The AI Assistant uses generative AI to provide: :screenshot: ::: -The AI Assistant includes Elastic LLM, a preconfigured large language model (LLM) connector. +By default, AI Assistant uses Elastic LLM, a preconfigured large language model (LLM) connector that works out of the box. It also integrates with your LLM provider through our supported {{stack}} connectors: * [OpenAI connector](kibana://reference/connectors-kibana/openai-action-type.md) for OpenAI or Azure OpenAI Service. * [Amazon Bedrock connector](kibana://reference/connectors-kibana/bedrock-action-type.md) for Amazon Bedrock, specifically for the Claude models. @@ -49,7 +49,7 @@ The AI assistant requires the following: * {{stack}} version 8.9 and later.
* A self-deployed connector service if [search connectors](elasticsearch://reference/search-connectors/self-managed-connectors.md) are used to populate external data into the knowledge base. -* An account with a third-party generative AI provider that preferably supports function calling. If your AI provider does not support function calling, you can configure AI Assistant settings under **Stack Management** to simulate function calling, but this might affect performance. If you use the preconfigured Elastic LLM connector, you don't need an account with a third-party genAI provider. +* If not using the default Elastic LLM connector, an account with a third-party generative AI provider that preferably supports function calling. If your provider does not support function calling, you can configure AI Assistant settings under **Stack Management** to simulate function calling, but this might affect performance. Refer to the [connector documentation](../../deploy-manage/manage-connectors.md) for your provider to learn about supported and default models. diff --git a/solutions/search/rag/playground.md b/solutions/search/rag/playground.md index aa6774b082..de7738433c 100644 --- a/solutions/search/rag/playground.md +++ b/solutions/search/rag/playground.md @@ -77,7 +77,7 @@ To use Playground, you’ll need the following: * See [ingest data](playground.md#playground-getting-started-ingest) if you’d like to ingest sample data. -3. An account with a **supported LLM provider** or you can use the preconfigured Elastic LLM. Playground supports the following: +3. If not using the default Elastic LLM connector, an account with a supported LLM provider: * **Amazon Bedrock** From 37766edaccf9b373b3604878c220ac113ce0fee1 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Istv=C3=A1n=20Zolt=C3=A1n=20Szab=C3=B3?= Date: Thu, 27 Mar 2025 17:14:18 +0100 Subject: [PATCH 4/6] Refines wording to make it clear that it's enabled by default.
--- solutions/_snippets/elastic-llm.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/solutions/_snippets/elastic-llm.md b/solutions/_snippets/elastic-llm.md index a92b5fd2bd..833c99c1a4 100644 --- a/solutions/_snippets/elastic-llm.md +++ b/solutions/_snippets/elastic-llm.md @@ -1,4 +1,4 @@ -Elastic LLM is a preconfigured LLM connector that you can use out of the box. +Elastic LLM is a preconfigured LLM connector, enabled by default and ready to use out of the box. Using Elastic LLM enables you to use features such as Playground and AI Assistant without having an account with an LLM provider or setting up an LLM connector. Elastic LLM is hosted as a service and may incur additional costs. For more details, refer to the [pricing page](https://www.elastic.co/pricing). From 2f3cd0c757cd50af33a39c072ca6d831ddc7d6a1 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Istv=C3=A1n=20Zolt=C3=A1n=20Szab=C3=B3?= Date: Tue, 1 Apr 2025 17:08:17 +0200 Subject: [PATCH 5/6] Addresses feedback. --- solutions/_snippets/elastic-llm.md | 6 +++--- solutions/observability/observability-ai-assistant.md | 8 ++++---- solutions/search/rag/playground.md | 6 +++--- 3 files changed, 10 insertions(+), 10 deletions(-) diff --git a/solutions/_snippets/elastic-llm.md b/solutions/_snippets/elastic-llm.md index 833c99c1a4..c9b85cc97a 100644 --- a/solutions/_snippets/elastic-llm.md +++ b/solutions/_snippets/elastic-llm.md @@ -1,4 +1,4 @@ -Elastic LLM is a preconfigured LLM connector, enabled by default and ready to use out of the box. -Using Elastic LLM enables you to use features such as Playground and AI Assistant without having an account with an LLM provider or setting up an LLM connector. -Elastic LLM is hosted as a service and may incur additional costs. +An LLM is preconfigured as a connector, enabled by default and ready to use out of the box.
+Using the preconfigured LLM enables you to use features such as Playground and AI Assistant without having an account with an LLM provider or setting up an LLM connector. +The LLM is hosted as a service and may incur additional costs. For more details, refer to the [pricing page](https://www.elastic.co/pricing). diff --git a/solutions/observability/observability-ai-assistant.md b/solutions/observability/observability-ai-assistant.md index 0d5a5a57c6..cebb245c70 100644 --- a/solutions/observability/observability-ai-assistant.md +++ b/solutions/observability/observability-ai-assistant.md @@ -20,7 +20,7 @@ The AI Assistant uses generative AI to provide: :screenshot: ::: -By default, AI Assistant uses Elastic LLM, a preconfigured large language model (LLM) connector that works out of the box. +By default, AI Assistant uses a preconfigured large language model (LLM) connector that works out of the box. It also integrates with your LLM provider through our supported {{stack}} connectors: * [OpenAI connector](kibana://reference/connectors-kibana/openai-action-type.md) for OpenAI or Azure OpenAI Service. * [Amazon Bedrock connector](kibana://reference/connectors-kibana/bedrock-action-type.md) for Amazon Bedrock, specifically for the Claude models. @@ -38,7 +38,7 @@ Also, the data you provide to the Observability AI assistant is *not* anonymized :::: -## Elastic LLM [elastic-llm-ai-assistant] +## Preconfigured LLM [preconfigured-llm-ai-assistant] :::{include} ../_snippets/elastic-llm.md ::: @@ -49,7 +49,7 @@ The AI assistant requires the following: * {{stack}} version 8.9 and later. * A self-deployed connector service if [search connectors](elasticsearch://reference/search-connectors/self-managed-connectors.md) are used to populate external data into the knowledge base. -* If not using the default Elastic LLM connector, an account with a third-party generative AI provider that preferably supports function calling. If your provider does not support function calling, you can configure AI Assistant settings under **Stack Management** to simulate function calling, but this might affect performance.
+* If not using the preconfigured default LLM connector, you need an account with a third-party generative AI provider that preferably supports function calling. If your provider does not support function calling, you can configure AI Assistant settings under **Stack Management** to simulate function calling, but this might affect performance. Refer to the [connector documentation](../../deploy-manage/manage-connectors.md) for your provider to learn about supported and default models. @@ -80,7 +80,7 @@ Elastic does not control third-party tools, and assumes no responsibility or lia ## Set up the AI Assistant [obs-ai-set-up] :::{note} -If you use the preconfigured [Elastic LLM](#elastic-llm-ai-assistant) connector, you can skip this step. Your LLM connector is ready to use. +If you use [the preconfigured LLM](#preconfigured-llm-ai-assistant) connector, you can skip this step. Your LLM connector is ready to use. ::: diff --git a/solutions/search/rag/playground.md b/solutions/search/rag/playground.md index de7738433c..f7247ddb33 100644 --- a/solutions/search/rag/playground.md +++ b/solutions/search/rag/playground.md @@ -59,7 +59,7 @@ Here’s a simplified overview of how Playground works: * User can also **Download the code** to integrate into application -## Elastic LLM [elastic-llm-playground] +## Elastic LLM [preconfigured-llm-playground] :::{include} ../../_snippets/elastic-llm.md ::: @@ -77,7 +77,7 @@ To use Playground, you’ll need the following: * See [ingest data](playground.md#playground-getting-started-ingest) if you’d like to ingest sample data. -3. If not using the default Elastic LLM connector, an account with a supported LLM provider: +3. If not using the default preconfigured LLM connector, you will need an account with a supported LLM provider: * **Amazon Bedrock** @@ -120,7 +120,7 @@ You can also use locally hosted LLMs that are compatible with the OpenAI SDK.
On ### Connect to LLM provider [playground-getting-started-connect] :::{note} -If you use the preconfigured [Elastic LLM](#elastic-llm-playground) connector, you can skip this step. Your LLM connector is ready to use. +If you use [the preconfigured LLM](#preconfigured-llm-playground) connector, you can skip this step. Your LLM connector is ready to use. ::: From 70da9223d13f2d10d3ed6a6870695ead007badf0 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Istv=C3=A1n=20Zolt=C3=A1n=20Szab=C3=B3?= Date: Tue, 1 Apr 2025 17:19:11 +0200 Subject: [PATCH 6/6] Update solutions/_snippets/elastic-llm.md --- solutions/_snippets/elastic-llm.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/solutions/_snippets/elastic-llm.md b/solutions/_snippets/elastic-llm.md index c9b85cc97a..6d1f1dccc6 100644 --- a/solutions/_snippets/elastic-llm.md +++ b/solutions/_snippets/elastic-llm.md @@ -1,4 +1,4 @@ An LLM is preconfigured as a connector, enabled by default and ready to use out of the box. Using the preconfigured LLM enables you to use features such as Playground and AI Assistant without having an account with an LLM provider or setting up an LLM connector. -The LLM is hosted as a service and may incur additional costs. +The LLM is hosted as a service and will incur additional costs. For more details, refer to the [pricing page](https://www.elastic.co/pricing).
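The patch series above repeatedly distinguishes between the preconfigured LLM connector and one you create yourself against a third-party provider. For readers who want to script that manual path, the sketch below builds a request against Kibana's connector-creation endpoint. The Kibana URL, API key, and connector name are placeholders, and the endpoint path, the `.gen-ai` connector type ID, and the `config`/`secrets` field shape reflect Kibana's OpenAI connector API rather than anything in these patches, so treat them as assumptions and confirm them against the connector documentation for your Kibana version.

```python
import json
import urllib.request

# All values below are illustrative placeholders, not values from this patch series.
KIBANA_URL = "http://localhost:5601"  # assumed local Kibana instance
OPENAI_API_KEY = "YOUR-API-KEY"       # placeholder; never hardcode real keys

# Request body for Kibana's "create connector" API. The ".gen-ai" type ID and
# the config/secrets field names follow the OpenAI connector's documented
# shape, but verify them against your Kibana version before relying on them.
payload = {
    "name": "openai-playground-connector",
    "connector_type_id": ".gen-ai",
    "config": {
        "apiProvider": "OpenAI",
        "apiUrl": "https://api.openai.com/v1/chat/completions",
    },
    "secrets": {"apiKey": OPENAI_API_KEY},
}

# Kibana requires the kbn-xsrf header on write requests.
request = urllib.request.Request(
    f"{KIBANA_URL}/api/actions/connector",
    data=json.dumps(payload).encode("utf-8"),
    headers={"kbn-xsrf": "true", "Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(request)  # uncomment to send against a live deployment
```

If the call succeeds, the new connector should then be selectable in Playground and AI Assistant alongside the preconfigured one, exactly as the skippable "Connect to LLM provider" step describes.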