diff --git a/explore-analyze/ai-assistant.md b/explore-analyze/ai-assistant.md
index ba826abb40..4be405eb65 100644
--- a/explore-analyze/ai-assistant.md
+++ b/explore-analyze/ai-assistant.md
@@ -51,3 +51,8 @@ In addition to practical advice, AI Assistant can offer conceptual advice, tips,
 
 Elastic does not use customer data for model training. This includes anything you send the model, such as alert or event data, detection rule configurations, queries, and prompts. However, any data you provide to AI Assistant will be processed by the third-party provider you chose when setting up the generative AI connector as part of the assistant setup. Elastic does not control third-party tools, and assumes no responsibility or liability for their content, operation, or use, nor for any loss or damage that may arise from your using such tools. Exercise caution when using AI tools with personal, sensitive, or confidential information. Any data you submit may be used by the provider for AI training or other purposes. There is no guarantee that the provider will keep any information you provide secure or confidential. You should familiarize yourself with the privacy practices and terms of use of any generative AI tools prior to use.
+
+## Elastic Managed LLM [elastic-managed-llm-ai-assistant]
+
+:::{include} ../solutions/_snippets/elastic-managed-llm.md
+:::
diff --git a/solutions/_snippets/elastic-managed-llm.md b/solutions/_snippets/elastic-managed-llm.md
new file mode 100644
index 0000000000..9351c19e06
--- /dev/null
+++ b/solutions/_snippets/elastic-managed-llm.md
@@ -0,0 +1,5 @@
+[Elastic Managed LLM](https://www.elastic.co/docs/reference/kibana/connectors-kibana/elastic-managed-llm) is the default large language model (LLM) connector available in the AI Assistant for eligible users. It provides immediate access to generative AI features without requiring any setup or external model integration.
+
+The Elastic Managed LLM is available out of the box; no manual connector setup or API key management is required for initial use. However, you can configure and use a third-party LLM connector, such as OpenAI, Azure, or Amazon Bedrock, if you prefer.
+
+To learn more about security and data privacy, refer to the [connector documentation](https://www.elastic.co/docs/reference/kibana/connectors-kibana/elastic-managed-llm) and [download the model card](https://raw.githubusercontent.com/elastic/kibana/refs/heads/main/docs/reference/resources/Elastic_Managed_LLM_model_card.pdf).
diff --git a/solutions/observability/observability-ai-assistant.md b/solutions/observability/observability-ai-assistant.md
index c6e95644ba..492cd7e385 100644
--- a/solutions/observability/observability-ai-assistant.md
+++ b/solutions/observability/observability-ai-assistant.md
@@ -95,6 +95,11 @@ The AI Assistant connects to one of these supported LLM providers:
 {{obs-ai-assistant}} doesn’t support connecting to a private LLM. Elastic doesn’t recommend using private LLMs with the AI Assistant.
 ::::
 
+### Elastic Managed LLM [elastic-managed-llm-obs-ai-assistant]
+
+:::{include} ../_snippets/elastic-managed-llm.md
+:::
+
 ## Add data to the AI Assistant knowledge base [obs-ai-add-data]
 
 The AI Assistant uses [ELSER](/explore-analyze/machine-learning/nlp/ml-nlp-elser.md), Elastic’s semantic search engine, to recall data from its internal knowledge base index to create retrieval augmented generation (RAG) responses.
 Adding data such as Runbooks, GitHub issues, internal documentation, and Slack messages to the knowledge base gives the AI Assistant context to provide more specific assistance.
diff --git a/solutions/security/ai/ai-assistant.md b/solutions/security/ai/ai-assistant.md
index 71ecf238b5..ac54e723a1 100644
--- a/solutions/security/ai/ai-assistant.md
+++ b/solutions/security/ai/ai-assistant.md
@@ -55,11 +55,8 @@ While AI Assistant is compatible with many different models, refer to the [Large
 
 ### Elastic Managed LLM [elastic-managed-llm-security-ai-assistant]
 
-[Elastic Managed LLM](https://www.elastic.co/docs/reference/kibana/connectors-kibana/elastic-managed-llm) is the default large language model (LLM) connector available in the AI Assistant for eligible users. It provides immediate access to generative AI features without requiring any setup or external model integration.
-
-The Elastic Managed LLM is available out-of-the box; no manual connector setup or API key management is required for initial use. However, you can configure and use a third-party LLM connector, such as OpenAI, Azure, or Amazon Bedrock if you prefer.
-
-To learn more about security, data privacy, and early access to the Elastic Managed LLM, refer to the [connector documentation](https://www.elastic.co/docs/reference/kibana/connectors-kibana/elastic-managed-llm) and [download the model card](https://raw.githubusercontent.com/elastic/kibana/refs/heads/main/docs/reference/resources/Elastic_Managed_LLM_model_card.pdf).
+:::{include} ../../_snippets/elastic-managed-llm.md
+:::
 
 ## Start chatting [start-chatting]