Updates AI Assistant reqs for Security, Obs/Search, Agent Builder #3967
`@@ -32,26 +32,18 @@ The {{obs-ai-assistant}} helps you:`

| Original | Updated |
|---|---|
| ## Requirements [obs-ai-requirements] | |
| The AI assistant requires the following: | To set up or use AI assistant, you need the following: |
| - An **Elastic deployment**: | * An appropriate [Elastic license](https://www.elastic.co/subscriptions). |
| - For **{{observability}}**: {{stack}} version **8.9** or later, or an **{{observability}} serverless project**. | * The `Observability AI Assistant: All` {{kib}} privilege. |
| - For **Search**: {{stack}} version **8.16.0** or later, or **{{serverless-short}} {{es}} project**. | * An [LLM connector](/solutions/security/ai/set-up-connectors-for-large-language-models-llm.md). |
| - To run {{obs-ai-assistant}} on a self-hosted Elastic stack, you need an [appropriate license](https://www.elastic.co/subscriptions). | |
| - An account with a third-party generative AI provider that preferably supports function calling. If your AI provider does not support function calling, you can configure [AI Assistant settings](../../solutions/observability/observability-ai-assistant.md#obs-ai-settings) to simulate function calling, but this might affect performance. | |
**Contributor:** I know function calling is something used in Observability that Security doesn't use. We may want to keep a note that recommends a model that supports function calling, or to configure simulated function calling. I would only think this wouldn't be necessary if all of the models we have connectors for support function calling. @viduni94, maybe you would have some insight into this?

**Contributor (Author):** Good catch, thanks. Curious to hear further input, but it sounds like adding a note makes sense.

**Contributor (Author):** Reached out to Viduni directly.

**Contributor (Author):** After my conversation with Viduni, I don't think we need to add a note. Recent proprietary models all support function calling, and if the selected model doesn't support function calling, we use simulated function calling so the assistant still works.
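As an aside on the thread above: a minimal sketch of what "simulating" function calling can look like, purely illustrative and not the assistant's actual implementation — the prompt wording, the JSON shape, and the `get_alerts` function name are all assumptions. The idea is that function schemas are embedded in the prompt and the model is asked to reply with JSON, which the client parses, instead of relying on the provider's native function-calling API:

```python
import json

# Illustrative sketch only; NOT the {{obs-ai-assistant}} source code.
# With simulated function calling, the available functions are described in
# the prompt and the model is instructed to answer with a JSON object when
# it wants to "call" one of them.
SYSTEM_PROMPT_TEMPLATE = (
    "You may call one of these functions by replying ONLY with JSON of the "
    'form {{"function": <name>, "arguments": <object>}}.\n'
    "Available functions:\n{schemas}"
)

def parse_simulated_call(model_reply: str):
    """Return (function_name, arguments) if the reply is a simulated
    function call, or None if the model answered in plain text."""
    try:
        obj = json.loads(model_reply)
    except json.JSONDecodeError:
        return None  # ordinary prose answer, no function call
    if isinstance(obj, dict) and "function" in obj:
        return obj["function"], obj.get("arguments", {})
    return None

# A reply that requests a (hypothetical) function call:
call = parse_simulated_call('{"function": "get_alerts", "arguments": {"last": "15m"}}')
# A reply that is just text:
no_call = parse_simulated_call("No function needed for this question.")
```

This also hints at the performance caveat in the diff: parsing free-form model output is less reliable than a provider-enforced function-calling schema.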
| Original | Updated |
|---|---|
| - The free tier offered by third-party generative AI provider may not be sufficient for the proper functioning of the AI assistant. In most cases, a paid subscription to one of the supported providers is required. | |
| Refer to the [documentation](kibana://reference/connectors-kibana/gen-ai-connectors.md) for your provider to learn about supported and default models. | |
| * The knowledge base requires a 4 GB {{ml}} node. | * To use knowledge base: a 4 GB {{ml}} node. |
| - In {{ecloud}} or {{ece}}, if you have Machine Learning autoscaling enabled, Machine Learning nodes will be started when using the knowledge base and AI Assistant. Therefore using these features will incur additional costs. | |
| * A self-deployed connector service if you're using [content connectors](elasticsearch://reference/search-connectors/index.md) to populate external data into the knowledge base. | * To use [content connectors](elasticsearch://reference/search-connectors/index.md) to add external data to knowledge base: A self-deployed connector service. |
| ## Manage access to AI Assistant | |
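For context on the LLM connector requirement above: connectors are typically created in the Kibana UI, but a request payload for Kibana's Connectors API gives a concrete picture of what one contains. A minimal sketch, assuming an OpenAI-style provider — the connector name, URL, and key below are placeholders, and you should confirm the exact fields against the Kibana connector reference for your provider:

```python
import json

def build_openai_connector_payload(name: str, api_url: str, api_key: str) -> dict:
    """Sketch of a create-connector payload (POST <kibana-url>/api/actions/connector)
    for an OpenAI-type generative AI connector. Values are placeholders."""
    return {
        "name": name,
        "connector_type_id": ".gen-ai",  # Kibana's OpenAI connector type
        "config": {
            "apiProvider": "OpenAI",      # provider variant
            "apiUrl": api_url,            # chat completions endpoint
        },
        # Secrets are stored encrypted by Kibana, never returned by the API.
        "secrets": {"apiKey": api_key},
    }

payload = build_openai_connector_payload(
    "obs-ai-assistant-llm",                            # placeholder name
    "https://api.openai.com/v1/chat/completions",      # placeholder URL
    "sk-example",                                      # placeholder key
)
print(json.dumps(payload, indent=2))
```

Whichever provider you pick, the paid-tier caveat in the diff applies: the assistant issues enough requests that free-tier rate limits are usually exhausted quickly.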