[Agent Builder] Add page about models #3338
Conversation
> ### Use a pre-configured connector
>
> 1. Search for **GenAI Settings** in the global search field
TODO: doesn't look like this will be ready for 9.2?
i hope it will be 😅
alternatively Management > Gen AI settings
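
As an aside, a pre-configured connector like this can also be created through Kibana's actions API instead of the UI. A sketch of the request body for `POST kbn:/api/actions/connector` follows; the connector name and URL are placeholders, and the `.gen-ai` type with `apiProvider`/`apiUrl`/`apiKey` fields reflects the generic OpenAI connector, not anything this PR introduces:

```json
{
  "name": "my-openai-connector",
  "connector_type_id": ".gen-ai",
  "config": {
    "apiProvider": "OpenAI",
    "apiUrl": "https://api.openai.com/v1/chat/completions"
  },
  "secrets": {
    "apiKey": "<your API key>"
  }
}
```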
> 4. Search for **GenAI Settings** in the global search field
> 5. Select your new connector from the **Default AI Connector** dropdown
TODO: doesn't look like this will be ready for 9.2?
> ### Recommended model families
>
> The following models are known to work well with {{agent-builder}}:
@joemcelroy @abhi-elastic any other models we should recommend/mention?
> ### Requirements
>
> **Model selection:**
> - Must include "instruct" in the model name to work with Elastic
But OpenAI OSS models don't include "instruct" in the name (https://openai.com/open-models/). Instruct is better than base or chat, but it's not a must. We should hint that it's better to use instruct if multiple versions are available.
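
The "prefer instruct when multiple versions exist" hint boils down to a simple selection rule. A minimal sketch (this helper is hypothetical, not part of the PR or of Elastic's code):

```python
def pick_model(available: list[str]) -> str:
    """Prefer an "instruct" variant when several versions of a model exist.

    Falls back to the first available name, since some models (e.g. OpenAI's
    OSS releases) don't use the suffix at all.
    """
    for name in available:
        if "instruct" in name.lower():
            return name
    return available[0]

models = ["llama-3.1-8b", "llama-3.1-8b-instruct", "llama-3.1-8b-base"]
print(pick_model(models))  # → llama-3.1-8b-instruct
```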
> - For self-managed deployments on the same host as your LLM: Can connect directly without a reverse proxy
> - Your local LLM server must use the OpenAI SDK for API compatibility
>
> ### Configure the connector
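
Since the connector only needs an OpenAI-compatible endpoint, a quick way to sanity-check a local server before wiring it into the connector is to post a chat completion request by hand. A stdlib-only sketch; the base URL and model name are placeholders for whatever your local server (LM Studio, Ollama, etc.) actually exposes:

```python
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a request for the OpenAI-compatible chat completions route."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("http://localhost:1234", "my-local-model", "ping")
# urllib.request.urlopen(req) would send it once the local server is running
print(req.full_url)  # → http://localhost:1234/v1/chat/completions
```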
TBH it feels like we should have platform-level (solution-agnostic) documentation on how to set up such connectors. I see similar docs for the security and o11y solutions.
I agree, but today for BYOLLM we really only have an example from security, and it's kinda specific to LM Studio and nginx. But yes, we shouldn't have to repeat these instructions everywhere we want to talk about local LLM use cases.
@ppf2 I know you raised concerns about this previously. Do you think we should document these steps instead in the OpenAI connector docs?
I think it makes sense to have local LLM model set up information in the OpenAI Connector docs so that different solutions can reference the same documentation. Thx!
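
For the reverse-proxy case mentioned above, the shared doc could include a minimal nginx sketch along these lines. Everything here is a placeholder assumption (server name, certificate paths, and an LM-Studio-style server on port 1234), not the exact config from the security docs:

```nginx
# Hypothetical: terminate TLS in front of a local LLM server listening on
# 127.0.0.1:1234 so a remote Kibana deployment can reach it securely.
server {
    listen 443 ssl;
    server_name llm.example.internal;

    ssl_certificate     /etc/nginx/certs/llm.crt;
    ssl_certificate_key /etc/nginx/certs/llm.key;

    location /v1/ {
        # Forward OpenAI-compatible API routes to the local LLM server
        proxy_pass http://127.0.0.1:1234;
        proxy_set_header Host $host;
    }
}
```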
closes https://github.com/elastic/search-team/issues/10745
closes https://github.com/elastic/docs-content-internal/issues/364