
Conversation

leemthompo
Contributor

github-actions bot commented Oct 6, 2025

@leemthompo leemthompo requested a review from a team October 6, 2025 13:29

### Use a pre-configured connector

1. Search for **GenAI Settings** in the global search field
Contributor Author


TODO: doesn't look like this will be ready for 9.2?

Member


i hope it will be 😅

alternatively Management > Gen AI settings

Comment on lines +42 to +43
4. Search for **GenAI Settings** in the global search field
5. Select your new connector from the **Default AI Connector** dropdown
Contributor Author


TODO: doesn't look like this will be ready for 9.2?


### Recommended model families

The following models are known to work well with {{agent-builder}}:
Member


@joemcelroy @abhi-elastic any other models we should recommend/mention?

### Requirements

**Model selection:**
- Must include "instruct" in the model name to work with Elastic
Member


but OpenAI OSS models don't include "instruct" in the name https://openai.com/open-models/

instruct is better than base or chat, but it's not a must. we should hint that it's better to use "instruct" if multiple versions are available
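The naming heuristic discussed above could be sketched as a small helper (a hypothetical function, not part of any Elastic API; the substring check is an assumption, since providers don't all follow this naming convention):

```python
def pick_preferred_variant(model_names):
    """Prefer an 'instruct' variant when multiple versions of a model exist.

    This is only a heuristic: some providers (e.g. OpenAI's OSS models)
    don't put 'instruct' in the name at all, so fall back to the first
    listed model when no variant matches.
    """
    for name in model_names:
        if "instruct" in name.lower():
            return name
    return model_names[0] if model_names else None

# An instruct variant is preferred over the base variant when both exist.
print(pick_preferred_variant(["llama-3-8b", "llama-3-8b-instruct"]))
# → llama-3-8b-instruct
```

When no "instruct" variant exists (as with the OpenAI OSS models), the helper simply keeps whatever the server lists first.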

Contributor Author


- For self-managed deployments on the same host as your LLM: Can connect directly without a reverse proxy
- Your local LLM server must use the OpenAI SDK for API compatibility
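For context on the quoted requirement, OpenAI API compatibility boils down to the local server exposing the standard `/v1/chat/completions` endpoint and accepting the usual request shape. A minimal sketch of that contract (the base URL and model name are placeholders for whatever the local server actually uses):

```python
import json

# Placeholders: adjust to wherever the local LLM server listens
# and whichever model it serves.
BASE_URL = "http://localhost:1234/v1"
MODEL = "local-model"

def build_chat_request(prompt):
    """Build a standard OpenAI-style chat completions request."""
    url = f"{BASE_URL}/chat/completions"
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, json.dumps(payload)

url, body = build_chat_request("Hello")
print(url)   # the endpoint an OpenAI-compatible server must serve
print(body)
```

A server that accepts this request and returns the matching chat completion response shape should work with an OpenAI connector pointed at its base URL.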

### Configure the connector
Member


TBH it feels like we should have platform-level (solution-agnostic) documentation on how to set up such connectors. I see similar docs for the security and o11y solutions.

Contributor Author

leemthompo commented Oct 6, 2025


I agree, but today for BYOLLM we really only have an example from security, and it's kinda specific to LM Studio and nginx. But yes, we shouldn't have to repeat these instructions everywhere we want to talk about local LLM use cases.
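For reference, the nginx piece of that security example is essentially just a small reverse proxy in front of the local server. A hypothetical sketch (server name, certificate paths, and the LM Studio default port 1234 are assumptions, not taken from the existing docs):

```nginx
# Hypothetical reverse proxy: terminate TLS and forward requests to a
# local OpenAI-compatible server (e.g. LM Studio on its default port).
server {
    listen 443 ssl;
    server_name llm.example.internal;

    ssl_certificate     /etc/nginx/certs/llm.crt;
    ssl_certificate_key /etc/nginx/certs/llm.key;

    location /v1/ {
        proxy_pass http://127.0.0.1:1234/v1/;
        proxy_set_header Host $host;
    }
}
```

The connector would then point at `https://llm.example.internal/v1` instead of the plain-HTTP local address, which is the main reason the proxy exists in the security example.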

@ppf2 I know you raised concerns about this previously, do you think we should document these steps instead in the OpenAI connector docs?

Contributor


I think it makes sense to have the local LLM setup information in the OpenAI Connector docs so that different solutions can reference the same documentation. Thx!
