9 changes: 5 additions & 4 deletions docs/AI-for-security/connect-to-byo.asciidoc
@@ -180,10 +180,11 @@ Finally, configure the connector:
1. Log in to your Elastic deployment.
2. Navigate to **Stack Management → Connectors → Create Connector → OpenAI**. The OpenAI connector enables this use case because LM Studio uses the OpenAI SDK.
3. Name your connector to help keep track of the model version you are using.
-4. Under **URL**, enter the domain name specified in your Nginx configuration file, followed by `/v1/chat/completions`.
-5. Under **Default model**, enter `local-model`.
-6. Under **API key**, enter the secret token specified in your Nginx configuration file.
-7. Click **Save**.
+4. Under **Select an OpenAI provider**, select **Other (OpenAI Compatible Service)**.
+5. Under **URL**, enter the domain name specified in your Nginx configuration file, followed by `/v1/chat/completions`.
+6. Under **Default model**, enter `local-model`.
+7. Under **API key**, enter the secret token specified in your Nginx configuration file.
+8. Click **Save**.

image::images/lms-edit-connector.png[The Edit connector page in the {security-app}, with appropriate values populated]
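The new provider step does not change how the proxied endpoint behaves, so the URL and API key values can still be smoke-tested before saving the connector. Below is a minimal sketch using Python's `requests`; the domain and token are placeholders for the values in your Nginx configuration file, and `local-model` matches the default model name used in the steps above:

```python
import requests

# Placeholder values: substitute the domain name and secret token from
# your Nginx configuration file.
URL = "https://your-llm-proxy.example.com/v1/chat/completions"
SECRET_TOKEN = "your-secret-token"

response = requests.post(
    URL,
    headers={"Authorization": f"Bearer {SECRET_TOKEN}"},
    json={
        "model": "local-model",  # the Default model value from the steps above
        "messages": [{"role": "user", "content": "Reply with OK."}],
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

A 200 response containing a completion confirms the proxy, secret token, and LM Studio server are wired together before you click **Save**.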

Binary file modified docs/AI-for-security/images/lms-edit-connector.png
9 changes: 5 additions & 4 deletions docs/serverless/AI-for-security/connect-to-byo-llm.mdx
@@ -160,10 +160,11 @@ Finally, configure the connector in your Security project:
1. Log in to your Security project.
2. Navigate to **Stack Management → Connectors → Create Connector → OpenAI**. The OpenAI connector enables this use case because LM Studio uses the OpenAI SDK.
3. Name your connector to help keep track of the model version you are using.
-4. Under **URL**, enter the domain name specified in your Nginx configuration file, followed by `/v1/chat/completions`.
-5. Under **Default model**, enter `local-model`.
-6. Under **API key**, enter the secret token specified in your Nginx configuration file.
-7. Click **Save**.
+4. Under **Select an OpenAI provider**, select **Other (OpenAI Compatible Service)**.
+5. Under **URL**, enter the domain name specified in your Nginx configuration file, followed by `/v1/chat/completions`.
+6. Under **Default model**, enter `local-model`.
+7. Under **API key**, enter the secret token specified in your Nginx configuration file.
+8. Click **Save**.

<DocImage url="images/lms-edit-connector.png" alt="The Edit connector page in the ((security-app)), with appropriate values populated"/>
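The serverless steps mirror the stateful ones, and the connector ultimately speaks the OpenAI wire format, which is why the **Other (OpenAI Compatible Service)** provider works here. As an illustration of that equivalence, here is a hedged sketch using the official `openai` Python client pointed at the proxy, again with placeholder domain and token:

```python
from openai import OpenAI

# Placeholder values: substitute the domain and secret token from your
# Nginx configuration file. Note the base URL ends at /v1: unlike the
# connector's URL field, the client appends /chat/completions itself.
client = OpenAI(
    base_url="https://your-llm-proxy.example.com/v1",
    api_key="your-secret-token",
)

completion = client.chat.completions.create(
    model="local-model",
    messages=[{"role": "user", "content": "Reply with OK."}],
)
print(completion.choices[0].message.content)
```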

Binary file modified docs/serverless/AI-for-security/images/lms-edit-connector.png