
Commit 3d18f62

BYO LLM 8.16 updates (#5967)

* BYO LLM 8.16 updates
* Update docs/AI-for-security/connect-to-byo.asciidoc

Co-authored-by: Nastasha Solomon <[email protected]>

Parent: b0e7e61 · Commit: 3d18f62

File tree: 4 files changed (+10 −8 lines)

docs/AI-for-security/connect-to-byo.asciidoc

Lines changed: 5 additions & 4 deletions

@@ -180,10 +180,11 @@ Finally, configure the connector:
 1. Log in to your Elastic deployment.
 2. Navigate to **Stack Management → Connectors → Create Connector → OpenAI**. The OpenAI connector enables this use case because LM Studio uses the OpenAI SDK.
 3. Name your connector to help keep track of the model version you are using.
-4. Under **URL**, enter the domain name specified in your Nginx configuration file, followed by `/v1/chat/completions`.
-5. Under **Default model**, enter `local-model`.
-6. Under **API key**, enter the secret token specified in your Nginx configuration file.
-7. Click **Save**.
+4. Under **Select an OpenAI provider**, select **Other (OpenAI Compatible Service)**.
+5. Under **URL**, enter the domain name specified in your Nginx configuration file, followed by `/v1/chat/completions`.
+6. Under **Default model**, enter `local-model`.
+7. Under **API key**, enter the secret token specified in your Nginx configuration file.
+8. Click **Save**.
 
 image::images/lms-edit-connector.png[The Edit connector page in the {security-app}, with appropriate values populated]
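The connector steps in the hunk above refer to a domain name and a secret token "specified in your Nginx configuration file". As a rough sketch only (the server name, token, and upstream port below are illustrative placeholders, not values from this commit), a reverse proxy in front of LM Studio's local OpenAI-compatible server might look like:

```nginx
# Hypothetical reverse-proxy fragment; domain, token, and port are placeholders.
server {
    listen 443 ssl;
    server_name llm.example.com;  # the domain entered under **URL** in the connector

    location /v1/chat/completions {
        # Reject requests that do not carry the shared secret
        # entered under **API key** in the connector.
        if ($http_authorization != "Bearer my-secret-token") {
            return 401;
        }
        proxy_pass http://localhost:1234;  # LM Studio's local server port
    }
}
```

The connector then targets `https://llm.example.com/v1/chat/completions`, matching step 5.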
Binary file (image), 23.7 KB
docs/serverless/AI-for-security/connect-to-byo-llm.mdx

Lines changed: 5 additions & 4 deletions

@@ -160,10 +160,11 @@ Finally, configure the connector in your Security project:
 1. Log in to your Security project.
 2. Navigate to **Stack Management → Connectors → Create Connector → OpenAI**. The OpenAI connector enables this use case because LM Studio uses the OpenAI SDK.
 3. Name your connector to help keep track of the model version you are using.
-4. Under **URL**, enter the domain name specified in your Nginx configuration file, followed by `/v1/chat/completions`.
-5. Under **Default model**, enter `local-model`.
-6. Under **API key**, enter the secret token specified in your Nginx configuration file.
-7. Click **Save**.
+4. Under **Select an OpenAI provider**, select **Other (OpenAI Compatible Service)**.
+5. Under **URL**, enter the domain name specified in your Nginx configuration file, followed by `/v1/chat/completions`.
+6. Under **Default model**, enter `local-model`.
+7. Under **API key**, enter the secret token specified in your Nginx configuration file.
+8. Click **Save**.
 
 <DocImage url="images/lms-edit-connector.png" alt="The Edit connector page in the ((security-app)), with appropriate values populated"/>
Binary file (image), 23.7 KB
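Once saved, the connector sends OpenAI-style chat-completion requests to the configured URL. A minimal sketch of the request the steps above describe, assembled in Python (the URL and token are illustrative placeholders mirroring the connector fields, not values from this commit):

```python
import json

# Hypothetical values corresponding to the connector fields in the diff above.
base_url = "https://llm.example.com"          # domain from the Nginx configuration
endpoint = base_url + "/v1/chat/completions"  # path appended in step 5
api_key = "my-secret-token"                   # secret token from the Nginx configuration

# Body of an OpenAI-compatible chat-completion request.
payload = {
    "model": "local-model",  # the Default model entered in step 6
    "messages": [
        {"role": "user", "content": "Summarize the latest alerts."},
    ],
}

# Headers carrying the API key entered in step 7.
headers = {
    "Authorization": f"Bearer {api_key}",
    "Content-Type": "application/json",
}

body = json.dumps(payload)
```

Because LM Studio exposes an OpenAI-compatible API, this same shape is what the OpenAI connector produces, which is why the **Other (OpenAI Compatible Service)** provider option works here.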
