docs/AI-for-security/connect-to-byo.asciidoc
Finally, configure the connector:
1. Log in to your Elastic deployment.
2. Navigate to **Stack Management → Connectors → Create Connector → OpenAI**. The OpenAI connector enables this use case because LM Studio uses the OpenAI SDK.
3. Name your connector to help keep track of the model version you are using.
4. Under **Select an OpenAI provider**, select **Other (OpenAI Compatible Service)**.
5. Under **URL**, enter the domain name specified in your Nginx configuration file, followed by `/v1/chat/completions`.
6. Under **Default model**, enter `local-model`.
7. Under **API key**, enter the secret token specified in your Nginx configuration file.
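Steps 5 and 7 reference values defined in your Nginx configuration file. As a rough sketch of how those values relate (the domain name, the token value, and the upstream port are placeholders for illustration, not values prescribed by this guide; LM Studio's local server commonly listens on port 1234), the relevant directives might look like:

[source,nginx]
----
server {
    listen 443 ssl;
    # Domain name: enter this under URL in the connector,
    # followed by /v1/chat/completions.
    server_name your-domain.example.com;

    location / {
        # Reject requests that do not carry the shared secret token.
        # The same token is what you enter under API key in the connector.
        if ($http_authorization != "Bearer your-secret-token") {
            return 401;
        }
        # Forward accepted requests to the local LM Studio server.
        proxy_pass http://localhost:1234;
    }
}
----

With a configuration along these lines, the connector's URL field would be `https://your-domain.example.com/v1/chat/completions` and its API key field would be `your-secret-token`.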