
Commit 8d53bb1

Add a section to jump to connector creation
1 parent 3f3ae83 commit 8d53bb1


1 file changed: +4 -0 lines changed


solutions/observability/connect-to-own-local-llm.md

Lines changed: 4 additions & 0 deletions
@@ -19,6 +19,10 @@ You do not have to set up a proxy if LM studio is configured on the same network
 If your Elastic deployment is not on the same network, you would need to configure a reverse proxy using Nginx to authenticate with Elastic. Refer [Configure your reverse proxy](https://www.elastic.co/docs/solutions/security/ai/connect-to-own-local-llm#_configure_your_reverse_proxy) for more detailed instructions.
 ::::

+### Already running LM Studio? [_skip_if_already_running]
+
+If LM Studio is already installed, the server is running, and you have a model loaded (with a context window of at least 64K tokens), you can skip directly to [Configure the connector in your Elastic deployment](#_configure_the_connector_in_your_elastic_deployment).
+
 This example uses a server hosted in GCP to configure LM Studio with the [Mistral-Nemo-Instruct-2407](https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407) model.

 ## Configure LM Studio and download a model [_configure_lm_studio_and_download_a_model]
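The added section presumes LM Studio's local server is already running with a model loaded before you jump ahead to connector creation. As a rough sanity check (not part of the commit), here is a minimal sketch that queries LM Studio's OpenAI-compatible API, assuming the server is listening on its default address http://localhost:1234:

```python
# Minimal sketch: confirm a local LM Studio server is reachable and has a
# model loaded before skipping ahead to connector creation.
# Assumption (not from the commit): LM Studio's OpenAI-compatible server is
# on its default address http://localhost:1234; adjust BASE_URL if needed.
import json
import urllib.request

BASE_URL = "http://localhost:1234/v1"

try:
    with urllib.request.urlopen(f"{BASE_URL}/models", timeout=5) as resp:
        models = json.load(resp).get("data", [])
except OSError as err:
    raise SystemExit(f"LM Studio server not reachable: {err}")

if not models:
    raise SystemExit("Server is up, but no model is loaded in LM Studio.")

print("Loaded models:", ", ".join(m.get("id", "?") for m in models))
```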

0 commit comments
