Commit 0f16fa1: Update instructions

1 parent 343307e commit 0f16fa1

File tree: 1 file changed (+3, -3 lines)

solutions/observability/connect-to-own-local-llm.md

Lines changed: 3 additions & 3 deletions
```diff
@@ -19,11 +19,11 @@ You do not have to set up a proxy if LM studio is configured on the same network
 If your Elastic deployment is not on the same network, you would need to configure a reverse proxy using Nginx to authenticate with Elastic. Refer to [Configure your reverse proxy](https://www.elastic.co/docs/solutions/security/ai/connect-to-own-local-llm#_configure_your_reverse_proxy) for more detailed instructions.
 ::::
 
-### Already running LM Studio? [_skip_if_already_running]
+This example uses a server hosted in GCP to configure LM Studio with the [Mistral-Nemo-Instruct-2407](https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407) model.
 
-If LM Studio is already installed, the server is running, and you have a model loaded (with a context window of at least 64K tokens), you can skip directly to [Configure the connector in your Elastic deployment](#_configure_the_connector_in_your_elastic_deployment).
+### Already running LM Studio? [_skip_if_already_running]
 
-This example uses a server hosted in GCP to configure LM Studio with the [Mistral-Nemo-Instruct-2407](https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407) model.
+If LM Studio is already installed, the server is running, and you have a model loaded (with a context window of at least 64K tokens), you can skip directly to [Configure the connector in your Elastic deployment](#configure-the-connector-in-your-elastic-deployment-_configure_the_connector_in_your_elastic_deployment).
 
 ## Configure LM Studio and download a model [_configure_lm_studio_and_download_a_model]
 
```
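The reverse proxy that the changed text points to can be sketched as a minimal Nginx server block. This is only an illustration of the idea (terminate TLS, check a shared secret, forward to LM Studio's local server): the hostname, certificate paths, and token below are placeholders, and the only value taken from known defaults is LM Studio's local port 1234.

```nginx
# Minimal reverse-proxy sketch in front of LM Studio.
# Assumptions: LM Studio listens on 127.0.0.1:1234 (its default);
# llm.example.com, the cert paths, and the bearer token are placeholders.
server {
    listen 443 ssl;
    server_name llm.example.com;

    ssl_certificate     /etc/nginx/certs/fullchain.pem;
    ssl_certificate_key /etc/nginx/certs/privkey.pem;

    location / {
        # Reject requests that do not carry the expected bearer token.
        if ($http_authorization != "Bearer changeme-token") {
            return 401;
        }
        proxy_pass http://127.0.0.1:1234;   # LM Studio's local server
        proxy_set_header Host $host;
        proxy_read_timeout 300s;            # model responses can be slow
    }
}
```

The Elastic connector would then target https://llm.example.com and send the same token, so only authenticated traffic reaches the model server.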
