Commit 3fb4a10

[Observability] Add air-gapped links to local LLM docs

1 parent: 3477459

1 file changed: `solutions/observability/connect-to-own-local-llm.md` (9 additions, 2 deletions)
```diff
@@ -16,7 +16,7 @@ This page provides instructions for setting up a connector to a large language m
 ::::{note}
 If your Elastic deployment is not on the same network, you must configure an Nginx reverse proxy to authenticate with Elastic. Refer to [Configure your reverse proxy](https://www.elastic.co/docs/solutions/security/ai/connect-to-own-local-llm#_configure_your_reverse_proxy) for more detailed instructions.
 
-You do not have to set up a proxy if LM Studio is running locally, or on the same network as your Elastic deployment.
+You do not have to set up a proxy if LM Studio is running locally, or on the same network as your Elastic deployment.
 ::::
 
 ::::{note}
```
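For the reverse-proxy case described above, a minimal Nginx sketch might look like the following. This is an illustrative fragment only, not the configuration from the linked Elastic guide: the hostname, certificate paths, and shared token are placeholders, and the upstream port assumes LM Studio's default local server port of 1234.

```nginx
# Hypothetical reverse proxy in front of a local LM Studio server.
# All names, paths, and the token value are illustrative placeholders;
# refer to "Configure your reverse proxy" for the authoritative setup.
server {
    listen 443 ssl;
    server_name llm-proxy.example.com;              # placeholder hostname

    ssl_certificate     /etc/nginx/certs/proxy.crt; # placeholder cert
    ssl_certificate_key /etc/nginx/certs/proxy.key; # placeholder key

    location / {
        # Reject requests that do not carry the expected shared secret.
        if ($http_authorization != "Bearer your-token-here") {
            return 401;
        }
        proxy_pass http://localhost:1234;           # assumed LM Studio default port
    }
}
```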
```diff
@@ -85,7 +85,7 @@ Once you’ve downloaded a model, use the following commands in your CLI:
 4. Load a model: `lms load llama-3.3-70b-instruct --context-length 64000 --gpu max`.
 
 ::::{important}
-When loading a model, use the `--context-length` flag with a context window of 64,000 or higher.
+When loading a model, use the `--context-length` flag with a context window of 64,000 or higher.
 Optionally, you can set how much to offload to the GPU by using the `--gpu` flag. `--gpu max` will offload all layers to GPU.
 ::::
 
```
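The CLI workflow this hunk documents can be sketched end to end as below. This is a hedged sketch, not the full documented procedure: it assumes LM Studio and its `lms` CLI are already installed, the `lms server start` and `lms get` steps are assumptions about the surrounding workflow, and only the `lms load` line with its flags comes from the doc itself.

```shell
# Illustrative sequence; requires LM Studio's `lms` CLI to be installed.

# Start the local server (assumed to listen on port 1234 by default).
lms server start

# Download the model if it is not already present (assumed step).
lms get llama-3.3-70b-instruct

# Load it with a 64,000-token context window, offloading all layers to the GPU,
# as the documented example shows.
lms load llama-3.3-70b-instruct --context-length 64000 --gpu max
```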

```diff
@@ -142,3 +142,10 @@ Setup is now complete. You can use the model you’ve loaded in LM Studio to pow
 ::::{note}
 While local (open-weight) LLMs offer greater privacy and control, they generally do not match the raw performance and advanced reasoning capabilities of proprietary models by LLM providers mentioned in [Set up the AI Assistant](/solutions/observability/observability-ai-assistant.md#obs-ai-set-up).
 ::::
+
+## Air-gapped environments
+
+Local LLMs in air-gapped environments have specific installation and configuration instructions for deploying ELSER and configuring product documentation. Refer to the following links for more information:
+
+- [Deploy ELSER in an air-gapped environment](../../explore-analyze/machine-learning/nlp/ml-nlp-elser.md#air-gapped-install)
+- [Configure product documentation for air-gapped-environments](kibana://reference/configuration-reference/ai-assistant-settings.md#configuring-product-doc-for-airgap)
```
