solutions/observability/connect-to-own-local-llm.md
@@ -16,7 +16,7 @@ This page provides instructions for setting up a connector to a large language m

::::{note}
If your Elastic deployment is not on the same network, you must configure an Nginx reverse proxy to authenticate with Elastic. Refer to [Configure your reverse proxy](https://www.elastic.co/docs/solutions/security/ai/connect-to-own-local-llm#_configure_your_reverse_proxy) for more detailed instructions.

You do not have to set up a proxy if LM Studio is running locally or on the same network as your Elastic deployment.
::::
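As a minimal sketch of what such a reverse proxy could look like, the Nginx fragment below terminates TLS and checks a shared bearer token before forwarding to LM Studio. The hostname, certificate paths, and token scheme are assumptions for illustration, not values from this guide; only the upstream port 1234 reflects LM Studio's default local server port.

```nginx
# Illustrative sketch only -- hostname, cert paths, and token are assumptions.
# Assumes LM Studio's server listens on localhost:1234 (its default port).
server {
    listen 443 ssl;
    server_name llm-proxy.example.com;                    # assumed hostname

    ssl_certificate     /etc/nginx/certs/fullchain.pem;   # assumed paths
    ssl_certificate_key /etc/nginx/certs/privkey.pem;

    location / {
        # Reject requests that lack the expected secret (assumed auth scheme).
        if ($http_authorization != "Bearer your-shared-secret") {
            return 401;
        }
        proxy_pass http://localhost:1234;
    }
}
```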
::::{note}
@@ -85,7 +85,7 @@ Once you’ve downloaded a model, use the following commands in your CLI:

When loading a model, use the `--context-length` flag with a context window of 64,000 or higher.

Optionally, you can set how much to offload to the GPU by using the `--gpu` flag. `--gpu max` offloads all layers to the GPU.
::::
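Putting the two flags above together, a load command might look like the following sketch. The model identifier is a placeholder for whichever model you have downloaded, not a name from this guide.

```shell
# Load a downloaded model with a 64,000-token context window and
# offload all layers to the GPU. "my-local-model" is a placeholder
# for the identifier of a model you have downloaded in LM Studio.
lms load my-local-model --context-length 64000 --gpu max
```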
@@ -142,3 +142,10 @@ Setup is now complete. You can use the model you’ve loaded in LM Studio to pow

::::{note}
While local (open-weight) LLMs offer greater privacy and control, they generally do not match the raw performance and advanced reasoning capabilities of the proprietary models from the LLM providers mentioned in [Set up the AI Assistant](/solutions/observability/observability-ai-assistant.md#obs-ai-set-up).
::::

## Air-gapped environments

Local LLMs in air-gapped environments have specific installation and configuration instructions for deploying ELSER and configuring product documentation. Refer to the following links for more information:

- [Deploy ELSER in an air-gapped environment](../../explore-analyze/machine-learning/nlp/ml-nlp-elser.md#air-gapped-install)
- [Configure product documentation for air-gapped environments](kibana://reference/configuration-reference/ai-assistant-settings.md#configuring-product-doc-for-airgap)
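After setup, one way to sanity-check the loaded model is to query LM Studio's OpenAI-compatible local API directly. This is a sketch under assumptions: port 1234 is LM Studio's default server port, and `"local-model"` is a placeholder for your loaded model's identifier.

```shell
# Send a test chat completion to the local LM Studio server.
# Port 1234 is LM Studio's default; adjust if you changed it.
curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "local-model", "messages": [{"role": "user", "content": "Hello"}]}'
```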