
Commit 343307e

Add instructions for local/on-prem and other clouds for GUI launch
1 parent 8d53bb1 commit 343307e

File tree

1 file changed: +9 -1 lines changed

solutions/observability/connect-to-own-local-llm.md

Lines changed: 9 additions & 1 deletion
@@ -31,7 +31,15 @@ LM Studio supports the OpenAI SDK, which makes it compatible with Elastic’s Op

As the first step, install [LM Studio](https://lmstudio.ai/).

- You must launch the application using its GUI before being able to use the CLI. For example, Chrome RDP with an [X Window System](https://cloud.google.com/architecture/chrome-desktop-remote-on-compute-engine) can be used for this purpose. After you’ve opened the application for the first time using the GUI, you can start the server by using `sudo lms server start` in the [CLI](https://lmstudio.ai/docs/cli/server-start).
+ You must launch the application using its GUI before being able to use the CLI.
+
+ ::::{note}
+ For a local or on-prem desktop: launch LM Studio directly.
+ For GCP: Chrome RDP with an [X Window System](https://cloud.google.com/architecture/chrome-desktop-remote-on-compute-engine) can be used for this purpose.
+ For other cloud platforms: any secure remote desktop (RDP, VNC over an SSH tunnel, or X11 forwarding) works as long as you can open the LM Studio GUI once.
+ ::::
+
+ After you’ve opened the application for the first time using the GUI, you can start the server by using `sudo lms server start` in the [CLI](https://lmstudio.ai/docs/cli/server-start).

Once you’ve launched LM Studio:
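
The note added in this commit lists remote-desktop options without a concrete command. As a rough sketch (not part of the diff), tunnelling VNC over SSH could look like the following; it assumes a VNC server is already running on the VM at display :1 (port 5901), and `user@vm-host` is a placeholder for your own instance:

```shell
# Forward the VM's VNC port to the local workstation over SSH.
# Assumes a VNC server is already listening on the VM at localhost:5901 (display :1);
# "user@vm-host" is a placeholder for your own instance.
ssh -N -L 5901:localhost:5901 user@vm-host

# Point a local VNC client at localhost:5901 and open the LM Studio GUI once
# inside the remote desktop session.
```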
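After that one-time GUI launch, starting the server and confirming it answers could look like this sketch; port 1234 is assumed here as LM Studio's default, so verify it against your own server settings:

```shell
# Start the LM Studio server from the CLI (only works after the GUI has been opened once).
sudo lms server start

# Confirm the OpenAI-compatible endpoint responds (1234 is the assumed default port).
curl http://localhost:1234/v1/models
```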

0 commit comments
