From 86e4b02aa87956f4ab058f6e93ac16363adfd131 Mon Sep 17 00:00:00 2001
From: "google-labs-jules[bot]" <161369871+google-labs-jules[bot]@users.noreply.github.com>
Date: Sun, 13 Jul 2025 12:42:33 +0000
Subject: [PATCH] docs: Add LM Studio as a provider

---
 README.md | 17 +++++++++++++++++
 1 file changed, 17 insertions(+)

diff --git a/README.md b/README.md
index 147e818..3e71505 100644
--- a/README.md
+++ b/README.md
@@ -109,6 +109,23 @@ You can use `shor config edit` as a shorthand to edit this file.
 
 Refer to the [LocalAI docs](https://localai.io/) for installation, available models, and usage.
 
+### LM Studio
+
+To use ShellOracle with [LM Studio](https://lmstudio.ai/), first load a model and start the local server.
+On macOS, it's recommended to use a model that runs on the MLX runtime.
+
+Here is an example configuration:
+
+```toml
+[shelloracle]
+provider = "OpenAICompat"
+
+[provider.OpenAICompat]
+base_url = "http://localhost:1234/v1"
+api_key = "lm-studio"
+model = "mistralai/devstral-small-2507"
+```
+
 ### XAI
 
 To use ShellOracle with XAI's models, create an [API key](https://docs.x.ai/docs/quickstart#creating-an-api-key).
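As a quick sanity check for the configuration this patch documents, the LM Studio server exposes standard OpenAI-compatible endpoints. The sketch below assumes LM Studio is already running on its default port 1234 and that the model named in the example configuration is loaded; it is illustrative, not part of the patch, and requires a live server to produce output.

```shell
# List the models the local LM Studio server currently serves.
curl http://localhost:1234/v1/models

# Send a minimal chat request to confirm the configured model responds.
# The model name must match the one loaded in LM Studio.
curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "mistralai/devstral-small-2507", "messages": [{"role": "user", "content": "hello"}]}'
```

If both calls succeed, ShellOracle's `OpenAICompat` provider should work against the same `base_url`.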