1. **Open Roo Code Settings:** Click the gear icon (<Codicon name="gear" />) in the Roo Code panel.
2. **Select Provider:** Choose "LM Studio" from the "API Provider" dropdown.
3. **Enter Model ID:** Enter the *file name* of the model you loaded in LM Studio (e.g., `codellama-7b.Q4_0.gguf`). You can find this in the LM Studio "Local Server" tab.
4. **(Optional) Base URL:** By default, Roo Code connects to LM Studio at `http://localhost:1234`. If you've configured LM Studio to use a different address or port, enter the full URL here. (A quick way to confirm the address and model ID is shown in the sketch after these steps.)
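
Before filling in these fields, it can help to confirm what the local server is actually exposing. The snippet below is a minimal sketch, assuming LM Studio's OpenAI-compatible server is running on the default `http://localhost:1234` (the `LMSTUDIO_BASE_URL` variable is just an illustration, not a Roo Code setting); it lists the model identifiers the server reports, which is what step 3 asks for. Any Node.js 18+ runtime with built-in `fetch` can run it.

```typescript
// A quick check against LM Studio's OpenAI-compatible local server.
// Assumption: the server runs on the default http://localhost:1234; change
// baseUrl if you configured a different address or port.
const baseUrl = process.env.LMSTUDIO_BASE_URL ?? "http://localhost:1234";

async function listLocalModels(): Promise<void> {
  const response = await fetch(`${baseUrl}/v1/models`);
  if (!response.ok) {
    throw new Error(`LM Studio server responded with HTTP ${response.status}`);
  }
  const body = (await response.json()) as { data: Array<{ id: string }> };
  // Each id is what you paste into Roo Code's "Model ID" field (step 3).
  for (const model of body.data) {
    console.log(model.id);
  }
}

listLocalModels().catch((err) => {
  console.error("Could not reach the LM Studio local server:", err);
});
```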
* **Model Selection:** LM Studio provides a wide range of models. Experiment to find the one that best suits your needs.
* **Local Server:** The LM Studio local server must be running for Roo Code to connect to it.
* **LM Studio Documentation:** Refer to the [LM Studio documentation](https://lmstudio.ai/docs) for more information.
* **Troubleshooting:** If you see a "Please check the LM Studio developer logs to debug what went wrong" error, you may need to adjust the context length settings in LM Studio. A single request sent directly to the server, as in the sketch below, can confirm whether it is accepting requests at all.
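
If Roo Code cannot get a response, it can help to take the extension out of the loop and call the local server directly. This is a minimal sketch, assuming the default port and a placeholder model id; substitute the file name of the model you actually loaded. If this request fails or hangs, fix the server side (load a model, raise the context length, restart the server) before retrying from Roo Code.

```typescript
// A direct smoke test of the LM Studio local server, bypassing Roo Code.
// Assumptions: default port, and a placeholder model id that you should
// replace with the file name of the model loaded in LM Studio.
const endpoint = "http://localhost:1234/v1/chat/completions";

async function smokeTest(): Promise<void> {
  const response = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "codellama-7b.Q4_0.gguf", // placeholder: use your loaded model's file name
      messages: [{ role: "user", content: "Reply with the single word OK." }],
      max_tokens: 16,
    }),
  });
  if (!response.ok) {
    // Typical causes: server not started, no model loaded, or the request
    // exceeds the configured context length. Check the LM Studio developer logs.
    throw new Error(`HTTP ${response.status}: ${await response.text()}`);
  }
  const body = (await response.json()) as {
    choices: Array<{ message: { content: string } }>;
  };
  console.log(body.choices[0].message.content);
}

smokeTest().catch(console.error);
```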