docs/guides/running-locally.mdx

---
title: Running Locally
---

Open Interpreter can be run fully locally.

Users need to install software to run local LLMs. Open Interpreter supports multiple local model providers such as [Ollama](https://www.ollama.com/), [Llamafile](https://github.com/Mozilla-Ocho/llamafile), [Jan](https://jan.ai/), and [LM Studio](https://lmstudio.ai/).
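
If you go the Ollama route, for example, you would first pull and start a model with Ollama's own CLI (a sketch; `llama3` is just an illustrative model name):

```bash
# Download (if needed) and start a local model with Ollama.
# "llama3" is an illustrative model name; use any model Ollama provides.
ollama run llama3
```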

<Tip>
Local models perform better with extra guidance and direction. You can improve
performance for your use case by creating a new [Profile](/guides/profiles).
</Tip>

## Terminal Usage

### Local Setup Menu

A Local Setup Menu simplifies the process of using Open Interpreter locally. To access this menu, run the command `interpreter --local`.
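
For example (a minimal sketch; the menu itself is interactive and its exact output may vary by version):

```bash
# Launch Open Interpreter and open the interactive Local Setup Menu
interpreter --local
```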

Select your chosen local model provider from the list of options.

Most providers will require you to state the model you are using. Provider-specific instructions are shown in the menu.

### Custom Local

If you want to use a provider other than the ones listed, set the `--api_base` flag to point at a [custom endpoint](/language-models/local-models/custom-endpoint).

You will also need to set the model by passing the `--model` flag to select a [model](/settings/all-settings#model-selection).
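
For instance, to point Open Interpreter at an OpenAI-compatible server (a sketch; the URL and model name below are placeholders for whatever your provider exposes):

```bash
# Connect to a custom endpoint and choose a model explicitly.
# The URL and model name are illustrative; substitute your provider's values.
interpreter --api_base "http://localhost:1234/v1" --model "openai/local-model"
```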