
Commit 887b702

Merge pull request #1291 from MikeBirdTech/doc-updates
Local 2 Doc updates
2 parents 1b7883a + 28012ca commit 887b702

10 files changed: +264 additions, −98 deletions

docs/getting-started/introduction.mdx

Lines changed: 2 additions & 2 deletions
@@ -20,7 +20,7 @@ This provides a natural-language interface to your computer's general-purpose ca
 
 <br/>
 
-<Info>You can also build Open Interpreter into your applications with [our new Python package.](/usage/python/arguments)</Info>
+<Info>You can also build Open Interpreter into your applications with [our Python package.](/usage/python/arguments)</Info>
 
 ---
 
@@ -41,4 +41,4 @@ interpreter
 </Step>
 </Steps>
 
-We've also developed [one-line installers](setup) that install Python and set up Open Interpreter.
+We've also developed [one-line installers](/getting-started/setup#experimental-one-line-installers) that install Python and set up Open Interpreter.

docs/getting-started/setup.mdx

Lines changed: 42 additions & 36 deletions
@@ -2,69 +2,75 @@
 title: Setup
 ---
 
-## Experimental one-line installers
-
-To try our experimental installers, open your Terminal with admin privileges [(click here to learn how)](https://chat.openai.com/share/66672c0f-0935-4c16-ac96-75c1afe14fe3), then paste the following commands:
+## Installation from `pip`
 
-<CodeGroup>
+If you are familiar with Python, we recommend installing Open Interpreter via `pip`:
 
-```bash Mac
-curl -sL https://raw.githubusercontent.com/KillianLucas/open-interpreter/main/installers/oi-mac-installer.sh | bash
+```bash
+pip install open-interpreter
 ```
 
-```powershell Windows
-iex "& {$(irm https://raw.githubusercontent.com/KillianLucas/open-interpreter/main/installers/oi-windows-installer.ps1)}"
-```
+<Info>
+You'll need Python
+[3.10](https://www.python.org/downloads/release/python-3100/) or
+[3.11](https://www.python.org/downloads/release/python-3110/). Run `python
+--version` to check yours.
 
-```bash Linux
-curl -sL https://raw.githubusercontent.com/KillianLucas/open-interpreter/main/installers/oi-linux-installer.sh | bash
-```
+It is recommended to install Open Interpreter in a [virtual
+environment](https://docs.python.org/3/library/venv.html).
 
-</CodeGroup>
+</Info>
 
-These installers will attempt to download Python, set up an environment, and install Open Interpreter for you.
+## Install optional dependencies from `pip`
 
-## Terminal usage
+Open Interpreter has optional dependencies for different capabilities.
 
-After installation, you can start an interactive chat in your terminal by running:
+[Local Mode](/guides/running-locally) dependencies:
 
 ```bash
-interpreter
+pip install open-interpreter[local]
 ```
 
-## Installation from `pip`
+[OS Mode](/guides/os-mode) dependencies:
+
+```bash
+pip install open-interpreter[os]
+```
 
-If you already use Python, we recommend installing Open Interpreter via `pip`:
+[Safe Mode](/safety/safe-mode) dependencies:
 
 ```bash
-pip install open-interpreter
+pip install open-interpreter[safe]
 ```
 
-<Info>
-**Note:** You'll need Python
-[3.10](https://www.python.org/downloads/release/python-3100/) or
-[3.11](https://www.python.org/downloads/release/python-3110/). Run `python
---version` to check yours.
-</Info>
+Server dependencies:
+
+```bash
+pip install open-interpreter[server]
+```
 
-## Python usage
+## Experimental one-line installers
 
-To start an interactive chat in Python, run the following:
+To try our experimental installers, open your Terminal with admin privileges [(click here to learn how)](https://chat.openai.com/share/66672c0f-0935-4c16-ac96-75c1afe14fe3), then paste the following commands:
 
-```python
-from interpreter import interpreter
+<CodeGroup>
 
-interpreter.chat()
+```bash Mac
+curl -sL https://raw.githubusercontent.com/KillianLucas/open-interpreter/main/installers/oi-mac-installer.sh | bash
 ```
 
-You can also pass messages to `interpreter` programmatically:
+```powershell Windows
+iex "& {$(irm https://raw.githubusercontent.com/KillianLucas/open-interpreter/main/installers/oi-windows-installer.ps1)}"
+```
 
-```python
-interpreter.chat("Get the last 5 BBC news headlines.")
+```bash Linux
+curl -sL https://raw.githubusercontent.com/KillianLucas/open-interpreter/main/installers/oi-linux-installer.sh | bash
 ```
 
-[Click here](/usage/python/streaming-response) to learn how to stream its response into your application.
+</CodeGroup>
+
+These installers will attempt to download Python, set up an environment, and install Open Interpreter for you.
 
 ## No Installation
 
-If configuring your computer environment is challenging, you can press the `,` key on this repository's GitHub page to create a codespace. After a moment, you'll receive a cloud virtual machine environment pre-installed with open-interpreter. You can then start interacting with it directly and freely confirm its execution of system commands without worrying about damaging the system.
+If configuring your computer environment is challenging, you can press the `,` key on the [GitHub page](https://github.com/OpenInterpreter/open-interpreter) to create a codespace. After a moment, you'll receive a cloud virtual machine environment pre-installed with open-interpreter. You can then start interacting with it directly and freely confirm its execution of system commands without worrying about damaging the system.
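The Python version requirement noted above can also be checked programmatically; the following is a minimal sketch (not part of Open Interpreter itself):

```python
# Sketch: check that the running interpreter matches the 3.10-3.11
# range required above. Not part of Open Interpreter itself.
import sys

def version_supported(major, minor):
    """Return True if (major, minor) is Python 3.10 or 3.11."""
    return (3, 10) <= (major, minor) <= (3, 11)

print(f"Python {sys.version_info.major}.{sys.version_info.minor} "
      f"supported: {version_supported(*sys.version_info[:2])}")
```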

docs/guides/profiles.mdx

Lines changed: 45 additions & 0 deletions
@@ -0,0 +1,45 @@
+---
+title: Profiles
+---
+
+Profiles are a powerful way to customize your instance of Open Interpreter.
+
+Profiles are Python files that configure Open Interpreter. A wide range of fields, from the [model](/settings/all-settings#model-selection) to the [context window](/settings/all-settings#context-window) to the [message templates](/settings/all-settings#user-message-template), can be configured in a Profile. This allows you to save multiple variations of Open Interpreter to optimize for your specific use-cases.
+
+You can access your Profiles by running `interpreter --profiles`. This will open the directory where all of your Profiles are stored.
+
+To apply a Profile to an Open Interpreter session, run `interpreter --profile <name>`.
+
+# Example Profile
+
+```python
+from interpreter import interpreter
+
+interpreter.os = True
+interpreter.llm.supports_vision = True
+
+interpreter.llm.model = "gpt-4-vision-preview"
+
+interpreter.llm.supports_functions = False
+interpreter.llm.context_window = 110000
+interpreter.llm.max_tokens = 4096
+interpreter.auto_run = True
+interpreter.force_task_completion = True
+```
+
+<Tip>
+There are many settings that can be configured. [See them all
+here](/settings/all-settings)
+</Tip>
+
+## Helpful settings for local models
+
+Local models benefit from more coercion and guidance, but the extra context this adds to messages can impact the conversational experience of Open Interpreter. The following settings allow templates to be applied to messages, improving the steering of the language model while maintaining the natural flow of conversation.
+
+`interpreter.user_message_template` wraps the user's message in a template. This can help steer a language model toward a desired behavior without the user needing to add extra context to their message.
+
+`interpreter.always_apply_user_message_template` wraps all user messages in the template. If False, only the last user message will be wrapped.
+
+`interpreter.code_output_template` wraps the output from the computer after code is run. This can help nudge the language model to continue working or to explain outputs.
+
+`interpreter.empty_code_output_template` is the message that is sent to the language model if code execution results in no output.
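The template behavior described above can be sketched in plain Python. This is an illustration of the wrapping semantics, not Open Interpreter's actual implementation, and the template string is hypothetical:

```python
# Plain-Python sketch of user-message-template wrapping; not Open
# Interpreter's actual implementation. The template string is made up.
user_message_template = "{content}\nRespond by writing and running code."

def apply_template(messages, always_apply=False):
    """Wrap user messages in the template. If always_apply is False,
    only the last user message is wrapped."""
    out = [dict(m) for m in messages]
    user_idx = [i for i, m in enumerate(out) if m["role"] == "user"]
    for i in (user_idx if always_apply else user_idx[-1:]):
        out[i]["content"] = user_message_template.format(content=out[i]["content"])
    return out

msgs = [{"role": "user", "content": "List my files."},
        {"role": "assistant", "content": "Done."},
        {"role": "user", "content": "Now count them."}]
print(apply_template(msgs)[-1]["content"])
```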

docs/guides/running-locally.mdx

Lines changed: 42 additions & 28 deletions
@@ -2,40 +2,54 @@
 title: Running Locally
 ---
 
-In this video, Mike Bird goes over three different methods for running Open Interpreter with a local language model:
+Open Interpreter can be run fully locally.
 
-<iframe width="560" height="315" src="https://www.youtube.com/embed/CEs51hGWuGU?si=cN7f6QhfT4edfG5H" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>
+Users need to install software to run local LLMs. Open Interpreter supports multiple local model providers such as [Ollama](https://www.ollama.com/), [Llamafile](https://github.com/Mozilla-Ocho/llamafile), [Jan](https://jan.ai/), and [LM Studio](https://lmstudio.ai/).
 
-## How to Use Open Interpreter Locally
+<Tip>
+Local models perform better with extra guidance and direction. You can improve
+performance for your use-case by creating a new [Profile](/guides/profiles).
+</Tip>
 
-### Ollama
+## Terminal Usage
 
-1. Download Ollama from https://ollama.ai/download
-2. Run the command:
-`ollama run dolphin-mixtral:8x7b-v2.6`
-3. Execute the Open Interpreter:
-`interpreter --model ollama/dolphin-mixtral:8x7b-v2.6`
+### Local Explorer
 
-### Jan.ai
+A Local Explorer was created to simplify the process of using OI locally. To access this menu, run the command `interpreter --local`.
 
-1. Download Jan from http://jan.ai
-2. Download the model from the Hub
-3. Enable API server:
-   1. Go to Settings
-   2. Navigate to Advanced
-   3. Enable API server
-4. Select the model to use
-5. Run the Open Interpreter with the specified API base:
-`interpreter --api_base http://localhost:1337/v1 --model mixtral-8x7b-instruct`
+Select your chosen local model provider from the list of options.
 
-### Llamafile
+Most providers will require the user to state the model they are using. Provider-specific instructions are shown to the user in the menu.
 
-⚠ Ensure that Xcode is installed for Apple Silicon
+### Custom Local
 
-1. Download or create a llamafile from https://github.com/Mozilla-Ocho/llamafile
-2. Make the llamafile executable:
-`chmod +x mixtral-8x7b-instruct-v0.1.Q5_K_M.llamafile`
-3. Execute the llamafile:
-`./mixtral-8x7b-instruct-v0.1.Q5_K_M.llamafile`
-4. Run the interpreter with the specified API base:
-`interpreter --api_base https://localhost:8080/v1`
+If you want to use a provider other than the ones listed, set the `--api_base` flag to a [custom endpoint](/language-models/local-models/custom-endpoint).
+
+You will also need to set the model by passing in the `--model` flag to select a [model](/settings/all-settings#model-selection).
+
+```bash
+interpreter --api_base "http://localhost:11434" --model ollama/codestral
+```
+
+<Info>
+Other terminal flags are explained in [Settings](/settings/all-settings).
+</Info>
+
+## Python Usage
+
+In order to have a Python script use Open Interpreter locally, some fields need to be set:
+
+```python
+from interpreter import interpreter
+
+interpreter.offline = True
+interpreter.llm.model = "ollama/codestral"
+interpreter.llm.api_base = "http://localhost:11434"
+
+interpreter.chat("how many files are on my desktop?")
+```
+
+<Info>
+Other configuration settings are explained in
+[Settings](/settings/all-settings).
+</Info>
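A local session fails if the model server isn't running, so a quick reachability check can save a confusing error. A hedged sketch follows; the host and port assume Ollama's default endpoint as used in the examples above:

```python
# Sketch: confirm a local model server (e.g. Ollama on its default
# port 11434, as assumed above) is listening before starting a session.
import socket

def endpoint_reachable(host="localhost", port=11434, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(endpoint_reachable())  # False unless a server is listening locally
```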

docs/language-models/hosted-models/openai.mdx

Lines changed: 3 additions & 14 deletions
@@ -18,12 +18,7 @@ interpreter.chat()
 
 </CodeGroup>
 
-This will default to `gpt-4`, which is the most capable publicly available model for code interpretation (Open Interpreter was designed to be used with `gpt-4`).
-
-<Info>
-Trouble accessing `gpt-4`? Read our [gpt-4 setup
-article](/language-model-setup/hosted-models/gpt-4-setup).
-</Info>
+This will default to `gpt-4-turbo`, which is the most capable publicly available model for code interpretation (Open Interpreter was designed to be used with `gpt-4`).
 
 To run a specific model from OpenAI, set the `model` flag:
 
@@ -49,17 +44,11 @@ We support any model on [OpenAI's models page:](https://platform.openai.com/docs
 <CodeGroup>
 
 ```bash Terminal
-interpreter --model gpt-4
-interpreter --model gpt-4-32k
-interpreter --model gpt-3.5-turbo
-interpreter --model gpt-3.5-turbo-16k
+interpreter --model gpt-4o
 ```
 
 ```python Python
-interpreter.llm.model = "gpt-4"
-interpreter.llm.model = "gpt-4-32k"
-interpreter.llm.model = "gpt-3.5-turbo"
-interpreter.llm.model = "gpt-3.5-turbo-16k"
+interpreter.llm.model = "gpt-4o"
 ```
 
 </CodeGroup>

docs/language-models/local-models/ollama.mdx

Lines changed: 5 additions & 0 deletions
@@ -37,3 +37,8 @@ interpreter.chat()
 </CodeGroup>
 
 For any future runs with Ollama, ensure that the Ollama server is running. If using the desktop application, you can check to see if the Ollama menu bar item is active.
+
+<Warning>
+If Ollama is producing strange output, make sure to update to the latest
+version.
+</Warning>

docs/language-models/settings.mdx

Lines changed: 1 addition & 1 deletion
@@ -4,4 +4,4 @@ title: Settings
 
 The `interpreter.llm` is responsible for running the language model.
 
-[Click here to view `interpreter.llm` settings.](https://docs.openinterpreter.com/settings/all-settings#language-model)
+[Click here to view `interpreter.llm` settings.](/settings/all-settings#language-model)

docs/language-models/usage.mdx

Lines changed: 0 additions & 5 deletions
This file was deleted.

docs/mint.json

Lines changed: 11 additions & 11 deletions
@@ -20,13 +20,13 @@
   },
   "topbarLinks": [
     {
-      "name": "39K ★ GitHub",
-      "url": "https://github.com/KillianLucas/open-interpreter"
+      "name": "50K ★ GitHub",
+      "url": "https://github.com/OpenInterpreter/open-interpreter"
     }
   ],
   "topbarCtaButton": {
     "name": "Join Discord",
-    "url": "https://discord.com/invite/6p3fD6rBVm"
+    "url": "https://discord.gg/Hvz9Axh84z"
   },
   "navigation": [
     {
@@ -38,6 +38,7 @@
       "pages": [
         "guides/basic-usage",
         "guides/running-locally",
+        "guides/profiles",
         "guides/streaming-response",
         "guides/advanced-terminal-usage",
         "guides/multiple-instances",
@@ -46,9 +47,7 @@
     },
     {
       "group": "Settings",
-      "pages": [
-        "settings/all-settings"
-      ]
+      "pages": ["settings/all-settings"]
     },
     {
       "group": "Language Models",
@@ -83,17 +82,16 @@
     {
       "group": "Local Providers",
       "pages": [
-        "language-models/local-models/lm-studio",
+        "language-models/local-models/ollama",
         "language-models/local-models/llamafile",
         "language-models/local-models/janai",
-        "language-models/local-models/ollama",
+        "language-models/local-models/lm-studio",
         "language-models/local-models/custom-endpoint",
         "language-models/local-models/best-practices"
       ]
     },
     "language-models/custom-models",
-    "language-models/settings",
-    "language-models/usage"
+    "language-models/settings"
   ]
 },
 {
@@ -131,6 +129,8 @@
     "suggestEdit": true
   },
   "footerSocials": {
-    "twitter": "https://twitter.com/hellokillian"
+    "twitter": "https://x.com/OpenInterpreter",
+    "youtube": "https://www.youtube.com/@OpenInterpreter",
+    "linkedin": "https://www.linkedin.com/company/openinterpreter"
   }
 }
