
Commit d3eb01b: Merge branch 'main' into nof-unction-calling
2 parents: 538714f + a3d1457

File tree: 18 files changed (+814, -613 lines)


README.md

Lines changed: 3 additions & 4 deletions
````diff
@@ -9,7 +9,6 @@
 <img src="https://img.shields.io/static/v1?label=license&message=AGPL&color=white&style=flat" alt="License"/>
 <br>
 <br>
-<strong>We launched a new computer (the 01) with Open Interpreter at the center. <a href="https://github.com/OpenInterpreter/01">Star the repo →</a></strong><br>
 <br><a href="https://0ggfznkwh4j.typeform.com/to/G21i9lJ2">Get early access to the desktop app</a>‎ ‎ |‎ ‎ <a href="https://docs.openinterpreter.com/">Documentation</a><br>
 </p>

@@ -18,9 +17,9 @@
 ![poster](https://github.com/KillianLucas/open-interpreter/assets/63927363/08f0d493-956b-4d49-982e-67d4b20c4b56)

 <br>
-<!--<p align="center">
-<strong>The New Computer Update</strong> introduces <strong><code>--os</code></strong> and a new <strong>Computer API</strong>. <a href="https://changes.openinterpreter.com/log/the-new-computer-update">Read On →</a>
-</p>-->
+<p align="center">
+<strong>The New Computer Update</strong> introduced <strong><code>--os</code></strong> and a new <strong>Computer API</strong>. <a href="https://changes.openinterpreter.com/log/the-new-computer-update">Read On →</a>
+</p>
 <br>

 ```shell
````

docs/NCU_MIGRATION_GUIDE.md

Lines changed: 1 addition & 1 deletion
````diff
@@ -38,7 +38,7 @@ All stateless LLM attributes have been moved to `interpreter.llm`:
 - `interpreter.api_version` → `interpreter.llm.api_version`
 - `interpreter.api_base` → `interpreter.llm.api_base`

-This is reflected **1)** in Python applications using Open Interpreter and **2)** in your configuration file for OI's terminal interface, which can be edited via `interpreter --config`.
+This is reflected **1)** in Python applications using Open Interpreter and **2)** in your profile for OI's terminal interface, which can be edited via `interpreter --profiles`.

 ## New Static Messages Structure
````
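The attribute move above is mechanical: stateless LLM settings that used to live on `interpreter` now live on `interpreter.llm`. A minimal sketch of the renaming (the helper and key set are illustrative, not part of Open Interpreter):

```python
# Hypothetical migration helper: rewrite pre-NCU top-level setting names
# to their new `llm.`-prefixed form, leaving everything else untouched.
MOVED_TO_LLM = {"model", "temperature", "api_key", "api_base", "api_version"}

def migrate_settings(old: dict) -> dict:
    """Return a copy of `old` with moved keys renamed to `llm.<key>`."""
    new = {}
    for key, value in old.items():
        if key in MOVED_TO_LLM:
            new[f"llm.{key}"] = value
        else:
            new[key] = value
    return new

print(migrate_settings({"api_base": "http://localhost:1337/v1", "auto_run": True}))
# → {'llm.api_base': 'http://localhost:1337/v1', 'auto_run': True}
```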

docs/guides/basic-usage.mdx

Lines changed: 7 additions & 4 deletions
````diff
@@ -2,7 +2,6 @@
 title: Basic Usage
 ---

-
 <CardGroup>

 <Card
@@ -102,15 +101,19 @@ interpreter.messages = messages

 ### Configure Default Settings

-We save default settings to a configuration file which can be edited by running the following command:
+We save default settings to the `default.yaml` profile which can be opened and edited by running the following command:

 ```shell
-interpreter --config
+interpreter --profiles
 ```

 You can use this to set your default language model, system message (custom instructions), max budget, etc.

-<Info>**Note:** The Python library will also inherit settings from this config file, but you can only change it by running `interpreter --config` or navigating to `<your application directory>/Open Interpreter/config.yaml` and editing it manually.</Info>
+<Info>
+  **Note:** The Python library will also inherit settings from the default
+  profile file. You can change it by running `interpreter --profiles` and
+  editing `default.yaml`.
+</Info>

 ---
````
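The `default.yaml` profile above is plain YAML. As a sketch of how such defaults might be consumed, here is a tiny parser for a flat `key: value` subset (illustrative only; a real profile should be read with a full YAML parser):

```python
# Minimal sketch: parse a flat `key: value` subset of a profile like
# default.yaml, coercing the simple scalar types these settings use.
def parse_profile(text: str) -> dict:
    settings = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()   # drop comments and whitespace
        if not line or ":" not in line:
            continue
        key, _, value = line.partition(":")
        value = value.strip()
        if value.lower() in ("true", "false"):
            value = value.lower() == "true"
        else:
            try:
                value = float(value) if "." in value else int(value)
            except ValueError:
                value = value.strip('"')       # plain or quoted string
        settings[key.strip()] = value
    return settings

print(parse_profile('llm_model: "gpt-4"\nauto_run: false\nllm_max_budget: 0.01\n'))
# → {'llm_model': 'gpt-4', 'auto_run': False, 'llm_max_budget': 0.01}
```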

docs/guides/running-locally.mdx

Lines changed: 24 additions & 18 deletions
````diff
@@ -6,30 +6,36 @@ In this video, Mike Bird goes over three different methods for running Open Inte

 <iframe width="560" height="315" src="https://www.youtube.com/embed/CEs51hGWuGU?si=cN7f6QhfT4edfG5H" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>

-## How to use Open Interpreter locally
+## How to Use Open Interpreter Locally

 ### Ollama

-1. Download Ollama - https://ollama.ai/download
-2. `ollama run dolphin-mixtral:8x7b-v2.6`
-3. `interpreter --model ollama/dolphin-mixtral:8x7b-v2.6`
+1. Download Ollama from https://ollama.ai/download
+2. Run the command:
+   `ollama run dolphin-mixtral:8x7b-v2.6`
+3. Execute the Open Interpreter:
+   `interpreter --model ollama/dolphin-mixtral:8x7b-v2.6`

-# Jan.ai
+### Jan.ai

-1. Download Jan - [Jan.ai](http://jan.ai/)
-2. Download model from Hub
-3. Enable API server
-   1. Settings
-   2. Advanced
+1. Download Jan from http://jan.ai
+2. Download the model from the Hub
+3. Enable API server:
+   1. Go to Settings
+   2. Navigate to Advanced
    3. Enable API server
-4. Select Model to use
-5. `interpreter --api_base http://localhost:1337/v1 --model mixtral-8x7b-instruct`
+4. Select the model to use
+5. Run the Open Interpreter with the specified API base:
+   `interpreter --api_base http://localhost:1337/v1 --model mixtral-8x7b-instruct`

-# llamafile
+### Llamafile

-1. Download or make a llamafile - https://github.com/Mozilla-Ocho/llamafile
-2. `chmod +x mixtral-8x7b-instruct-v0.1.Q5_K_M.llamafile`
-3. `./mixtral-8x7b-instruct-v0.1.Q5_K_M.llamafile`
-4. `interpreter --api_base https://localhost:8080/v1`
+⚠ Ensure that Xcode is installed for Apple Silicon

-Make sure that Xcode is installed for Apple Silicon
+1. Download or create a llamafile from https://github.com/Mozilla-Ocho/llamafile
+2. Make the llamafile executable:
+   `chmod +x mixtral-8x7b-instruct-v0.1.Q5_K_M.llamafile`
+3. Execute the llamafile:
+   `./mixtral-8x7b-instruct-v0.1.Q5_K_M.llamafile`
+4. Run the interpreter with the specified API base:
+   `interpreter --api_base https://localhost:8080/v1`
````
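The three backends above all end in a single `interpreter` invocation. A small sketch that collects those commands in one place (the helper itself is illustrative; the command strings are taken from the steps above):

```python
# Illustrative helper (not part of Open Interpreter): build the
# `interpreter` invocation for each local backend described above.
def local_command(backend: str) -> str:
    if backend == "ollama":
        # Ollama is addressed through the `ollama/` model prefix.
        return "interpreter --model ollama/dolphin-mixtral:8x7b-v2.6"
    if backend == "jan":
        # Jan exposes an OpenAI-compatible server on port 1337.
        return ("interpreter --api_base http://localhost:1337/v1 "
                "--model mixtral-8x7b-instruct")
    if backend == "llamafile":
        # A llamafile serves an OpenAI-compatible API on port 8080.
        return "interpreter --api_base https://localhost:8080/v1"
    raise ValueError(f"unknown backend: {backend}")

print(local_command("ollama"))
# → interpreter --model ollama/dolphin-mixtral:8x7b-v2.6
```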

docs/telemetry/telemetry.mdx

Lines changed: 6 additions & 2 deletions
````diff
@@ -27,7 +27,7 @@ Use the `--disable_telemetry` flag:
 interpreter --disable_telemetry
 ```

-### Configuration File
+### Profile

 Set `disable_telemetry` to `true`. This will persist to future terminal sessions:

@@ -61,4 +61,8 @@ To view the list of events we track, you may reference the **[code](https://gith

 We use **[Posthog](https://posthog.com/)** to store and visualize telemetry data.

-<Info>Posthog is an open source platform for product analytics. Learn more about Posthog on **[posthog.com](https://posthog.com/)** or **[github.com/posthog](https://github.com/posthog/posthog)**</Info>
+<Info>
+  Posthog is an open source platform for product analytics. Learn more about
+  Posthog on **[posthog.com](https://posthog.com/)** or
+  **[github.com/posthog](https://github.com/posthog/posthog)**
+</Info>
````
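The docs above describe two opt-out paths: the `--disable_telemetry` flag and the `disable_telemetry` profile key. A minimal sketch of that precedence (the function name is illustrative, not Open Interpreter's actual implementation):

```python
# Sketch of the opt-out logic described above: either the CLI flag or the
# profile key `disable_telemetry` turns telemetry off.
def telemetry_enabled(profile: dict, cli_args: list) -> bool:
    if "--disable_telemetry" in cli_args:
        return False
    return not profile.get("disable_telemetry", False)

print(telemetry_enabled({"disable_telemetry": True}, []))
# → False
```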

docs/usage/examples.mdx

Lines changed: 8 additions & 4 deletions
````diff
@@ -102,15 +102,19 @@ interpreter.messages = messages

 ### Configure Default Settings

-We save default settings to a configuration file which can be edited by running the following command:
+We save default settings to a profile which can be edited by running the following command:

 ```shell
-interpreter --config
+interpreter --profiles
 ```

 You can use this to set your default language model, system message (custom instructions), max budget, etc.

-<Info>**Note:** The Python library will also inherit settings from this config file, but you can only change it by running `interpreter --config` or navigating to `<your application directory>/Open Interpreter/config.yaml` and editing it manually.</Info>
+<Info>
+  **Note:** The Python library will also inherit settings from the default
+  profile file. You can change it by running `interpreter --profiles` and
+  editing `default.yaml`.
+</Info>

 ---

@@ -147,4 +151,4 @@ In Python, set the model on the object:
 interpreter.llm.model = "gpt-3.5-turbo"
 ```

-[Find the appropriate "model" string for your language model here.](https://docs.litellm.ai/docs/providers/)
+[Find the appropriate "model" string for your language model here.](https://docs.litellm.ai/docs/providers/)
````
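The model strings linked above follow a `provider/model` shape (e.g. `ollama/dolphin-mixtral:8x7b-v2.6`). A small sketch of splitting such a string; treating a bare name as belonging to an `"openai"` default provider is an assumption for illustration, not documented behavior:

```python
# Sketch: split a LiteLLM-style "provider/model" string. The bare-name
# fallback to "openai" is an illustrative assumption.
def split_model_string(model: str) -> tuple:
    provider, sep, name = model.partition("/")
    if not sep:
        return ("openai", model)   # assumed default provider
    return (provider, name)

print(split_model_string("ollama/dolphin-mixtral:8x7b-v2.6"))
# → ('ollama', 'dolphin-mixtral:8x7b-v2.6')
```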

docs/usage/python/settings.mdx

Lines changed: 3 additions & 3 deletions
````diff
@@ -2,10 +2,10 @@
 title: Settings
 ---

-Default settings will be inherited from a configuration file in your application directory. **This is true for python and for the terminal interface.**
+Default settings will be inherited from a profile in your application directory. **This is true for python and for the terminal interface.**

 To open the file, run:

 ```bash
-interpreter --config
-```
+interpreter --profiles
+```
````

docs/usage/terminal/arguments.mdx

Lines changed: 10 additions & 25 deletions
````diff
@@ -12,7 +12,7 @@ title: Arguments

 **[Configuration](/docs/usage/terminal/arguments#Configuration)**

-`--config`, `--config_file`, `--custom_instructions`, `--system_message`.
+`--profiles`, `--profile`, `--custom_instructions`, `--system_message`.

 **[Options](/docs/usage/terminal/arguments#options)**

@@ -230,14 +230,12 @@ llm_supports_functions: true
 ```

 </CodeGroup>
-#### `--no-llm_supports_functions`
+#### `--no-llm_supports_functions`

 Inform Open Interpreter that the language model you're using does not support function calling.

 <CodeGroup>
-```bash Terminal
-interpreter --no-llm_supports_functions
-```
+```bash Terminal interpreter --no-llm_supports_functions ```
 </CodeGroup>

 #### `--llm_supports_vision` or `-lsv`
@@ -259,24 +257,24 @@

 ## Configuration

-#### `--config`
+#### `--profiles`

-Opens the configuration yaml file in your default editor.
+Opens the directory containing all profiles. They can be edited in your default editor.

 <CodeGroup>
 ```bash Terminal
-interpreter --config
+interpreter --profiles
 ```

 </CodeGroup>

-#### `--config_file` or `-cf`
+#### `--profile` or `-p`

-Optionally set a custom config file to use.
+Optionally set a profile to use.

 <CodeGroup>
 ```bash Terminal
-interpreter --config_file "/config.yaml"
+interpreter --profile "default.yaml"
 ```

 </CodeGroup>
@@ -298,7 +296,7 @@ custom_instructions: "This is a custom instruction."

 #### `--system_message` or `-s`

-We don't recommend modifying the system message, as doing so opts you out of future updates to the system message. Use `--custom_instructions` instead, to add relevant information to the system message. If you must modify the system message, you can do so by using this argument, or by opening the config file using `--config`.
+We don't recommend modifying the system message, as doing so opts you out of future updates to the system message. Use `--custom_instructions` instead, to add relevant information to the system message. If you must modify the system message, you can do so by using this argument, or by opening the profile using `--profiles`.

 <CodeGroup>
 ```bash Terminal
@@ -309,19 +307,6 @@ interpreter --system_message "You are Open Interpreter..."
 system_message: "You are Open Interpreter..."
 ```

-</CodeGroup>
-
-#### `--reset_config`
-
-Resets the config file to the default settings.
-
-<CodeGroup>
-```bash Terminal
-interpreter --reset_config
-```
-
-</CodeGroup>
-
 ## Options

 #### `--safe_mode`
````
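The renamed flags above (`--profiles` to open the profile directory, `--profile`/`-p` to pick one) can be sketched with `argparse`; this is an illustrative wiring, not Open Interpreter's actual CLI code, and the `default.yaml` default is taken from the docs above:

```python
# Illustrative sketch of the profile-related flags described above.
import argparse

parser = argparse.ArgumentParser(prog="interpreter")
parser.add_argument("--profiles", action="store_true",
                    help="open the directory containing all profiles")
parser.add_argument("--profile", "-p", default="default.yaml",
                    help="profile to load settings from")
parser.add_argument("--custom_instructions",
                    help="text appended to the system message")

args = parser.parse_args(["--profile", "local.yaml"])
print(args.profile)
# → local.yaml
```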

docs/usage/terminal/settings.mdx

Lines changed: 18 additions & 18 deletions
````diff
@@ -2,25 +2,25 @@
 title: Settings
 ---

-Default settings can be edited via a configuration file. To open the file, run:
+Default settings can be edited via a profile. To open the file, run:

 ```bash
-interpreter --config
+interpreter --profiles
 ```

-| Key | Value |
-|-------------------------|------------------|
-| `llm_model` | String ["openai/gpt-4", "openai/local", "azure/gpt-3.5"] |
-| `llm_temperature` | Float [0.0 -> 1.0] |
-| `llm_supports_vision` | Boolean [True/False] |
-| `llm_supports_functions`| Boolean [True/False] |
-| `llm_context_window` | Integer [3000] |
-| `llm_max_tokens` | Integer [3000] |
-| `llm_api_base` | String ["http://ip_address:port", "https://openai.com"] |
-| `llm_api_key` | String ["sk-Your-Key"] |
-| `llm_api_version` | String ["version-number"] |
-| `llm_max_budget` | Float [0.01] #USD $0.01 |
-| `offline` | Boolean [True/False] |
-| `vision` | Boolean [True/False] |
-| `auto_run` | Boolean [True/False] |
-| `verbose` | Boolean [True/False] |
+| Key                      | Value                                                    |
+| ------------------------ | -------------------------------------------------------- |
+| `llm_model`              | String ["openai/gpt-4", "openai/local", "azure/gpt-3.5"] |
+| `llm_temperature`        | Float [0.0 -> 1.0]                                       |
+| `llm_supports_vision`    | Boolean [True/False]                                     |
+| `llm_supports_functions` | Boolean [True/False]                                     |
+| `llm_context_window`     | Integer [3000]                                           |
+| `llm_max_tokens`         | Integer [3000]                                           |
+| `llm_api_base`           | String ["http://ip_address:port", "https://openai.com"]  |
+| `llm_api_key`            | String ["sk-Your-Key"]                                   |
+| `llm_api_version`        | String ["version-number"]                                |
+| `llm_max_budget`         | Float [0.01] #USD $0.01                                  |
+| `offline`                | Boolean [True/False]                                     |
+| `vision`                 | Boolean [True/False]                                     |
+| `auto_run`               | Boolean [True/False]                                     |
+| `verbose`                | Boolean [True/False]                                     |
````
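The key/type table above lends itself to a simple validation pass; a sketch (the validator is illustrative, the schema follows the table):

```python
# Sketch: validate a settings dict against the key/type table above.
SCHEMA = {
    "llm_model": str, "llm_temperature": float,
    "llm_supports_vision": bool, "llm_supports_functions": bool,
    "llm_context_window": int, "llm_max_tokens": int,
    "llm_api_base": str, "llm_api_key": str, "llm_api_version": str,
    "llm_max_budget": float,
    "offline": bool, "vision": bool, "auto_run": bool, "verbose": bool,
}

def validate(settings: dict) -> list:
    """Return a list of human-readable problems (empty means valid)."""
    problems = []
    for key, value in settings.items():
        if key not in SCHEMA:
            problems.append(f"unknown key: {key}")
        elif not isinstance(value, SCHEMA[key]):
            problems.append(f"{key}: expected {SCHEMA[key].__name__}")
    return problems

print(validate({"llm_model": "openai/gpt-4", "auto_run": "yes"}))
# → ['auto_run: expected bool']
```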
