
Commit 7239b54

update hosted model openai page
1 parent fb48591 commit 7239b54

3 files changed (+16, -27 lines)

docs/getting-started/introduction.mdx

Lines changed: 1 addition & 1 deletion
@@ -20,7 +20,7 @@ This provides a natural-language interface to your computer's general-purpose ca
 
 <br/>
 
-<Info>You can also build Open Interpreter into your applications with [our new Python package.](/usage/python/arguments)</Info>
+<Info>You can also build Open Interpreter into your applications with [our Python package.](/usage/python/arguments)</Info>
 
 ---
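
The `<Info>` callout above points readers at embedding Open Interpreter via its Python package (/usage/python/arguments). As a rough sketch of what such an embedding might look like (the import style, the `auto_run` attribute, and the prompt are assumptions based on those docs, not part of this diff):

```python
# Hedged sketch: embedding Open Interpreter in an application through its
# Python package. Names here are assumptions, not defined by this commit.
from interpreter import interpreter

# Keep the safe default of asking before any generated code is executed.
interpreter.auto_run = False

# Run a single request and keep the returned message list for the host app.
messages = interpreter.chat("Summarize the files in the current directory.")
print(messages)
```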

docs/getting-started/setup.mdx

Lines changed: 12 additions & 12 deletions
@@ -24,14 +24,6 @@ curl -sL https://raw.githubusercontent.com/KillianLucas/open-interpreter/main/in
 
 These installers will attempt to download Python, set up an environment, and install Open Interpreter for you.
 
-## Terminal usage
-
-After installation, you can start an interactive chat in your terminal by running:
-
-```bash
-interpreter
-```
-
 ## Installation from `pip`
 
 If you already use Python, we recommend installing Open Interpreter via `pip`:
@@ -69,6 +61,18 @@ pip install open-interpreter[os]
 pip install open-interpreter[safe]
 ```
 
+## No Installation
+
+If configuring your computer environment is challenging, you can press the `,` key on the [GitHub page](https://github.com/OpenInterpreter/open-interpreter) to create a codespace. After a moment, you'll receive a cloud virtual machine environment pre-installed with open-interpreter. You can then start interacting with it directly and freely confirm its execution of system commands without worrying about damaging the system.
+
+## Terminal usage
+
+After installation, you can start an interactive chat in your terminal by running:
+
+```bash
+interpreter
+```
+
 ## Python usage
 
 To start an interactive chat in Python, run the following:
@@ -86,7 +90,3 @@ interpreter.chat("Get the last 5 BBC news headlines.")
 ```
 
 [Click here](/usage/python/streaming-response) to learn how to stream its response into your application.
-
-## No Installation
-
-If configuring your computer environment is challenging, you can press the `,` key on this repository's GitHub page to create a codespace. After a moment, you'll receive a cloud virtual machine environment pre-installed with open-interpreter. You can then start interacting with it directly and freely confirm its execution of system commands without worrying about damaging the system.
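
The setup.mdx hunks above relocate the `## Terminal usage` and `## No Installation` sections and keep the `## Python usage` example along with a link to the streaming docs. A hedged sketch of that Python usage, including the streaming variant (the `stream` and `display` keyword arguments are assumptions drawn from the linked /usage/python/streaming-response page, not defined in this diff):

```python
# Hedged sketch of the Python usage referenced in setup.mdx above.
# The stream/display keyword arguments are assumptions, not part of this diff.
from interpreter import interpreter

# Blocking call, as shown in the "Python usage" section.
interpreter.chat("Get the last 5 BBC news headlines.")

# Streaming variant: iterate over chunks as they arrive instead of waiting
# for the full reply.
for chunk in interpreter.chat(
    "Summarize those headlines in one sentence.",
    stream=True,
    display=False,
):
    print(chunk)
```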

docs/language-models/hosted-models/openai.mdx

Lines changed: 3 additions & 14 deletions
@@ -18,12 +18,7 @@ interpreter.chat()
 
 </CodeGroup>
 
-This will default to `gpt-4`, which is the most capable publicly available model for code interpretation (Open Interpreter was designed to be used with `gpt-4`).
-
-<Info>
-Trouble accessing `gpt-4`? Read our [gpt-4 setup
-article](/language-model-setup/hosted-models/gpt-4-setup).
-</Info>
+This will default to `gpt-4-turbo`, which is the most capable publicly available model for code interpretation (Open Interpreter was designed to be used with `gpt-4`).
 
 To run a specific model from OpenAI, set the `model` flag:
@@ -49,17 +44,11 @@ We support any model on [OpenAI's models page:](https://platform.openai.com/docs
 <CodeGroup>
 
 ```bash Terminal
-interpreter --model gpt-4
-interpreter --model gpt-4-32k
-interpreter --model gpt-3.5-turbo
-interpreter --model gpt-3.5-turbo-16k
+interpreter --model gpt-4o
 ```
 
 ```python Python
-interpreter.llm.model = "gpt-4"
-interpreter.llm.model = "gpt-4-32k"
-interpreter.llm.model = "gpt-3.5-turbo"
-interpreter.llm.model = "gpt-3.5-turbo-16k"
+interpreter.llm.model = "gpt-4o"
 ```
 
 </CodeGroup>
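
For context on the model lines this hunk rewrites, a minimal sketch of selecting an OpenAI model from Python (the `llm.api_key` attribute and the `OPENAI_API_KEY` environment variable are assumptions about the surrounding setup, not something this commit adds):

```python
# Hedged sketch: point Open Interpreter at a specific OpenAI model from Python.
# The llm.api_key attribute and OPENAI_API_KEY variable are assumptions.
import os

from interpreter import interpreter

# Supply the OpenAI key from the environment rather than hard-coding it.
interpreter.llm.api_key = os.environ.get("OPENAI_API_KEY")

# Select the model exactly as the updated docs show.
interpreter.llm.model = "gpt-4o"

interpreter.chat("List the three largest files in this directory.")
```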
