
Commit 30df78f

Merge pull request #1258 from rbrisita/fix_doc

Fix Documentation

2 parents aa21637 + 271d60f

1 file changed (+7 -8 lines changed)

docs/language-models/custom-models.mdx

Lines changed: 7 additions & 8 deletions

@@ -9,21 +9,20 @@ As long as your system can accept an input and stream an output (and can be inte
 Simply replace the OpenAI-compatible `completions` function in your language model with one of your own:
 
 ```python
-def custom_language_model(openai_message):
+def custom_language_model(messages, model, stream, max_tokens):
     """
     OpenAI-compatible completions function (this one just echoes what the user said back).
+    To make it OpenAI-compatible and parsable, `choices` has to be the root property.
+    The property `delta` is used to signify streaming.
     """
-    users_content = openai_message[-1].get("content") # Get last message's content
-
-    # To make it OpenAI-compatible, we yield this first:
-    yield {"delta": {"role": "assistant"}}
+    users_content = messages[-1].get("content") # Get last message's content
 
     for character in users_content:
-        yield {"delta": {"content": character}}
+        yield {"choices": [{"delta": {"content": character}}]}
 
 # Tell Open Interpreter to power the language model with this function
 
-interpreter.llm.completion = custom_language_model
+interpreter.llm.completions = custom_language_model
 ```
 
 Then, set the following settings:
@@ -39,4 +38,4 @@ And start using it:
 
 ```
 interpreter.chat("Hi!") # Returns/displays "Hi!" character by character
-```
+```
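The corrected function from this diff can be exercised on its own, without Open Interpreter installed: the final `interpreter.llm.completions = ...` assignment needs the open-interpreter package, but the generator itself is plain Python. A minimal sketch, assuming default values for `model`, `stream`, and `max_tokens` (the diff does not specify them) so the function can be called standalone:

```python
def custom_language_model(messages, model=None, stream=True, max_tokens=None):
    """
    OpenAI-compatible completions function (echoes the user's last message back).
    To be parsable, `choices` must be the root property of each yielded chunk;
    `delta` signals streaming.
    """
    users_content = messages[-1].get("content")  # Get last message's content

    for character in users_content:
        yield {"choices": [{"delta": {"content": character}}]}

# Consume the stream the way an OpenAI-compatible client would:
chunks = custom_language_model([{"role": "user", "content": "Hi!"}])
echoed = "".join(chunk["choices"][0]["delta"]["content"] for chunk in chunks)
print(echoed)  # Hi!
```

Reassembling the `delta` fragments reproduces the user's message character by character, which is exactly what `interpreter.chat("Hi!")` displays in the example at the end of the diff.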
