
Commit d667457

Update chat completions examples with OpenAI SDK

- Remove OpenAI SDK compatibility section from API reference
- Add OpenAI SDK examples (JavaScript and Python) to chat completions section
- Keep cURL example alongside SDK examples
1 parent 5e7e71c commit d667457

File tree

1 file changed (+33 -49 lines)

reference/api/chats.mdx

Lines changed: 33 additions & 49 deletions
@@ -126,37 +126,44 @@ curl \
 }'
 ```
 
-```javascript JavaScript
-const response = await fetch('http://localhost:7700/chats/customer-support/chat/completions', {
-  method: 'POST',
-  headers: {
-    'Authorization': 'Bearer DEFAULT_CHAT_KEY',
-    'Content-Type': 'application/json'
-  },
-  body: JSON.stringify({
-    model: 'gpt-3.5-turbo',
-    messages: [
-      {
-        role: 'user',
-        content: 'What is Meilisearch?'
-      }
-    ],
-    stream: true
-  })
+```javascript "OpenAI SDK"
+import OpenAI from 'openai';
+
+const client = new OpenAI({
+  baseURL: 'http://localhost:7700/chats/customer-support',
+  apiKey: 'DEFAULT_CHAT_KEY',
 });
 
-const reader = response.body.getReader();
-const decoder = new TextDecoder();
+const stream = await client.chat.completions.create({
+  model: 'gpt-3.5-turbo',
+  messages: [{ role: 'user', content: 'What is Meilisearch?' }],
+  stream: true,
+});
 
-while (true) {
-  const { done, value } = await reader.read();
-  if (done) break;
-
-  const chunk = decoder.decode(value);
-  console.log(chunk);
+for await (const chunk of stream) {
+  console.log(chunk.choices[0]?.delta?.content || '');
 }
 ```
+
+```python "OpenAI SDK"
+from openai import OpenAI
+
+client = OpenAI(
+    base_url="http://localhost:7700/chats/customer-support",
+    api_key="DEFAULT_CHAT_KEY"
+)
+
+stream = client.chat.completions.create(
+    model="gpt-3.5-turbo",
+    messages=[{"role": "user", "content": "What is Meilisearch?"}],
+    stream=True,
+)
+
+for chunk in stream:
+    if chunk.choices[0].delta.content is not None:
+        print(chunk.choices[0].delta.content, end="")
+```
+
 </CodeGroup>
 
 ## Update chat settings
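
The commit message says the cURL example is kept alongside the new SDK examples, but only its closing `}'` appears as context in this hunk. A rough, hypothetical sketch of such a request — reconstructed from the URL, headers, and body of the removed fetch example, not copied from the actual block kept in reference/api/chats.mdx — would look something like:

```bash
# Hypothetical sketch only — reconstructed from the removed fetch example,
# not the verbatim cURL block kept in reference/api/chats.mdx.
curl \
  -X POST 'http://localhost:7700/chats/customer-support/chat/completions' \
  -H 'Authorization: Bearer DEFAULT_CHAT_KEY' \
  -H 'Content-Type: application/json' \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [
      { "role": "user", "content": "What is Meilisearch?" }
    ],
    "stream": true
  }'
```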
@@ -378,27 +385,4 @@ These tools are handled internally and are not directly accessible through the A
 - Only streaming responses are currently supported
 - Conversation history must be managed client-side
 - Token limits depend on the chosen LLM provider
-- No built-in conversation persistence
-
-## OpenAI SDK compatibility
-
-The chat completions endpoint is compatible with the OpenAI SDK:
-
-```javascript
-import OpenAI from 'openai';
-
-const client = new OpenAI({
-  baseURL: 'http://localhost:7700/chats/customer-support',
-  apiKey: 'YOUR_MEILISEARCH_API_KEY',
-});
-
-const completion = await client.chat.completions.create({
-  model: 'gpt-3.5-turbo',
-  messages: [{ role: 'user', content: 'What is Meilisearch?' }],
-  stream: true,
-});
-
-for await (const chunk of completion) {
-  console.log(chunk.choices[0]?.delta?.content || '');
-}
-```
+- No built-in conversation persistence
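
The limitations listed in this hunk note that conversation history must be managed client-side and that there is no built-in persistence. A minimal sketch of one way to handle that with the same OpenAI SDK setup used in the new examples — the accumulated answer variable and the follow-up user message are illustrative assumptions, not part of the commit:

```javascript
// Hypothetical sketch, not part of the commit: keep the message list on the
// client and resend it with every request, since the endpoint stores nothing.
import OpenAI from 'openai';

const client = new OpenAI({
  baseURL: 'http://localhost:7700/chats/customer-support',
  apiKey: 'DEFAULT_CHAT_KEY',
});

// Local conversation history, managed entirely client-side.
const history = [{ role: 'user', content: 'What is Meilisearch?' }];

const stream = await client.chat.completions.create({
  model: 'gpt-3.5-turbo',
  messages: history,
  stream: true,
});

// Accumulate the streamed assistant reply.
let answer = '';
for await (const chunk of stream) {
  answer += chunk.choices[0]?.delta?.content || '';
}

// Append the reply and the next user turn so the following request
// carries the full conversation context.
history.push({ role: 'assistant', content: answer });
history.push({ role: 'user', content: 'How do I create an index?' });
```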
