Commit 298ac04

Update documentation to include all 5 supported LLM providers
- Added Azure OpenAI, Gemini, and vLLM to the list of supported providers
- Updated API reference with baseUrl parameter (required for Azure OpenAI and vLLM)
- Added provider-specific configuration examples for all 5 providers
- Updated prerequisites and conceptual overview to reflect all providers
1 parent 34a4fc4 commit 298ac04

File tree (3 files changed: +144 −24 lines)

- guides/ai/getting_started_with_chat.mdx
- learn/ai_powered_search/conversational_search_with_chat.mdx
- reference/api/chats.mdx

guides/ai/getting_started_with_chat.mdx

Lines changed: 76 additions & 15 deletions
@@ -16,7 +16,7 @@ The chat feature is experimental and must be enabled before use. See [experiment
 
 Before starting, ensure you have:
 - Meilisearch instance running (v1.15.1 or later)
-- An API key from an LLM provider (OpenAI or Mistral)
+- An API key from an LLM provider (OpenAI, Azure OpenAI, Mistral, Gemini, or access to a vLLM server)
 - At least one index with searchable content
 - The chat experimental feature enabled
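
The last prerequisite above is handled by the guide's step 1, which this commit leaves unchanged. For orientation, a minimal sketch of enabling it, assuming the feature is toggled through Meilisearch's `/experimental-features` route under a flag such as `chatCompletions` (the flag name is an assumption, not confirmed by this diff):

```bash
# Sketch only: the exact experimental flag name ("chatCompletions") is assumed,
# not taken from this commit. Adjust it to match the experimental features reference.
curl \
  -X PATCH 'http://localhost:7700/experimental-features' \
  -H 'Authorization: Bearer MASTER_KEY' \
  -H 'Content-Type: application/json' \
  --data-binary '{ "chatCompletions": true }'
```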

@@ -38,20 +38,81 @@ curl \
 
 ### 2. Configure a chat workspace
 
-Create a workspace with your LLM provider settings:
-
-```bash
-curl \
-  -X PUT 'http://localhost:7700/chats/my-assistant/settings' \
-  -H 'Authorization: Bearer MASTER_KEY' \
-  -H 'Content-Type: application/json' \
-  --data-binary '{
-    "provider": "openai",
-    "model": "gpt-3.5-turbo",
-    "apiKey": "sk-...",
-    "prompt": "You are a helpful assistant. Answer questions based only on the provided context."
-  }'
-```
+Create a workspace with your LLM provider settings. Here are examples for different providers:
+
+<Tabs>
+<Tab label="OpenAI">
+```bash
+curl \
+  -X PUT 'http://localhost:7700/chats/my-assistant/settings' \
+  -H 'Authorization: Bearer MASTER_KEY' \
+  -H 'Content-Type: application/json' \
+  --data-binary '{
+    "provider": "openai",
+    "model": "gpt-3.5-turbo",
+    "apiKey": "sk-...",
+    "prompt": "You are a helpful assistant. Answer questions based only on the provided context."
+  }'
+```
+</Tab>
+<Tab label="Azure OpenAI">
+```bash
+curl \
+  -X PUT 'http://localhost:7700/chats/my-assistant/settings' \
+  -H 'Authorization: Bearer MASTER_KEY' \
+  -H 'Content-Type: application/json' \
+  --data-binary '{
+    "provider": "azure_openai",
+    "model": "gpt-35-turbo",
+    "apiKey": "your-azure-key",
+    "baseUrl": "https://your-resource.openai.azure.com",
+    "prompt": "You are a helpful assistant. Answer questions based only on the provided context."
+  }'
+```
+</Tab>
+<Tab label="Mistral">
+```bash
+curl \
+  -X PUT 'http://localhost:7700/chats/my-assistant/settings' \
+  -H 'Authorization: Bearer MASTER_KEY' \
+  -H 'Content-Type: application/json' \
+  --data-binary '{
+    "provider": "mistral",
+    "model": "mistral-small-latest",
+    "apiKey": "your-mistral-key",
+    "prompt": "You are a helpful assistant. Answer questions based only on the provided context."
+  }'
+```
+</Tab>
+<Tab label="Gemini">
+```bash
+curl \
+  -X PUT 'http://localhost:7700/chats/my-assistant/settings' \
+  -H 'Authorization: Bearer MASTER_KEY' \
+  -H 'Content-Type: application/json' \
+  --data-binary '{
+    "provider": "gemini",
+    "model": "gemini-1.5-flash",
+    "apiKey": "your-gemini-key",
+    "prompt": "You are a helpful assistant. Answer questions based only on the provided context."
+  }'
+```
+</Tab>
+<Tab label="vLLM">
+```bash
+curl \
+  -X PUT 'http://localhost:7700/chats/my-assistant/settings' \
+  -H 'Authorization: Bearer MASTER_KEY' \
+  -H 'Content-Type: application/json' \
+  --data-binary '{
+    "provider": "vllm",
+    "model": "meta-llama/Llama-3-8b-chat-hf",
+    "baseUrl": "http://localhost:8000",
+    "prompt": "You are a helpful assistant. Answer questions based only on the provided context."
+  }'
+```
+</Tab>
+</Tabs>
 
 ### 3. Send your first chat request
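
Step 3 itself is unchanged by this commit, so its body does not appear in the diff. For orientation, a minimal sketch of a first chat request, assuming the workspace exposes the OpenAI-compatible completions route described in reference/api/chats.mdx (the exact path and the `stream` field are assumptions, not confirmed by this diff):

```bash
# Hypothetical request; the path and body shape are assumed from the
# "Chat completions" section of reference/api/chats.mdx, not from this commit.
curl \
  -X POST 'http://localhost:7700/chats/my-assistant/chat/completions' \
  -H 'Authorization: Bearer MASTER_KEY' \
  -H 'Content-Type: application/json' \
  --data-binary '{
    "model": "gpt-3.5-turbo",
    "messages": [
      { "role": "user", "content": "What can you tell me about the indexed products?" }
    ],
    "stream": true
  }'
```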

learn/ai_powered_search/conversational_search_with_chat.mdx

Lines changed: 1 addition & 1 deletion
@@ -69,7 +69,7 @@ Meilisearch's chat consolidates these into one streamlined process:
 
 The chat feature operates through workspaces, which are isolated configurations for different use cases or tenants. Each workspace can:
 
-- Use different LLM providers (OpenAI, Mistral, etc.)
+- Use different LLM providers (OpenAI, Azure OpenAI, Mistral, Gemini, vLLM)
 - Apply custom prompts
 - Access specific indexes based on API keys
 - Maintain separate conversation contexts

reference/api/chats.mdx

Lines changed: 67 additions & 8 deletions
@@ -36,16 +36,18 @@ The chat feature is experimental and must be enabled through [experimental featu
   "provider": "openai",
   "model": "gpt-3.5-turbo",
   "apiKey": "sk-...",
+  "baseUrl": "https://api.openai.com/v1",
   "prompt": "You are a helpful assistant that answers questions based on the provided context."
 }
 ```
 
-| Name           | Type   | Description                                      |
-| :------------- | :----- | :----------------------------------------------- |
-| **`provider`** | String | LLM provider (`"openai"` or `"mistral"`)         |
-| **`model`**    | String | Model identifier (e.g., `"gpt-3.5-turbo"`)       |
-| **`apiKey`**   | String | API key for the LLM provider (write-only)        |
-| **`prompt`**   | String | System prompt to guide the assistant's behavior  |
+| Name           | Type   | Description                                                                      |
+| :------------- | :----- | :------------------------------------------------------------------------------- |
+| **`provider`** | String | LLM provider: `"openai"`, `"azure_openai"`, `"mistral"`, `"gemini"`, or `"vllm"` |
+| **`model`**    | String | Model identifier (e.g., `"gpt-3.5-turbo"`)                                       |
+| **`apiKey`**   | String | API key for the LLM provider (write-only, optional for vLLM)                     |
+| **`baseUrl`**  | String | Base URL for the provider (required for Azure OpenAI and vLLM)                   |
+| **`prompt`**   | String | System prompt to guide the assistant's behavior                                  |
 
 ## Chat completions

@@ -191,10 +193,10 @@ All fields are optional. Only provided fields will be updated.
 
 Returns the updated settings object. Note that `apiKey` is write-only and will not be returned in the response.
 
-### Example
+### Examples
 
 <Tabs>
-<Tab label="cURL">
+<Tab label="OpenAI">
 ```bash
 curl \
   -X PUT 'http://localhost:7700/chats/customer-support/settings' \
@@ -208,6 +210,63 @@ Returns the updated settings object. Note that `apiKey` is write-only and will n
   }'
 ```
 </Tab>
+<Tab label="Azure OpenAI">
+```bash
+curl \
+  -X PUT 'http://localhost:7700/chats/customer-support/settings' \
+  -H 'Authorization: Bearer MASTER_KEY' \
+  -H 'Content-Type: application/json' \
+  --data-binary '{
+    "provider": "azure_openai",
+    "model": "gpt-4",
+    "apiKey": "your-azure-api-key",
+    "baseUrl": "https://your-resource.openai.azure.com",
+    "prompt": "You are a helpful customer support assistant."
+  }'
+```
+</Tab>
+<Tab label="Mistral">
+```bash
+curl \
+  -X PUT 'http://localhost:7700/chats/customer-support/settings' \
+  -H 'Authorization: Bearer MASTER_KEY' \
+  -H 'Content-Type: application/json' \
+  --data-binary '{
+    "provider": "mistral",
+    "model": "mistral-large-latest",
+    "apiKey": "your-mistral-api-key",
+    "prompt": "You are a helpful customer support assistant."
+  }'
+```
+</Tab>
+<Tab label="Gemini">
+```bash
+curl \
+  -X PUT 'http://localhost:7700/chats/customer-support/settings' \
+  -H 'Authorization: Bearer MASTER_KEY' \
+  -H 'Content-Type: application/json' \
+  --data-binary '{
+    "provider": "gemini",
+    "model": "gemini-1.5-pro",
+    "apiKey": "your-gemini-api-key",
+    "prompt": "You are a helpful customer support assistant."
+  }'
+```
+</Tab>
+<Tab label="vLLM">
+```bash
+curl \
+  -X PUT 'http://localhost:7700/chats/customer-support/settings' \
+  -H 'Authorization: Bearer MASTER_KEY' \
+  -H 'Content-Type: application/json' \
+  --data-binary '{
+    "provider": "vllm",
+    "model": "llama-3-8b",
+    "baseUrl": "http://your-vllm-server:8000",
+    "prompt": "You are a helpful customer support assistant."
+  }'
+```
+</Tab>
 </Tabs>
 
 ## Get chat settings
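
The section modified above notes that all fields are optional and only provided fields are updated. Under that rule, switching an existing workspace to a self-hosted provider should only need the fields that change; a minimal sketch (the server URL is illustrative):

```bash
# Partial update sketch: only the fields being changed are sent; other settings
# on the "customer-support" workspace are left as previously configured.
curl \
  -X PUT 'http://localhost:7700/chats/customer-support/settings' \
  -H 'Authorization: Bearer MASTER_KEY' \
  -H 'Content-Type: application/json' \
  --data-binary '{
    "provider": "vllm",
    "baseUrl": "http://your-vllm-server:8000"
  }'
```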
