learn/chat/conversational_search.mdx
3 additions & 7 deletions
@@ -3,11 +3,7 @@ title: What is conversational search?
 description: Conversational search is an AI-powered feature that allows users to ask questions in everyday language and receive answers based on the information in Meilisearch's indexes
 ---
 
-## What is conversational search?
-
-In conversational search interfaces, users ask questions in everyday language instead of using keywords, and receive complete answers rather than links to articles.
-
-## When to use chat completions vs traditional search
+## When to use conversational vs traditional search
 
 Use conversational search when:
@@ -21,7 +17,7 @@ Use traditional search when:
 - Approximative answers are not acceptable
 - Your users need very quick responses
 
-## How chat completions differs from traditional search
+## How conversational search usage differs from traditional search
 
 ### Traditional search workflow
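As an illustration of the traditional workflow, a keyword query sent with `curl` might look like the following sketch; the `movies` index, the API key placeholder, and the query text are assumptions, not part of the original tutorial:

```bash
# Sketch: a traditional keyword search against a hypothetical `movies` index
curl \
  -X POST 'http://localhost:7700/indexes/movies/search' \
  -H 'Authorization: Bearer MEILISEARCH_API_KEY' \
  -H 'Content-Type: application/json' \
  --data-binary '{
    "q": "space opera with robots"
  }'
```

The response is a ranked list of matching documents for your interface to display, rather than the single synthesized answer a conversational interface returns.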
@@ -41,7 +37,7 @@ Use traditional search when:
 
 In the majority of cases, you should use the [`/chats` route](/reference/api/chats) to build a Retrieval Augmented Generation (RAG) pipeline. RAGs excel when working with unstructured data and emphasise high-quality responses.
 
-Meilisearch's chat completions consolidates RAG creation into a single process:
+Meilisearch's chat completions API consolidates RAG creation into a single process:
 
 1. **Query understanding**: automatically transforms questions into search parameters
 2. **Hybrid retrieval**: combines keyword and semantic search for better relevancy
-description: This guide walks you through implementing Meilisearch's chat completions feature to create conversational search experiences in your application
+description: This article walks you through implementing Meilisearch's chat completions feature to create conversational search experiences in your application.
 ---
 
-Chat completions have three key components: index integration, workspace configuration, and the chat interface.
-
-operate through workspaces, which are isolated configurations for different use cases. Each workspace can:
-
-- Use different LLM sources (openAi, azureOpenAi, mistral, gemini, vLlm)
+To successfully implement a conversational search interface, you must follow three steps: configure indexes for chat usage, create chat workspaces targeting different use cases, and build a chat interface.
 
 ## Prerequisites
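Chat completions is an experimental feature, so it must be switched on before the `/chats` route responds. A minimal sketch of enabling it, assuming the `/experimental-features` route and a `chatCompletions` flag name (both assumptions of this sketch):

```bash
# Sketch: enable the experimental chat completions feature (flag name assumed)
curl \
  -X PATCH 'http://localhost:7700/experimental-features/' \
  -H 'Authorization: Bearer MEILISEARCH_API_KEY' \
  -H 'Content-Type: application/json' \
  --data-binary '{
    "chatCompletions": true
  }'
```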
@@ -82,11 +58,9 @@ curl \
 }'
 ```
 
-## Configure your indexes for chat
+## Configure your indexes
 
-After activating the `/chats` route and obtaining an API key with chat access, you must configure the indexes your conversational interface has access to.
-
-Configure the `chat` settings for each index you want to be searchable via chat UI:
+After activating the `/chats` route and obtaining an API key with chat permissions, configure the `chat` settings for each index you want to be searchable via the chat UI:
 
 ```bash
 curl \
@@ -103,17 +77,23 @@ curl \
 ```
 
 - `description` gives the initial context of the conversation to the LLM. A good description improves the relevance of the chat's answers
-- `documentTemplate` defines which document fields Meilisearch will send the AI provider. Consult the best [document template best practices](/learn/ai_powered_search/document_template_best_practices) article for more guidance
+- `documentTemplate` defines the document data Meilisearch sends to the AI provider. Consult the [document template best practices](/learn/ai_powered_search/document_template_best_practices) article for more guidance
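As an illustrative sketch of these two fields in practice, the `chat` object can be sent through the index settings route; the `movies` index name, the exact payload shape, and the example values here are assumptions:

```bash
# Sketch: set chat-specific settings on an index (index name and values assumed)
curl \
  -X PATCH 'http://localhost:7700/indexes/movies/settings' \
  -H 'Authorization: Bearer MEILISEARCH_API_KEY' \
  -H 'Content-Type: application/json' \
  --data-binary '{
    "chat": {
      "description": "A catalog of movies with titles, overviews, genres, and release dates",
      "documentTemplate": "A movie titled {{doc.title}} released in {{doc.release_date}}: {{doc.overview}}"
    }
  }'
```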
 
 ## Configure a chat completions workspace
 
-The next step is to create a workspace. Chat completion workspaces are isolated configurations targeting different use cases. For example, you may have one workspace for publicly visible data, and another for data only available for logged in users.
+The next step is to create a workspace. Chat completion workspaces are isolated configurations targeting different use cases. Each workspace can:
+
+- Use different LLM providers (openAi, azureOpenAi, mistral, gemini, vLlm)
+- Establish separate conversation contexts via baseline prompts
+- Access a specific set of indexes
+
+For example, you may have one workspace for publicly visible data, and another for data only available to logged-in users.
 
 Create a workspace setting your LLM provider as its `source`:
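A minimal sketch of such a workspace configuration, assuming a `/chats/{workspace}/settings` route, a workspace named `myWorkspace`, and openAi as the provider (the route, names, and values are assumptions of this sketch):

```bash
# Sketch: configure a chat workspace (workspace name, route, and values assumed)
curl \
  -X PATCH 'http://localhost:7700/chats/myWorkspace/settings' \
  -H 'Authorization: Bearer MEILISEARCH_API_KEY' \
  -H 'Content-Type: application/json' \
  --data-binary '{
    "source": "openAi",
    "apiKey": "OPENAI_API_KEY",
    "prompts": {
      "system": "You are a helpful movie assistant. Answer questions using only the movie data you retrieve."
    }
  }'
```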
@@ -191,11 +171,11 @@ Which fields are mandatory will depend on your chosen provider `source`. In most
 
 `baseUrl` indicates the URL Meilisearch queries when users submit questions to your chat interface.
 
-`prompts.system` gives the conversational search bot the baseline context of your users and their questions.
+`prompts.system` gives the conversational search bot the baseline context of your users and their questions. The `prompts` object accepts a few other fields that guide how the agent uses the information it finds via Meilisearch. In real-life scenarios, filling in these fields improves the quality of conversational search responses.
 
 ## Send your first chat completions request
 
-You have finished configuring your conversational search agent. Use `curl` in your terminal to confirm everything is working. Sending a streaming query to the chat completions API route:
+You have finished configuring your conversational search agent. To confirm everything is working as expected, send a streaming `curl` query to the chat completions API route:
 
 ```bash
 curl -N \
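A complete streaming request might look like the following sketch, which assumes the OpenAI-compatible `/chats/{workspace}/chat/completions` route, the `myWorkspace` workspace from the earlier sketch, and placeholder model and message values:

```bash
# Sketch: streaming chat completions request (workspace, route, and values assumed)
curl -N \
  -X POST 'http://localhost:7700/chats/myWorkspace/chat/completions' \
  -H 'Authorization: Bearer MEILISEARCH_API_KEY' \
  -H 'Content-Type: application/json' \
  --data-binary '{
    "model": "gpt-4o-mini",
    "messages": [
      { "role": "user", "content": "What movies do you recommend for a rainy evening?" }
    ],
    "stream": true
  }'
```

The `-N` flag disables `curl`'s output buffering so the streamed chunks print as soon as they arrive.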
@@ -251,11 +231,9 @@ In most cases, that is only the beginning of adding conversational search to you
 
 ### Building a chat interface using the OpenAI SDK
 
-Creating a full chat interface is out of scope for this tutorial, but here is one important tip.
+Meilisearch's chat endpoint was designed to be OpenAI-compatible. This means you can use the official OpenAI SDK in any supported programming language, even if your provider is not OpenAI.
 
-Meilisearch's chat endpoint was designed to be OpenAI-compatible. This means you can use the official OpenAI SDK in any supported programming language, even if your provider is not OpenAI:
-
-<CodeGroup>
+Integrating Meilisearch and the OpenAI SDK with JavaScript would look like this:
 
 ```javascript JavaScript
 import OpenAI from 'openai';
@@ -275,97 +253,7 @@ for await (const chunk of completion) {
Take particular note of the last lines, which output the streamed responses to the browser console. In a real-life application, you would instead print the response chunks to the user interface.