---
title: "Part 8: Agentic AI and Qdrant: Building Semantic Memory with MCP Protocol"
date: 2025-07-21T10:50:25.839Z
author: Dinesh R Singh
authorimage: /img/dinesh-192-192.jpg
This is solved by wrapping Qdrant inside an MCP server, giving agents a semantic memory layer.

### Architecture Overview

```
[LLM Agent]
     |
     |-- [MCP Client]
```
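To make this flow concrete, here is a minimal, dependency-free sketch of what a semantic memory layer does for an agent: it stores facts and retrieves the most relevant ones for a query. `ToyMemory` and its word-overlap scoring are illustrative stand-ins, not the real MCP or Qdrant API (which embeds text and searches by vector similarity):

```python
# Toy stand-in for the Qdrant-backed memory layer (illustrative only).
class ToyMemory:
    def __init__(self):
        self.entries = []  # each entry is a stored piece of text

    def store(self, information: str) -> None:
        # Real server: embed the text and upsert the vector into Qdrant.
        self.entries.append(information)

    def find(self, query: str) -> list[str]:
        # Real server: embed the query and run a vector similarity search.
        # Here we rank by simple word overlap instead.
        q = set(query.lower().split())
        scored = [(len(q & set(e.lower().split())), e) for e in self.entries]
        scored.sort(key=lambda s: s[0], reverse=True)
        return [e for score, e in scored if score > 0]

memory = ToyMemory()
memory.store("Order #1234 was delayed due to heavy rainfall in transit zone.")
print(memory.find("why was order 1234 delayed"))
```

The agent never touches the storage engine directly; it only calls `store` and `find`, which is exactly the contract the MCP tools expose.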
Imagine an AI assistant answering support queries.

### Step 1: Launch Qdrant MCP Server

```shell
export COLLECTION_NAME="support-tickets"
export QDRANT_LOCAL_PATH="./qdrant_local_db"
export EMBEDDING_MODEL="sentence-transformers/all-MiniLM-L6-v2"
```

```shell
uvx mcp-server-qdrant --transport sse
```
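If you would rather configure and spawn the server from Python, the same settings can be placed on the process environment first. A small convenience sketch (not required by the server, which only cares that the variables are set):

```python
import os

# Mirror the shell exports above in os.environ, e.g. before
# launching the server with subprocess.
os.environ.update({
    "COLLECTION_NAME": "support-tickets",
    "QDRANT_LOCAL_PATH": "./qdrant_local_db",
    "EMBEDDING_MODEL": "sentence-transformers/all-MiniLM-L6-v2",
})
```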
### Step 2: Connect the MCP Client
```python
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    server_params = StdioServerParameters(
        ...
    )
```
```python
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print(tools)
```

Expected output: a list of the available tools, such as `qdrant-store` and `qdrant-find`.
### Step 3: Ingest a New Memory
```python
ticket_info = "Order #1234 was delayed due to heavy rainfall in transit zone."

result = await session.call_tool("qdrant-store", arguments={
    "information": ticket_info,
})
```

This stores an embedded version of the text in Qdrant.
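Under the hood, "storing" means converting the text into a vector and keeping that vector next to the original payload. A rough, dependency-free sketch of the idea, where the hash-based `embed` is only a stand-in for the real `sentence-transformers/all-MiniLM-L6-v2` model:

```python
import hashlib
import math

def embed(text: str, dims: int = 8) -> list[float]:
    # Stand-in embedding: hash each token into a small dense vector,
    # then normalize to unit length. Real embeddings capture meaning;
    # this only illustrates the data flow.
    vec = [0.0] * dims
    for token in text.lower().split():
        h = int(hashlib.md5(token.encode()).hexdigest(), 16)
        vec[h % dims] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

# "Storing" = keeping the vector alongside the original text payload.
store = []
ticket_info = "Order #1234 was delayed due to heavy rainfall in transit zone."
store.append({"vector": embed(ticket_info), "payload": {"content": ticket_info}})
```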
### Step 4: Perform a Semantic Search
```python
query = "Why was order 1234 delayed?"

search_response = await session.call_tool("qdrant-find", arguments={
    "query": "order 1234 delay"
})
```
### Example Output:
```
[
  {
    "content": "Order #1234 was delayed due to heavy rainfall in transit zone.",
    ...
  }
]
```
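What makes this a *semantic* search is the ranking step: the query is embedded the same way as the stored texts, and results are ordered by cosine similarity between vectors. A minimal, self-contained sketch of that ranking (the vectors and payloads below are made-up illustrations, not real embeddings):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    # Cosine similarity: dot product over the product of magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a)) or 1.0
    nb = math.sqrt(sum(x * x for x in b)) or 1.0
    return dot / (na * nb)

# Illustrative stored points (vector values are invented for the example).
points = [
    {"vector": [0.9, 0.1, 0.0],
     "payload": {"content": "Order #1234 was delayed due to heavy rainfall in transit zone."}},
    {"vector": [0.0, 0.2, 0.9],
     "payload": {"content": "Password reset instructions were sent."}},
]
query_vector = [1.0, 0.0, 0.0]  # pretend embedding of "order 1234 delay"

best = max(points, key=lambda p: cosine(query_vector, p["vector"]))
print(best["payload"]["content"])
```

Because similarity is computed in vector space, a query phrased differently from the stored text can still retrieve the right memory.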
### Step 5: Use with LLM
```python
import openai

context = "\n".join([r["content"] for r in search_response])

prompt = f"""
...
"""

response = openai.ChatCompletion.create(
    ...,
    messages=[{"role": "user", "content": prompt}]
)

print(response["choices"][0]["message"]["content"])
```
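Step 5 is plain retrieval-augmented prompting: the retrieved contents are joined into a context block, and the model is asked to answer from that context only. A self-contained sketch of the prompt assembly (the result list and prompt wording here are illustrative assumptions, not the article's exact template):

```python
# Pretend this came back from qdrant-find.
search_response = [
    {"content": "Order #1234 was delayed due to heavy rainfall in transit zone."},
]

# Join retrieved memories into a single context block.
context = "\n".join(r["content"] for r in search_response)

# Ground the model's answer in the retrieved context.
prompt = f"""
Answer the question using only the context below.

Context:
{context}

Question: Why was order 1234 delayed?
"""
print(prompt)
```

The LLM never queries Qdrant itself; it only sees whatever the retrieval step injected into the prompt, which is why the quality of `qdrant-find` results directly bounds answer quality.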
### Final Answer:

```
"Order #1234 was delayed due to heavy rainfall in the transit zone."
```