
Commit b6b49da (parent 1bae6df)

Update Blog “part-8-agentic-ai-and-qdrant-building-semantic-memory-with-mcp-protocol”

1 file changed: 19 additions, 18 deletions

content/blog/part-8-agentic-ai-and-qdrant-building-semantic-memory-with-mcp-protocol.md

[Inspired by my Medium post](https://dineshr1493.medium.com/all-you-need-to-know-about-the-evolution-of-generative-ai-to-agentic-ai-part-8-agentic-ai-mcp-281567e26838), this article explores how **MCP**, the **Model Context Protocol**—a kind of connective tissue between LLMs and external tools or data sources—**standardizes interactions** between intelligent agents and vector databases like **Qdrant**. By enabling seamless storage and retrieval of embeddings, agents can now “remember” useful information and leverage it in future reasoning.

Let’s walk through the full architecture and code implementation of this cutting-edge combination.

## LLMs + MCP + Database = Thoughtful Agentic AI

In Agentic AI, a language model doesn’t just generate — it thinks, acts, and reflects using external tools. That’s where MCP comes in.

Think of MCP as a “USB interface” for AI — it lets agents plug into tools like Qdrant, APIs, or structured databases using a consistent protocol.

Qdrant itself is a high-performance vector database — capable of powering semantic search, knowledge retrieval, and acting as long-term memory for AI agents. However, direct integration with agents can be messy and non-standardized.

This is solved by wrapping Qdrant inside an MCP server, giving agents a semantic API they can call like a function.
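Concretely, that “semantic API” rides on JSON-RPC 2.0, the wire format MCP uses for tool calls. Here is a minimal sketch of what invoking the server’s qdrant-store tool looks like on the wire; the method name follows the MCP specification, while the example ticket text is illustrative:

```python
import json

# An MCP tool invocation is a JSON-RPC 2.0 "tools/call" request.
# The argument name follows mcp-server-qdrant's qdrant-store tool;
# the ticket text itself is just an example.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "qdrant-store",
        "arguments": {"information": "Order #1234 was delayed due to heavy rainfall."},
    },
}

# Serialize the request for the transport (stdio or SSE).
wire_message = json.dumps(request)
print(wire_message)
```

The agent never talks to Qdrant directly: it only emits messages like this, and the MCP server translates them into vector-database operations.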

### Architecture overview
```text
[LLM Agent]
     |
     v
[MCP Server]
     |
     v
[Qdrant Vector DB]
```

### Use case: Support ticket memory for AI assistants

Imagine an AI assistant answering support queries.

* It doesn't have all the answers built in.
* But it has semantic memory from prior support logs stored in Qdrant.
* It uses qdrant-find to semantically retrieve similar issues.
* It then formulates a contextual response.
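The retrieval loop above can be sketched with a toy in-memory store. A real deployment embeds text with a model like all-MiniLM-L6-v2; here a trivial bag-of-words vectorizer stands in so the cosine-similarity mechanics stay visible. Everything below is illustrative, not the Qdrant API:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy stand-in for a sentence-embedding model: bag of lowercased words.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

memory = []  # list of (vector, original text), like points in a collection

def store(text: str) -> None:          # analogous to qdrant-store
    memory.append((embed(text), text))

def find(query: str, top_k: int = 1):  # analogous to qdrant-find
    ranked = sorted(memory, key=lambda item: cosine(embed(query), item[0]), reverse=True)
    return [text for _, text in ranked[:top_k]]

store("Order #1234 was delayed due to heavy rainfall in transit zone.")
store("Password resets are handled via the account settings page.")
print(find("Why was order 1234 delayed?"))
```

Qdrant replaces this linear scan with approximate nearest-neighbor indexing, but the store/find contract the agent sees is the same.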

## Step-by-step implementation

### Step 1: Launch the Qdrant MCP server

```bash
# Example values; the collection name and storage path are yours to choose
export COLLECTION_NAME="support-tickets"
export QDRANT_LOCAL_PATH="./qdrant_data"
export EMBEDDING_MODEL="sentence-transformers/all-MiniLM-L6-v2"
uvx mcp-server-qdrant --transport sse
```

## Key parameters:

* COLLECTION_NAME: Name of the Qdrant collection
* QDRANT_LOCAL_PATH: Local vector DB storage path
* EMBEDDING_MODEL: Sentence-transformers model used to embed stored text
### Step 2: Connect a client session and list tools

```python
async with stdio_client(server_params) as (read, write):
    ...

# Expected output: lists tools like qdrant-store, qdrant-find
```
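Under the hood, listing tools is itself a JSON-RPC exchange over the transport. The request method name comes from the MCP specification; the response payload below is an illustrative shape showing the two tools mcp-server-qdrant advertises:

```python
# JSON-RPC request the client session sends to enumerate tools (MCP "tools/list").
request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# Illustrative response: the Qdrant MCP server advertises its store/find tools.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"tools": [{"name": "qdrant-store"}, {"name": "qdrant-find"}]},
}

tool_names = [tool["name"] for tool in response["result"]["tools"]]
print(tool_names)  # ['qdrant-store', 'qdrant-find']
```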

### Step 3: Ingest a new memory
```python
ticket_info = "Order #1234 was delayed due to heavy rainfall in transit zone."
result = await session.call_tool("qdrant-store", arguments={
    "information": ticket_info
})
```

This stores an embedded version of the text in Qdrant.

### Step 4: Perform a semantic search

```python
query = "Why was order 1234 delayed?"
search_response = await session.call_tool("qdrant-find", arguments={
    "query": query
})
```

### Example output:

```text
[
  "Order #1234 was delayed due to heavy rainfall in transit zone."
]
```
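The retrieved memory is then folded into a prompt together with the user's question before the LLM call. The article's exact prompt wording is not shown, so the assembly below is a hypothetical sketch:

```python
# Hypothetical prompt assembly: ground the question in retrieved memories.
retrieved = ["Order #1234 was delayed due to heavy rainfall in transit zone."]
query = "Why was order 1234 delayed?"

context = "\n".join(f"- {memory}" for memory in retrieved)
prompt = (
    "Use the support-ticket context below to answer the question.\n\n"
    f"Context:\n{context}\n\n"
    f"Question: {query}"
)
print(prompt)
```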
### Step 5: Generate the final answer

```python
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": prompt}]
)
print(response["choices"][0]["message"]["content"])
```

### Final answer:

```text
"Order #1234 was delayed due to heavy rainfall in the transit zone."
```

## Parameter reference

<table border="1" cellpadding="8" cellspacing="0" style="border-collapse: collapse; width: 100%;">
<thead style="background-color:#f2f2f2">
<tr><th>Parameter</th><th>Description</th></tr>
</thead>
<tbody>
<tr><td colspan="2">…</td></tr>
</tbody>
</table>

## Pro tip: Chain MCP servers

You can deploy multiple MCP servers for different tools and plug them into agent workflows:

* qdrant-find for memory
* google-search for web data
* postgres-query for structured facts

Then, orchestrate it all using Agentic AI Teams to perform high-level, multi-tool reasoning.
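One way such orchestration might look: a hypothetical router that maps the tool name an agent picks to the MCP server that owns it. The session calls are stubbed here with plain functions; none of the names below come from a real framework:

```python
# Stubs standing in for calls into three different MCP servers.
def qdrant_find(query: str) -> str:
    return f"memory hit for: {query}"

def google_search(query: str) -> str:
    return f"web results for: {query}"

def postgres_query(query: str) -> str:
    return f"rows matching: {query}"

# Hypothetical routing table: tool name -> owning server's handler.
ROUTES = {
    "qdrant-find": qdrant_find,
    "google-search": google_search,
    "postgres-query": postgres_query,
}

def dispatch(tool: str, query: str) -> str:
    # The agent chooses a tool; the router forwards to the right server.
    return ROUTES[tool](query)

print(dispatch("qdrant-find", "order 1234 delay"))  # memory hit for: order 1234 delay
```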

## Final thoughts

By pairing Qdrant with MCP, Agentic AI gains powerful, semantic memory — a critical enabler of contextual understanding and long-term knowledge retention. This pattern abstracts the complexity of vector DBs behind a unified protocol, empowering agents to think, recall, and act without manual data plumbing.
