
Commit 70dfa23

renames session manager to message history (#76)
1 parent dc404ce · commit 70dfa23

File tree

4 files changed: +54 −54 lines changed

README.md

Lines changed: 3 additions & 3 deletions
```diff
@@ -71,11 +71,11 @@ To get started with RAG, either from scratch or using a popular framework like L
 | [/RAG/07_user_role_based_rag.ipynb](python-recipes/RAG/07_user_role_based_rag.ipynb) | Implement a simple RBAC policy with vector search using Redis |
 
 ### LLM Memory
-LLMs are stateless. To maintain context within a conversation chat sessions must be stored and resent to the LLM. Redis manages the storage and retrieval of chat sessions to maintain context and conversational relevance.
+LLMs are stateless. To maintain context within a conversation chat sessions must be stored and resent to the LLM. Redis manages the storage and retrieval of message histories to maintain context and conversational relevance.
 | Recipe | Description |
 | --- | --- |
-| [/llm-session-manager/00_session_manager.ipynb](python-recipes/llm-session-manager/00_llm_session_manager.ipynb) | LLM session manager with semantic similarity |
-| [/llm-session-manager/01_multiple_sessions.ipynb](python-recipes/llm-session-manager/01_multiple_sessions.ipynb) | Handle multiple simultaneous chats with one instance |
+| [/llm-message-history/00_message_history.ipynb](python-recipes/llm-message-history/00_llm_message_history.ipynb) | LLM message history with semantic similarity |
+| [/llm-message-history/01_multiple_sessions.ipynb](python-recipes/llm-message-history/01_multiple_sessions.ipynb) | Handle multiple simultaneous chats with one instance |
 
 ### Semantic Cache
 An estimated 31% of LLM queries are potentially redundant ([source](https://arxiv.org/pdf/2403.02694)). Redis enables semantic caching to help cut down on LLM costs quickly.
```
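The README blurb above describes the core pattern these recipes implement: store each turn in Redis and resend it with the next prompt. As a quick orientation, here is a minimal sketch of that loop using the renamed `MessageHistory` class; it assumes a Redis server on localhost (redisvl's default) and uses a hypothetical `call_llm` stub in place of a real model client:

```python
from redisvl.extensions.message_history import MessageHistory

def call_llm(prompt: str, context: list) -> str:
    # hypothetical stand-in for any chat-model call (OpenAI, Cohere, etc.)
    return f"(model reply to {prompt!r} given {len(context)} prior messages)"

# history is persisted in Redis under this name
history = MessageHistory(name="demo_chat")
history.add_message({"role": "system", "content": "You are a helpful assistant."})

prompt = "What should I make for dinner?"
context = history.get_recent()    # prior turns to resend alongside the prompt
response = call_llm(prompt, context)
history.store(prompt, response)   # persist this user/assistant exchange
```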

python-recipes/RAG/07_user_role_based_rag.ipynb

Lines changed: 3 additions & 3 deletions
```diff
@@ -60,7 +60,7 @@
    }
   ],
   "source": [
-   "%pip install -q \"redisvl>=0.4.1\" openai langchain-community pypdf"
+   "%pip install -q \"redisvl>=0.6.0\" openai langchain-community pypdf"
   ]
  },
  {
@@ -1335,7 +1335,7 @@
    "from typing import List, Optional\n",
    "import os\n",
    "\n",
-   "from redisvl.extensions.session_manager import StandardSessionManager\n",
+   "from redisvl.extensions.message_history import MessageHistory\n",
    "\n",
    "\n",
    "class RAGChatManager:\n",
@@ -1395,7 +1395,7 @@
    "            user_id: User identifier\n",
    "        \"\"\"\n",
    "        if user_id not in self.sessions:\n",
-   "            self.sessions[user_id] = StandardSessionManager(\n",
+   "            self.sessions[user_id] = MessageHistory(\n",
    "                name=f\"session:{user_id}\",\n",
    "                redis_client=self.kb.redis_client\n",
    "            )\n",
```

python-recipes/llm-session-manager/00_llm_session_manager.ipynb renamed to python-recipes/llm-message-history/00_llm_message_history.ipynb

Lines changed: 26 additions & 26 deletions
```diff
@@ -6,16 +6,16 @@
   "source": [
    "![Redis](https://redis.io/wp-content/uploads/2024/04/Logotype.svg?auto=webp&quality=85,75&width=120)\n",
    "\n",
-   "# LLM Session Memory - Multiple Sessions\n",
+   "# LLM Message History\n",
    "\n",
-   "Large Language Models are inherently stateless and have no knowledge of previous interactions with a user, or even of previous parts of the current conversation. While this may not be noticeable when asking simple questions, it becomes a hinderance when engaging in long running conversations that rely on conversational context.\n",
+   "Large Language Models are inherently stateless and have no knowledge of previous interactions with a user, or even of previous parts of the current conversation. While this may not be noticeable when asking simple questions, it becomes a hindrance when engaging in long running conversations that rely on conversational context.\n",
    "\n",
    "The solution to this problem is to append the previous conversation history to each subsequent call to the LLM.\n",
    "\n",
-   "This notebook will show how to use Redis to structure and store and retrieve this conversational session memory.\n",
+   "This notebook will show how to use Redis to structure and store and retrieve this conversational message history.\n",
    "\n",
    "## Let's Begin!\n",
-   "<a href=\"https://colab.research.google.com/github/redis-developer/redis-ai-resources/blob/main/python-recipes/session-manager/00_session_manager.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n"
+   "<a href=\"https://colab.research.google.com/github/redis-developer/redis-ai-resources/blob/main/python-recipes/llm-message-history/00_message_history.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n"
   ]
  },
  {
@@ -31,7 +31,7 @@
   "metadata": {},
   "outputs": [],
   "source": [
-   "%pip install cohere \"redisvl>=0.4.1\" sentence-transformers"
+   "%pip install cohere \"redisvl>=0.6.0\" sentence-transformers"
   ]
  },
  {
@@ -153,7 +153,7 @@
    "        return response.text\n",
    "\n",
    "    def remap(self, context) -> List[Dict]:\n",
-   "        ''' re-index the chat history to match the Cohere API requirements '''\n",
+   "        ''' re-index the message history to match the Cohere API requirements '''\n",
    "        new_context = []\n",
    "        for statement in context:\n",
    "            if statement[\"role\"] == \"user\":\n",
```
```diff
@@ -174,9 +174,9 @@
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-   "### Import SemanticSessionManager\n",
+   "### Import MessageHistory\n",
    "\n",
-   "redisvl provides the SemanticSessionManager for easy management of session state."
+   "redisvl provides the MessageHistory and SemanticMessageHistory classes for easy management of LLM conversations."
   ]
  },
  {
@@ -185,10 +185,10 @@
   "metadata": {},
   "outputs": [],
   "source": [
-   "from redisvl.extensions.session_manager import SemanticSessionManager\n",
+   "from redisvl.extensions.message_history import SemanticMessageHistory\n",
    "\n",
-   "user_session = SemanticSessionManager(name=\"llm chef\")\n",
-   "user_session.add_message({\"role\":\"system\", \"content\":\"You are a helpful chef, assisting people in making delicious meals\"})"
+   "user_history = SemanticMessageHistory(name=\"llm chef\")\n",
+   "user_history.add_message({\"role\":\"system\", \"content\":\"You are a helpful chef, assisting people in making delicious meals\"})"
   ]
  },
  {
@@ -224,9 +224,9 @@
   ],
   "source": [
    "prompt = \"can you give me some ideas for breakfast?\"\n",
-   "context = user_session.get_recent()\n",
+   "context = user_history.get_recent()\n",
    "response = client.converse(prompt=prompt, context=context)\n",
-   "user_session.store(prompt, response)\n",
+   "user_history.store(prompt, response)\n",
    "print('USER: ', prompt)\n",
    "print('\\nLLM: ', response)"
   ]
```
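Every chat cell in this notebook repeats the same four-step turn, which the rename leaves structurally untouched. Factored out, it looks like the sketch below; `history` is the `SemanticMessageHistory` instance and `client` is the notebook's Cohere wrapper with its `converse` method, while the helper function itself is just an illustrative refactoring:

```python
def take_turn(history, client, prompt: str) -> str:
    context = history.get_recent()     # 1) fetch recent prior turns from Redis
    response = client.converse(prompt=prompt, context=context)  # 2) send prompt + context
    history.store(prompt, response)    # 3) persist the new user/assistant exchange
    print('USER: ', prompt)            # 4) show the turn
    print('\nLLM: ', response)
    return response
```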
```diff
@@ -286,9 +286,9 @@
   ],
   "source": [
    "prompt = \"can you give me the recipe for those pancakes?\"\n",
-   "context = user_session.get_recent()\n",
+   "context = user_history.get_recent()\n",
    "response = client.converse(prompt=prompt, context=context)\n",
-   "user_session.store(prompt, response)\n",
+   "user_history.store(prompt, response)\n",
    "print('USER: ', prompt)\n",
    "print('\\nLLM: ', response)"
   ]
@@ -360,9 +360,9 @@
   ],
   "source": [
    "prompt =\"I am vegetarian. Can you remove the eggs?\"\n",
-   "context = user_session.get_recent()\n",
+   "context = user_history.get_recent()\n",
    "response = client.converse(prompt=prompt, context=context)\n",
-   "user_session.store(prompt, response)\n",
+   "user_history.store(prompt, response)\n",
    "print('USER: ', prompt)\n",
    "print('\\nLLM: ', response)"
   ]
@@ -436,9 +436,9 @@
   ],
   "source": [
    "prompt = \"I am also vegan. Can you replace the butter too?\"\n",
-   "context = user_session.get_recent()\n",
+   "context = user_history.get_recent()\n",
    "response = client.converse(prompt=prompt, context=context)\n",
-   "user_session.store(prompt, response)\n",
+   "user_history.store(prompt, response)\n",
    "print('USER: ', prompt)\n",
    "print('\\nLLM: ', response)"
   ]
@@ -521,9 +521,9 @@
   ],
   "source": [
    "prompt = \"I changed my mind. Can you give me the first recipe from your list?\"\n",
-   "context = user_session.get_recent(top_k=5)\n",
+   "context = user_history.get_recent(top_k=5)\n",
    "response = client.converse(prompt=prompt, context=context)\n",
-   "user_session.store(prompt, response)\n",
+   "user_history.store(prompt, response)\n",
    "print('USER: ', prompt)\n",
    "print('\\nLLM: ', response)"
   ]
@@ -561,7 +561,7 @@
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-   "## Semantic session memory"
+   "## Semantic message history"
   ]
  },
  {
@@ -608,10 +608,10 @@
   ],
   "source": [
    "prompt = \"Can you give me the avocado one?\"\n",
-   "user_session.set_distance_threshold(0.75)\n",
-   "context = user_session.get_relevant(prompt=prompt)\n",
+   "user_history.set_distance_threshold(0.75)\n",
+   "context = user_history.get_relevant(prompt=prompt)\n",
    "response = client.converse(prompt=prompt, context=context)\n",
-   "user_session.store(prompt, response)\n",
+   "user_history.store(prompt, response)\n",
    "print('USER: ', prompt)\n",
    "print('\\nLLM: ', response)"
   ]
```
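The semantic cells swap `get_recent` for `get_relevant`: instead of the last N turns, `SemanticMessageHistory` returns only prior messages whose vector distance to the prompt falls within the configured threshold. A small sketch of the contrast, reusing `user_history` from this notebook (the "smaller means stricter" reading of the threshold is an assumption about the distance metric):

```python
# recency-based: the last few turns, regardless of topic
recent_ctx = user_history.get_recent(top_k=5)

# semantic: only turns related to the prompt, gated by a distance cutoff
user_history.set_distance_threshold(0.75)  # assumed: smaller values admit fewer, closer matches
relevant_ctx = user_history.get_relevant(prompt="Can you give me the avocado one?")

# with a tight threshold, unrelated small talk drops out of the context window
print(len(recent_ctx), len(relevant_ctx))
```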
```diff
@@ -648,7 +648,7 @@
   "metadata": {},
   "outputs": [],
   "source": [
-   "user_session.clear()"
+   "user_history.clear()"
   ]
  }
 ],
```

python-recipes/llm-session-manager/01_multiple_sessions.ipynb renamed to python-recipes/llm-message-history/01_multiple_sessions.ipynb

Lines changed: 22 additions & 22 deletions
```diff
@@ -6,13 +6,13 @@
   "source": [
    "![Redis](https://redis.io/wp-content/uploads/2024/04/Logotype.svg?auto=webp&quality=85,75&width=120)\n",
    "\n",
-   "# LLM Session Memory - Multiple Sessions\n",
+   "# LLM Message History - Multiple Sessions\n",
    "\n",
    "Large Language Models are inherently stateless and have no knowledge of previous interactions with a user, or even of previous parts of the current conversation. The solution to this problem is to append the previous conversation history to each subsequent call to the LLM.\n",
-   "This notebook will show how to use Redis to structure and store and retrieve this conversational session memory and how to manage multiple sessions simultaneously.\n",
+   "This notebook will show how to use Redis to structure and store and retrieve this conversational message history and how to manage multiple conversation sessions simultaneously.\n",
    "\n",
    "## Let's Begin!\n",
-   "<a href=\"https://colab.research.google.com/github/redis-developer/redis-ai-resources/blob/main/python-recipes/session-manager/01_multiple_sessions.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n"
+   "<a href=\"https://colab.research.google.com/github/redis-developer/redis-ai-resources/blob/main/python-recipes/llm-message-history/01_multiple_sessions.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n"
   ]
  },
  {
@@ -28,7 +28,7 @@
   "metadata": {},
   "outputs": [],
   "source": [
-   "%pip install cohere \"redisvl>=0.4.1\" sentence-transformers"
+   "%pip install cohere \"redisvl>=0.6.0\" sentence-transformers"
   ]
  },
  {
@@ -150,7 +150,7 @@
    "        return response.text\n",
    "\n",
    "    def remap(self, context) -> List[Dict]:\n",
-   "        ''' re-index the chat history to match the Cohere API requirements '''\n",
+   "        ''' re-index the message history to match the Cohere API requirements '''\n",
    "        new_context = []\n",
    "        for statement in context:\n",
    "            if statement[\"role\"] == \"user\":\n",
@@ -171,9 +171,9 @@
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-   "### Import SemanticSessionManager\n",
+   "### Import SemanticMessageHistory\n",
    "\n",
-   "redisvl provides the SemanticSessionManager for easy management of session state.\n",
+   "redisvl provides the SemanticMessageHistory for easy management of conversational message history state.\n",
    "It also allows for tagging of messages to separate conversation sessions with the `session_tag` optional parameter.\n",
    "Let's create a few personas that can talk to our AI.\n"
   ]
```
```diff
@@ -195,16 +195,16 @@
   "metadata": {},
   "outputs": [],
   "source": [
-   "from redisvl.extensions.session_manager import SemanticSessionManager\n",
+   "from redisvl.extensions.message_history import SemanticMessageHistory\n",
    "\n",
-   "session = SemanticSessionManager(name='budgeting help')"
+   "history = SemanticMessageHistory(name='budgeting help')"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-   "#### Here we'll have multiple separate conversations simultaneously, all using the same session manager.\n",
+   "#### Here we'll have multiple separate conversations simultaneously, all using the same message history object.\n",
    "#### Let's add some conversation history to get started.\n",
    "\n",
    "#### We'll assign each message to one of our users with their own `session_tag`."
@@ -217,7 +217,7 @@
   "outputs": [],
   "source": [
    "# adding messages to the student session\n",
-   "session.add_messages(\n",
+   "history.add_messages(\n",
    "    [{\"role\":\"system\",\n",
    "      \"content\":\"You are a personal assistant helping people create sound financial budgets. Be very brief and concise in your responses.\"},\n",
    "     {\"role\":\"user\",\n",
@@ -230,7 +230,7 @@
    "    session_tag=student)\n",
    "\n",
    "#adding messages to the young professional session\n",
-   "session.add_messages(\n",
+   "history.add_messages(\n",
    "    [{\"role\":\"system\",\n",
    "      \"content\":\"You are a personal assistant helping people create sound financial budgets. Be very brief and concise in your responses.\"},\n",
    "     {\"role\":\"user\",\n",
@@ -243,7 +243,7 @@
    "    session_tag=yp)\n",
    "\n",
    "#adding messages to the retiree session\n",
-   "session.add_messages(\n",
+   "history.add_messages(\n",
    "    [{\"role\":\"system\",\n",
    "      \"content\":\"You are a personal assistant helping people create sound financial budgets. Be very brief and concise in your responses.\"},\n",
    "     {\"role\":\"user\",\n",
@@ -260,7 +260,7 @@
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-   "#### With the same session manager calling the same LLM we can handle distinct conversations. There's no need to instantiate separate classes or clients.\n",
+   "#### With the same message history instance and calling the same LLM we can handle distinct conversations. There's no need to instantiate separate classes or clients.\n",
    "\n",
    "#### Just retrieve the conversation of interest using the same `session_tag` parameter when fetching context."
   ]
```
```diff
@@ -282,9 +282,9 @@
   ],
   "source": [
    "prompt = \"What is the single most important thing I should focus on financially?\"\n",
-   "context = session.get_recent(session_tag=student)\n",
+   "context = history.get_recent(session_tag=student)\n",
    "response = client.converse(prompt=prompt, context=context)\n",
-   "session.store(prompt, response, session_tag=student)\n",
+   "history.store(prompt, response, session_tag=student)\n",
    "print('Student: ', prompt)\n",
    "print('\\nLLM: ', response)"
   ]
@@ -306,9 +306,9 @@
   ],
   "source": [
    "prompt = \"What is the single most important thing I should focus on financially?\"\n",
-   "context = session.get_recent(session_tag=yp)\n",
+   "context = history.get_recent(session_tag=yp)\n",
    "response = client.converse(prompt=prompt, context=context)\n",
-   "session.store(prompt, response, session_tag=yp)\n",
+   "history.store(prompt, response, session_tag=yp)\n",
    "print('Young Professional: ', prompt)\n",
    "print('\\nLLM: ', response)"
   ]
@@ -330,9 +330,9 @@
   ],
   "source": [
    "prompt = \"What is the single most important thing I should focus on financially?\"\n",
-   "context = session.get_recent(session_tag=retired)\n",
+   "context = history.get_recent(session_tag=retired)\n",
    "response = client.converse(prompt=prompt, context=context)\n",
-   "session.store(prompt, response, session_tag=retired)\n",
+   "history.store(prompt, response, session_tag=retired)\n",
    "print('Retiree: ', prompt)\n",
    "print('\\nLLM: ', response)"
   ]
@@ -362,7 +362,7 @@
   }
  ],
  "source": [
-   "for ctx in session.get_recent(session_tag=student):\n",
+   "for ctx in history.get_recent(session_tag=student):\n",
    "    print(ctx)"
  ]
 },
@@ -372,7 +372,7 @@
   "metadata": {},
   "outputs": [],
   "source": [
-   "session.clear()"
+   "history.clear()"
   ]
  }
 ],
```
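Taken together, this notebook's changes reduce to one `SemanticMessageHistory` shared across conversations, with `session_tag` partitioning reads and writes. A condensed, self-contained sketch, assuming a local Redis and using illustrative tag strings in place of the notebook's `student`/`yp`/`retired` variables (whose definitions the diff does not show):

```python
from redisvl.extensions.message_history import SemanticMessageHistory

history = SemanticMessageHistory(name="budgeting help")  # assumes a local Redis

# hypothetical tags; the notebook defines student/yp/retired elsewhere
student, retiree = "session:student", "session:retiree"

history.add_messages(
    [{"role": "system", "content": "You are a budgeting assistant."},
     {"role": "user", "content": "I am a student on a tight budget."}],
    session_tag=student)

history.add_messages(
    [{"role": "system", "content": "You are a budgeting assistant."},
     {"role": "user", "content": "I am retired and living on a pension."}],
    session_tag=retiree)

# the same tag keeps each conversation's reads and writes isolated
for msg in history.get_recent(session_tag=student):
    print(msg)

history.clear()  # wipe the stored history, as the notebook does at the end
```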
