diff --git a/examples/Long_Term_Memory.ipynb b/examples/Long_Term_Memory.ipynb
new file mode 100644
index 000000000..3a24b2cda
--- /dev/null
+++ b/examples/Long_Term_Memory.ipynb
@@ -0,0 +1,651 @@
+{
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "_C3My7K4rz4I"
+ },
+ "source": [
+ "##### Copyright 2025 Google LLC."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "cellView": "form",
+ "id": "8TeA4FziWLdQ"
+ },
+ "outputs": [],
+ "source": [
+ "# @title Licensed under the Apache License, Version 2.0 (the \"License\");\n",
+ "#\n",
+ "# Licensed under the Apache License, Version 2.0 (the \"License\");\n",
+ "# you may not use this file except in compliance with the License.\n",
+ "# You may obtain a copy of the License at\n",
+ "#\n",
+ "# https://www.apache.org/licenses/LICENSE-2.0\n",
+ "#\n",
+ "# Unless required by applicable law or agreed to in writing, software\n",
+ "# distributed under the License is distributed on an \"AS IS\" BASIS,\n",
+ "# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n",
+ "# See the License for the specific language governing permissions and\n",
+ "# limitations under the License."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "4L89T6n6qibs"
+ },
+ "source": [
+    "## Long-Term Memory Layer using Gemini, Qdrant, and Mem0"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "IdmdApM4su03"
+ },
+ "source": [
+ "## Overview\n",
+ "\n",
+    "LLMs have no native memory. Their “awareness” is limited to what fits in the context window, and once that window scrolls, everything vanishes. This works for one-off tasks but fails catastrophically in real-world use:\n",
+    "\n",
+    "- You set your AI travel bot to “only book window seats.” It books you a middle seat — again.\n",
+    "- You tell your AI grocery list app you’re allergic to nuts. Next week, it suggests almond milk.\n",
+    "- A personal planner suggests a morning meeting, even though you’ve said repeatedly you don’t work before 10 a.m.\n",
+    "\n",
+    "These aren’t bugs; they’re symptoms of statelessness. Memory persistence solves this. It gives applications the ability to remember what happened earlier and apply that learning later.\n",
+ "\n",
+    "In this notebook, we will build a personalized travel agent with a long-term memory layer that can store and retrieve your preferences when recommending travel destinations and planning itineraries. The memory layer should be able to add, update, and search interactions based on your preferences. You will also see how to use this memory layer with Gemini and the [Qdrant](https://cloud.qdrant.io/) client, including how to configure the SYSTEM PROMPT.\n",
+ "\n",
+ "> Author Details\n",
+ "\n",
+ "- Author: Tarun Jain\n",
+ "- GitHub: [@lucifertrj](https://github.com/lucifertrj)\n",
+    "- LinkedIn: [@Tarun R Jain](https://www.linkedin.com/in/jaintarun75)\n",
+ "\n",
+ "## Prerequisites\n",
+ "\n",
+ "You can run this quickstart in Google Colab.\n",
+ "\n",
+ "To complete this quickstart on your own development environment, ensure that your environment meets the following requirements:\n",
+ "\n",
+ "- Python 3.11+\n",
+ "- An installation of `jupyter` to run the notebook.\n",
+ "\n",
+ "## Setup\n",
+ "\n",
+ "First, download and install the Gemini API Python library and Mem0 package."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "UrXw_1iWpElW"
+ },
+ "outputs": [],
+ "source": [
+ "%pip install mem0ai google-genai"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "6srtuFbJtmlD"
+ },
+ "source": [
+ "## Grab an API Key\n",
+ "\n",
+ "Before you can use the Gemini API, you must first obtain an API key. If you don't already have one, create a key with one click in Google AI Studio.\n",
+ "\n",
+    "[Get an API key](https://aistudio.google.com/app/apikey)\n",
+ "\n",
+ "In Colab, add the key to the secrets manager under the \"🔑\" in the left panel. Give it the name `GEMINI_API_KEY`.\n",
+ "\n",
+ "Once you have the API key, pass it to the SDK. You can do this in two ways:\n",
+ "\n",
+ "* Put the key in the `GEMINI_API_KEY` environment variable (the SDK will automatically pick it up from there).\n",
+ "* Pass the key to `genai.Client(api_key=...)`"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 2,
+ "metadata": {
+ "id": "z8YHJRqpuDaX"
+ },
+ "outputs": [],
+ "source": [
+ "import os\n",
+ "from google.genai import Client"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 3,
+ "metadata": {
+ "id": "cFwsaWb4t2zq"
+ },
+ "outputs": [],
+ "source": [
+    "# Replace the placeholder with your own API key.\n",
+    "# Never hardcode or commit real API keys.\n",
+    "os.environ[\"GEMINI_API_KEY\"] = \"YOUR_GEMINI_API_KEY\""
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 4,
+ "metadata": {
+ "id": "xSgkZKN5tr0u"
+ },
+ "outputs": [],
+ "source": [
+ "llm_client = Client()"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "3FSrgmEPuC4R"
+ },
+ "source": [
+ "## Define the Memory Configuration\n",
+ "\n",
+    "To save and retrieve memory as context, we need an embedding model, a vector store for storage, and an LLM to summarize and store preferences. We define all three (LLM, vector store, and embedder) in a single unified config. This lets the Memory client use a custom setup instead of the default OpenAI models, which we replace with Gemini here.\n",
+ "\n",
+ "You will use:\n",
+ "\n",
+    "- LLM: Gemini 2.5 Flash Lite (used only for extracting and summarizing memory preferences)\n",
+    "- Vector Store: Qdrant\n",
+    "- Embeddings: ``models/text-embedding-004`` (768 dimensions)\n",
+ "\n",
+ "## Why use a Vector Store?\n",
+ "\n",
+    "Mem0's default ``embedding_model_dims`` is 1536, which is sized for OpenAI embeddings. When you bring your own embedding model, you must set this dimension to match. Since ``models/text-embedding-004`` produces 768-dimensional vectors, define the vector store config with ``embedding_model_dims: 768``.\n",
+ "\n",
+ "### Get your vector database credentials:\n",
+ "\n",
+ "- Create your [Qdrant Cloud](https://cloud.qdrant.io/) account.\n",
+ "- After signing in, create a free cluster by deploying it on a GCP instance in the us-east region or any other region of your choice.\n",
+ "- Save the endpoint URL that ends with port 6333 along with the API key."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 5,
+ "metadata": {
+ "id": "i2VfW98quHOs"
+ },
+    "outputs": [],
+ "source": [
+ "from mem0 import Memory"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 9,
+ "metadata": {
+ "id": "M7HM3ImNSpBd"
+ },
+ "outputs": [],
+ "source": [
+ "PROVIDER = \"gemini\"\n",
+ "LLM_MODEL = \"gemini-2.5-flash-lite\"\n",
+ "EMBEDDER_MODEL = \"models/text-embedding-004\""
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 10,
+ "metadata": {
+ "id": "Y_bugzEFNjCi"
+ },
+ "outputs": [],
+ "source": [
+    "# Replace the placeholders with your own Qdrant Cloud credentials; never commit real keys.\n",
+    "QDRANT_API_KEY = \"YOUR_QDRANT_API_KEY\"\n",
+    "QDRANT_URL = \"YOUR_QDRANT_CLUSTER_URL\"  # ends with port 6333\n",
+ "COLLECTION_NAME = \"gemini-memory-v1\""
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 11,
+ "metadata": {
+ "id": "jWcToGSBtUah"
+ },
+ "outputs": [],
+ "source": [
+ "config = {\n",
+ " \"llm\": {\n",
+ " \"provider\": PROVIDER,\n",
+ " \"config\": {\n",
+ " \"model\": LLM_MODEL,\n",
+ " \"temperature\": 0.7,\n",
+ " }\n",
+ " },\n",
+ " \"vector_store\": {\n",
+ " \"provider\": \"qdrant\",\n",
+ " \"config\": {\n",
+ " \"url\": QDRANT_URL,\n",
+ " \"api_key\": QDRANT_API_KEY,\n",
+ " \"collection_name\": COLLECTION_NAME,\n",
+ " \"embedding_model_dims\": 768,\n",
+ " }\n",
+ " },\n",
+ " \"embedder\": {\n",
+ " \"provider\": PROVIDER,\n",
+ " \"config\": {\n",
+ " \"model\": EMBEDDER_MODEL\n",
+ " }\n",
+ " }\n",
+ "}"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 12,
+ "metadata": {
+ "id": "jZSojDmgu614"
+ },
+    "outputs": [],
+ "source": [
+ "client = Memory.from_config(config)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "73bx2I2gOHLD"
+ },
+ "source": [
+ "## Add initial preference to the Memory\n",
+ "\n",
+ "Every user interaction carries a hint of preference — what they said, how they said it, and what truly matters to them. We capture this and store it in the Memory so the Agent can recall it later.\n",
+ "\n",
+ "The goal is to make responses personalized instead of generic. When you ask something related to your preferences, the Agent refers to the saved memory before generating a reply.\n",
+ "\n",
+ "For example, if your preference for Indian Jain food is stored, then every time you ask for restaurant recommendations while traveling, a Gemini-based chatbot will suggest only places that serve Jain food rather than random options."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 13,
+ "metadata": {
+ "id": "K2FJA_HHu_Qs"
+ },
+ "outputs": [],
+ "source": [
+ "messages = [\n",
+ " {\"role\": \"user\", \"content\": \"What is the must try food in India\"},\n",
+ " {\"role\": \"assistant\", \"content\": \"You should try street foods like chaat and regional dishes such as biryani or dosa.\"},\n",
+ " {\"role\": \"user\", \"content\": \"I'm not into street food, I prefer proper Indian thalis or Jain food\"},\n",
+ " {\"role\": \"assistant\", \"content\": \"Then you should try Gujarati or Rajasthani thalis, they offer a complete and authentic Indian meal experience with Jain meal\"},\n",
+ "]\n"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 14,
+ "metadata": {
+ "id": "TG-SwrF7vAV1"
+ },
+ "outputs": [],
+ "source": [
+ "result1 = client.add(messages, user_id=\"personal\", metadata={\"category\": \"food\"})"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 15,
+ "metadata": {
+ "id": "U5rONV9aBr8d"
+ },
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "{'results': [{'id': '6f13b1d8-b3e8-425e-a8d8-071ad362fb3e', 'memory': 'Prefers Indian thalis or Jain food over street food', 'event': 'ADD'}]}\n"
+ ]
+ }
+ ],
+ "source": [
+ "print(result1)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "l6DVbcy3RHj8"
+ },
+ "source": [
+    "Once you execute ``client.add``, each extracted memory gets an ``id`` (a hash that links the entry to your preference in the vector store). In addition to the ``memory`` text, each result carries an ``event`` attribute that denotes the action applied to the stored preference: ``ADD``, ``UPDATE``, or ``DELETE``."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 16,
+ "metadata": {
+ "id": "JXKwVHFY-Wwi"
+ },
+ "outputs": [],
+ "source": [
+ "messages2 = [\n",
+ " {\"role\": \"user\", \"content\": \"I'm planning to travel to Hong Kong which Airlines to use from Bangalore\"},\n",
+    "    {\"role\": \"assistant\", \"content\": \"Cathay Pacific is the best option and has direct flights. Any preferences?\"},\n",
+    "    {\"role\": \"user\", \"content\": \"Yes, I need a Hindu Vegetarian meal and prefer a window or aisle seat\"},\n",
+    "    {\"role\": \"assistant\", \"content\": \"Thank you, I have noted that you prefer a Hindu Vegetarian meal and a window or aisle seat.\"},\n",
+ "]"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 17,
+ "metadata": {
+ "id": "MLhvuDbw-YF7"
+ },
+ "outputs": [],
+ "source": [
+ "result2 = client.add(messages2, user_id=\"personal\", metadata={\"category\": \"travel\"})"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 18,
+ "metadata": {
+ "id": "sHl_Mn3-nxkq"
+ },
+ "outputs": [
+ {
+ "data": {
+ "text/plain": [
+ "{'results': [{'id': '276c308a-b7dd-4dc8-bf0d-e3e3111d6f18',\n",
+ " 'memory': 'Planning to travel to Hong Kong from Bangalore',\n",
+ " 'event': 'ADD'},\n",
+ " {'id': '9e52ba8d-7e90-4eed-93f4-73b870f6994d',\n",
+ " 'memory': 'Needs Hindu Vegetarian meal',\n",
+ " 'event': 'ADD'},\n",
+ " {'id': 'ce60b1c2-4db3-49ab-8ad5-62d1d07a9ed9',\n",
+ " 'memory': 'Prefers window seat or aisle seat',\n",
+ " 'event': 'ADD'}]}"
+ ]
+ },
+ "execution_count": 18,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "result2"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "fCONHuBvTphm"
+ },
+ "source": [
+    "Now the memory also knows that I prefer a window or aisle seat and a Hindu Vegetarian meal while travelling. This becomes critical in recommendation systems, where personalized responses carry more weight than generic ones."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "TjZE1kTcvIDo"
+ },
+ "source": [
+ "## Search - Inference on new suggestion\n",
+ "\n",
+ "To fetch the relevant context from memory, we just have to use the search function along with the user_id used to save the interactions."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 19,
+ "metadata": {
+ "id": "4cUZ1Z4Eyw13"
+ },
+ "outputs": [],
+ "source": [
+ "query = \"I am travelling to New york, suggest food places to try\""
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 20,
+ "metadata": {
+ "id": "LB9T2Ac7zJ6k"
+ },
+ "outputs": [],
+ "source": [
+    "memories = client.search(query, user_id=\"personal\", limit=30)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 21,
+ "metadata": {
+ "id": "Z6rE0cuiqqJo"
+ },
+ "outputs": [],
+ "source": [
+ "context = \"\\n\".join(f\"- {m['memory']}\" for m in memories['results'])"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 22,
+ "metadata": {
+ "id": "WumRhviWujp5"
+ },
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "- Prefers Indian thalis or Jain food over street food\n",
+ "- Needs Hindu Vegetarian meal\n",
+ "- Planning to travel to Hong Kong from Bangalore\n",
+ "- Prefers window seat or aisle seat\n"
+ ]
+ }
+ ],
+ "source": [
+ "print(context)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "nVmLPZCwUU-F"
+ },
+ "source": [
+    "> So far, we only have the context. Now it's time to pass this context to the large language model, i.e., ``gemini-2.5-pro``.\n",
+    "> To make the response personalized, pair it with a solid SYSTEM PROMPT that sets up the role of an expert executive assistant."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "ovMAQQRu9IEA"
+ },
+ "source": [
+ "## Generate LLM Response using Gemini 2.5 Pro"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "PSAMlRTWUpBc"
+ },
+ "source": [
+ "Now, all the interactions you saved, like your previous food preferences, will come into play. Next time you travel to a new place, say New York, and ask for restaurant suggestions, it will prioritize your preferences instead of giving generic recommendations."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 28,
+ "metadata": {
+ "id": "YJPwG5WI-JHQ"
+ },
+ "outputs": [],
+ "source": [
+ "SYSTEM_PROMPT = \"\"\"\n",
+ " You are an expert executive assistant who thinks carefully before responding,\n",
+    "    adapting a polite communication style based on the user's established PREFERENCES and the complexity of their query.\n",
+ "\n",
+ " Maintain a polished, professional tone that is warm yet efficient and concise for\n",
+ " simple questions, moderate for complex topics, and comprehensive for open-ended discussions.\n",
+ "\n",
+ " Act as a trusted advisor who doesn't just answer questions but adds value through insights, anticipates needs,\n",
+ " and prioritizes what matters most while respecting the user's time with clear, actionable concise responses.\n",
+ " \"\"\""
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 29,
+ "metadata": {
+ "id": "AzTK6r029KHX"
+ },
+ "outputs": [],
+ "source": [
+ "def get_llm_response(query: str, user_id: str) -> str:\n",
+ " # first extract the context out of Mem0 - memory results\n",
+ "\n",
+    "    memories = client.search(query, user_id=user_id, limit=30)\n",
+    "    mem_results = memories['results']\n",
+    "    context = \"\\n\".join(f\"- {m['memory']}\" for m in mem_results)\n",
+    "\n",
+    "    USER_PROMPT = f\"\"\"\n",
+    "    QUESTION: {query}\n",
+    "\n",
+    "    Preference: {context}\n",
+    "    \"\"\"\n",
+ "\n",
+ " # Config the system prompt and make sure to define the input variables inside the USER PROMPT\n",
+ " response = llm_client.models.generate_content(\n",
+ " model=\"gemini-2.5-pro\",\n",
+ " contents=USER_PROMPT,\n",
+ " config={\n",
+ " \"system_instruction\": SYSTEM_PROMPT\n",
+ " }\n",
+ " )\n",
+ " return response.text"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 30,
+ "metadata": {
+ "id": "GmvWPIaM-Y-p"
+ },
+ "outputs": [],
+ "source": [
+    "user_query = \"I need food and restaurant recommendations in New York\""
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 31,
+ "metadata": {
+ "id": "51ccG1Ng_u7d"
+ },
+ "outputs": [],
+ "source": [
+ "response = get_llm_response(user_query, user_id=\"personal\")"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 32,
+ "metadata": {
+ "id": "y0hriqs1vjot"
+ },
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "Of course. It would be my pleasure to provide some dining recommendations in New York that align perfectly with your preference for Hindu vegetarian meals, particularly thalis and Jain options.\n",
+ "\n",
+ "Based on your tastes, I would prioritize these establishments known for their authenticity and quality.\n",
+ "\n",
+ "### Top Recommendations for Indian Vegetarian Dining in NYC\n",
+ "\n",
+ "1. **Vatan NYC**\n",
+ " * **What it is:** An exceptional all-you-can-eat, prix-fixe Gujarati thali experience. The ambiance is designed to resemble a traditional Indian village.\n",
+ " * **Why you'll like it:** This directly meets your preference for thalis in a wonderful, immersive setting. It is entirely vegetarian, and they are well-versed in accommodating Jain dietary needs upon request.\n",
+ " * **Location:** Murray Hill, Manhattan (an area often called \"Curry Hill\").\n",
+ "\n",
+ "2. **Saravanaa Bhavan**\n",
+ " * **What it is:** A globally recognized and highly reliable chain for authentic South Indian vegetarian cuisine.\n",
+ " * **Why you'll like it:** They offer excellent South Indian thalis, dosas, and a vast menu of traditional dishes. It’s a trusted choice for purity and taste.\n",
+ " * **Location:** Multiple locations, with a prominent one in Murray Hill, Manhattan.\n",
+ "\n",
+ "3. **Temple Canteen (Ganesh Temple Canteen)**\n",
+ " * **What it is:** Located in the basement of the Hindu Temple Society of North America, this legendary canteen serves fresh, authentic, and incredibly well-priced South Indian food.\n",
+ " * **Why you'll like it:** For a truly authentic and no-frills experience, this is unmatched. It is pure vegetarian, and the quality is exceptional. While not a formal restaurant, it is far from \"street food.\"\n",
+ " * **Location:** Flushing, Queens (worth the trip for the experience).\n",
+ "\n",
+ "**A quick note:** For Vatan, and even for Saravanaa Bhavan on a weekend evening, making a reservation is highly recommended.\n",
+ "\n",
+ "Please let me know if any of these pique your interest, and I can provide more details or assist with reservations.\n"
+ ]
+ }
+ ],
+ "source": [
+ "print(response)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "wLkFC7clVBnc"
+ },
+ "source": [
+    "If you check the response, it's focused mainly on my preferences rather than on the LLM's base knowledge."
+ ]
+ }
+ ],
+ "metadata": {
+ "colab": {
+ "name": "Long_Term_Memory.ipynb",
+ "toc_visible": true
+ },
+ "kernelspec": {
+ "display_name": "Python 3",
+ "name": "python3"
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 0
+}