Chat context/metadata in function calls #17386
---
Yes, in LlamaIndex you can achieve this by using the `Mem0Context` and `Mem0Memory` classes. Here's an example of how you can set this up:

```python
from llama_index.memory.mem0.base import Mem0Context, Mem0Memory

# Create a context with a user_id
context = Mem0Context(user_id="your_user_id")

# Initialize Mem0Memory with the context
memory = Mem0Memory(context=context)

# Now, when you call methods like add or search, the user_id is automatically included
memory.add(messages="Your message here")
```

This setup allows you to pass user-specific metadata, like a `user_id`, without including it explicitly in every call.
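The idea above can be illustrated without any external dependencies. The following is a minimal, hypothetical sketch of the same pattern: a context object is bound once at construction time, and every subsequent store or search call attaches the `user_id` automatically. The class and field names here are illustrative, not LlamaIndex or Mem0 APIs.

```python
from dataclasses import dataclass, field

@dataclass
class UserContext:
    """Per-user context bound once, reused on every call (illustrative)."""
    user_id: str

@dataclass
class ContextualMemory:
    """Toy in-memory store that tags every record with the bound user_id."""
    context: UserContext
    _store: list = field(default_factory=list)

    def add(self, message: str) -> dict:
        # The caller never passes user_id; it comes from the bound context.
        record = {"user_id": self.context.user_id, "message": message}
        self._store.append(record)
        return record

    def search(self, query: str) -> list:
        # Results are implicitly scoped to the bound user.
        return [
            r for r in self._store
            if r["user_id"] == self.context.user_id and query in r["message"]
        ]

memory = ContextualMemory(context=UserContext(user_id="u-123"))
memory.add("Prefers non-smoking rooms")
```

After this, `memory.search("non-smoking")` returns only records tagged with `"u-123"`, mirroring how a context-bound memory scopes calls to one user.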
---
I'll try to ask my question with a scenario from my use case:

Say I have a function `reserve_room` whose purpose is to fill out a form through an API call and book a hotel room on a person's behalf. This tool is made available to a function-calling LLM so that a chatbot can reserve hotel rooms. Now imagine that each registered user is assigned a `user_id`, and attached to that `user_id` is the user's personal information (such as phone number, email, address, etc.). The API call that `reserve_room` has to make to actually reserve a room has many required fields, including the personal information attached to the `user_id`. I do not want to explicitly ask the user for all of those required fields; I want the function to be able to access that user's `user_id` without the user having to provide it to the LLM when they ask how to book a hotel room.

More succinctly: I would like to call a function `reserve_room` through my LLM, and I want to pass some form of context/metadata which I don't have to explicitly ask the user for (i.e., including said metadata as arguments to the `reserve_room` function).

For instance, something like this (without the LLM asking the user for `metadata`):

Instead of having to do this:

Is this possible using LlamaIndex (or at all)?
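One common way to get the behavior described above is to bind the metadata into the tool callable *before* handing it to the LLM, so the function schema the model sees never includes `user_id`. The sketch below uses `functools.partial` for that binding; the function names and the returned dict are hypothetical stand-ins for a real booking API. In LlamaIndex, the resulting callable could then be wrapped as a tool (e.g. with `FunctionTool.from_defaults`), though that wrapping is not shown here.

```python
from functools import partial

def reserve_room(hotel: str, nights: int, *, user_id: str) -> dict:
    # Hypothetical booking call. A real implementation would look up the
    # user's profile (phone, email, address) by user_id and submit the
    # reservation API request with all required fields filled in.
    return {"hotel": hotel, "nights": nights, "booked_for": user_id}

def make_reserve_tool(user_id: str):
    # Fix user_id via partial application: the LLM-facing signature is now
    # just (hotel, nights), so the model never asks the user for user_id.
    return partial(reserve_room, user_id=user_id)

# At request time, the application knows who is logged in:
tool = make_reserve_tool("u-123")
result = tool("Grand Hotel", 2)
```

The design choice here is that per-user context lives in application code (the closure), not in the prompt, which keeps sensitive identifiers out of the model's context entirely.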