Replies: 1 comment
-
🤖 Hello,

To incorporate conversational memory into your custom tool that uses an LLM Chain without making it global, you can use the `ZepMemory` class and pass it to the chain via its `memory` argument. Here is an example of how you can use it:

```python
from langchain.chains import ConversationChain
from langchain.llms import OpenAI
from langchain.memory.zep_memory import ZepMemory

session_id = "some-session-id"         # Placeholder
ZEP_API_URL = "http://localhost:8000"  # Placeholder

# Create a new instance of ZepMemory
memory = ZepMemory(
    session_id=session_id,     # Identifies your user or a user's session
    url=ZEP_API_URL,           # Your Zep server's URL
    api_key="<your_api_key>",  # Optional
    memory_key="history",      # Ensure this matches the key used in the chain's prompt template
    return_messages=True,      # Does your prompt template expect a string or a list of Messages?
)

# Create a new instance of ConversationChain with OpenAI as the language model
# and ZepMemory as the memory store
conversation = ConversationChain(llm=OpenAI(), memory=memory)
```

In this example, the `ZepMemory` instance is passed to the `ConversationChain` through its `memory` argument, so the memory is scoped to that chain rather than held globally. You can find more details about the `ZepMemory` class in the LangChain codebase.

I hope this helps! If you have any other questions, feel free to ask.

Best,
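A quick usage sketch to round this out (assuming a running Zep server and an `OPENAI_API_KEY` in the environment; the prompts are illustrative):

```python
# Each call loads prior turns from the Zep session and saves the new exchange back.
conversation.predict(input="Hi, my name is Alice.")
reply = conversation.predict(input="What is my name?")  # history is injected automatically
print(reply)
```

Because the memory is attached to this `conversation` instance, a chain built with a different `session_id` keeps a separate history.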
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
-
I have a custom tool which internally uses an LLM Chain. However, I want to use conversational memory within the tool. How do I pass the memory to the tool? I would prefer not to have the conversational memory be global.
I passed the memory in the `Tool.from_function` call and it threw an error that the `memory` argument is invalid.
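One workaround is to attach the memory to the chain itself and let the tool's function close over that chain, since `Tool.from_function` does not accept a `memory` argument. A minimal sketch (names such as `answer_with_history`, the tool name, and the prompt text are illustrative):

```python
from langchain.agents import Tool
from langchain.chains import LLMChain
from langchain.llms import OpenAI
from langchain.memory import ConversationBufferMemory
from langchain.prompts import PromptTemplate

# The memory is attached to the chain, so it stays local to this tool
# rather than living as a global object.
prompt = PromptTemplate(
    input_variables=["history", "input"],
    template="Conversation so far:\n{history}\nUser: {input}\nAssistant:",
)
chain = LLMChain(
    llm=OpenAI(),
    prompt=prompt,
    memory=ConversationBufferMemory(memory_key="history"),
)

def answer_with_history(query: str) -> str:
    # The closure captures the chain (and therefore its memory).
    return chain.run(input=query)

tool = Tool.from_function(
    func=answer_with_history,
    name="memory_aware_tool",
    description="Answers questions using the conversation so far.",
)
```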