How to use ConversationSummaryMemory in a RunnableSequence? #4966
-
Example Code

```ts
const model = new ChatOpenAI({
  temperature: 0,
  verbose: false,
});

const chatHistory = new RedisChatMessageHistory({
  sessionId: `chat_history/${chatId}`,
  sessionTTL: parseInt(process.env.REDIS_TTL || "86400"),
  client: redis,
});

const memory = new ConversationSummaryMemory({
  chatHistory: chatHistory,
  llm: model,
  memoryKey: "history",
  inputKey: "question",
  outputKey: "answer",
  returnMessages: false,
});

const chain = RunnableSequence.from([
  ChatPromptTemplate.fromMessages([
    SystemMessagePromptTemplate.fromTemplate(`
${PROMPT}
------------
CHAT HISTORY: {history}
`),
    HumanMessagePromptTemplate.fromTemplate(`
QUESTION: {question}
`),
  ]),
  model,
  new StringOutputParser(),
]);

const main = RunnableSequence.from([
  // Gets the history of the conversation
  RunnablePassthrough.assign({
    mem: async () => memory.loadMemoryVariables({}),
  }),
  RunnablePassthrough.assign({
    history: (input: any) => input.mem[memory.memoryKeys[0]],
  }),
  RunnablePassthrough.assign({
    answer: chain,
  }),
]);

const inputs = { question: "Example question here...." };
const outputs = await main.invoke(inputs);
await memory.saveContext(inputs, outputs);
```

Description

I am having a hard time implementing a ConversationSummaryMemory in a RunnableSequence. I would like the `history` variable to contain the summary of the conversation so far, but every time I run this code I can see that, although the chatHistory contains the previous messages, the summary is always empty. What is the correct way to use it? Thank you!

System Info

[email protected] | MIT | deps: 17 | versions: 261
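For context, the behavior described above can be reproduced with a minimal self-contained sketch. This is a mock, not the real LangChain classes: it only illustrates the pattern (confirmed later in this thread) that `ConversationSummaryMemory` refreshes its internal summary buffer inside `saveContext()`, so `loadMemoryVariables()` returns an empty summary until at least one `saveContext()` call has happened.

```typescript
// Mock sketch (NOT the real LangChain classes): the running summary lives in
// an internal buffer that is only refreshed inside saveContext(), so
// loadMemoryVariables() returns an empty summary until saveContext() runs.
type Message = { role: "human" | "ai"; content: string };

class MockSummaryMemory {
  private buffer = ""; // running summary, empty until saveContext() runs
  private messages: Message[] = [];

  async loadMemoryVariables(_: object): Promise<{ history: string }> {
    return { history: this.buffer };
  }

  async saveContext(
    inputs: { question: string },
    outputs: { answer: string },
  ): Promise<void> {
    this.messages.push({ role: "human", content: inputs.question });
    this.messages.push({ role: "ai", content: outputs.answer });
    // Stand-in for the LLM call that predicts the new summary:
    this.buffer = this.messages
      .map((m) => `${m.role}: ${m.content}`)
      .join("; ");
  }
}

async function demo(): Promise<[string, string]> {
  const memory = new MockSummaryMemory();
  const before = (await memory.loadMemoryVariables({})).history; // empty
  await memory.saveContext({ question: "Hi" }, { answer: "Hello!" });
  const after = (await memory.loadMemoryVariables({})).history; // populated
  return [before, after];
}
```

Under this model, a sequence that reads `history` before the first `saveContext()` of a session will always see an empty summary, which matches the symptom reported above.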
-
Hey @mauriciocirelli! After diving into the details of your implementation and considering the behavior you're experiencing, it seems like the issue might be related to how the `ConversationSummaryMemory` is wired into your `RunnableSequence`. From your description, it appears that the summary is not being generated before your chain reads the `history` variable. Here are a few steps and considerations to ensure `ConversationSummaryMemory` works as expected:

1. Confirm that the `chatHistory` is actually being populated with the previous messages.
2. Confirm that `memory.loadMemoryVariables({})` is being called and that its result is mapped onto the `history` key your prompt expects.

Given the complexity of integrating `ConversationSummaryMemory` with a `RunnableSequence`, some experimentation may be needed. If after these steps the issue persists, it might be helpful to reach out to the LangChain community or check if there are any known issues related to `ConversationSummaryMemory`. I hope these suggestions help you move forward with your implementation. If you have any more questions or need further assistance, feel free to reach out!
-
Hi! I have checked steps (1) and (2). However, I do not want a list of previous messages. I need the summary of the conversation. Looking at LangSmith, it seems that the summary is only built when I call `memory.saveContext()`. I have confirmed this behavior:

```ts
logger.info(JSON.stringify(await memory.loadMemoryVariables({}))); // Empty!

const inputs = { question: question };
const outputs = await chain.invoke(inputs);
await memory.saveContext(inputs, outputs);

logger.info(JSON.stringify(await memory.loadMemoryVariables({}))); // Summary is available here!
```

Is there a way to make it build the summary before calling the chain?
-
Hi! I managed to get this to work by adding the following RunnableSequence to my chain:

```ts
const memoryChain = RunnableSequence.from([
  RunnablePassthrough.assign({
    msgs: async () => chatHistory.getMessages(),
  }),
  async (input: any) => memory.predictNewSummary(input.msgs, ""),
]);
```

and replacing the `history` assignment in the main sequence with:

```ts
// Gets the history of the conversation
RunnablePassthrough.assign({
  history: memoryChain,
}),
```
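The fix above can be sketched without LangChain as follows. The helper names here (`predictNewSummary`, `memoryChain`, `chatHistoryStore`) are stand-ins for the real calls; the point is only the ordering: rebuild the summary from the raw chat history *before* the answer chain runs, instead of reading the memory buffer that is only refreshed after `saveContext()`.

```typescript
// Self-contained sketch of the pattern above (mock helpers, not LangChain).
type Msg = { role: string; content: string };

// Stand-in for the Redis-backed chat history.
const chatHistoryStore: Msg[] = [
  { role: "human", content: "What is LangChain?" },
  { role: "ai", content: "A framework for LLM apps." },
];

// Stand-in for memory.predictNewSummary(messages, existingSummary), which in
// the real code asks the LLM for an updated summary.
async function predictNewSummary(messages: Msg[], existing: string): Promise<string> {
  const lines = messages.map((m) => `${m.role}: ${m.content}`);
  return [existing, ...lines].filter(Boolean).join(" | ");
}

// Equivalent of memoryChain: fetch the stored messages, then summarize them.
async function memoryChain(): Promise<string> {
  const msgs = chatHistoryStore; // chatHistory.getMessages() in the real code
  return predictNewSummary(msgs, "");
}

// Equivalent of the main sequence: history is freshly computed by the time
// the prompt would be formatted, even on the first turn of a session.
async function main(question: string): Promise<{ question: string; history: string }> {
  const history = await memoryChain();
  return { question, history };
}
```

Note the trade-off: this recomputes the summary with an extra LLM call on every invocation, rather than reusing the buffer that `saveContext()` maintains.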