How do I work with the BufferMemory? #1588
JosefKousal asked this question in Q&A (unanswered).
So I have this code:
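A simplified sketch of the kind of setup in question (not the exact code; the prompt text, variable names, and import paths are illustrative and differ between langchain versions):

```ts
import { ChatOpenAI } from "langchain/chat_models/openai";
import { ConversationChain } from "langchain/chains";
import { BufferMemory } from "langchain/memory";
import {
  ChatPromptTemplate,
  HumanMessagePromptTemplate,
  MessagesPlaceholder,
  SystemMessagePromptTemplate,
} from "langchain/prompts";

// Chat prompt with a system message, a slot for the conversation history,
// and the human input. (fromPromptMessages is fromMessages in newer releases.)
const prompt = ChatPromptTemplate.fromPromptMessages([
  SystemMessagePromptTemplate.fromTemplate("You are a helpful assistant."),
  new MessagesPlaceholder("history"),
  HumanMessagePromptTemplate.fromTemplate("{input}"),
]);

// BufferMemory keyed to the "history" placeholder above.
const memory = new BufferMemory({ returnMessages: true, memoryKey: "history" });

const chain = new ConversationChain({
  llm: new ChatOpenAI({ modelName: "gpt-3.5-turbo", temperature: 0 }),
  prompt,
  memory,
});

// Running in an ESM module, so top-level await is available.
// First call: the system and human messages go in here.
const first = await chain.call({ input: "Summarize the following text: ..." });

// Second call on the same chain; I'd expect the full history to be replayed.
const second = await chain.call({ input: "Now shorten that summary." });
```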
Given that this is BufferMemory, I'd expect it to collect all the messages and append them one after the other, including the System and Human messages passed into the first chain.call. Instead, the ChatMessageHistory only contains the previous answer from the AI.
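This is roughly how I'm inspecting what the memory holds after the calls (a sketch assuming the setup above; loadMemoryVariables and chatHistory.getMessages() are the standard BufferMemory accessors, though exact names may differ by version):

```ts
// Dump whatever the memory would inject into the next prompt.
const vars = await memory.loadMemoryVariables({});
console.log(vars.history); // with returnMessages: true this is an array of messages

// Or read the underlying ChatMessageHistory directly.
console.log(await memory.chatHistory.getMessages());
```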
What's the catch? What am I doing wrong? I tried an LLMChain and a ConversationChain, both to no avail.
I also tried adding it to the other prompt (roughly as sketched below), which yielded the same results.
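A sketch of that variant, reusing the imports from above (the placeholder and key names are my own, purely illustrative):

```ts
import { LLMChain } from "langchain/chains";

// Same idea with an explicit LLMChain: the prompt's MessagesPlaceholder name
// has to match the memory's memoryKey for the history to be injected.
const otherPrompt = ChatPromptTemplate.fromPromptMessages([
  SystemMessagePromptTemplate.fromTemplate("You are a helpful assistant."),
  new MessagesPlaceholder("history"),
  HumanMessagePromptTemplate.fromTemplate("{input}"),
]);

const llmChain = new LLMChain({
  llm: new ChatOpenAI({ modelName: "gpt-3.5-turbo", temperature: 0 }),
  prompt: otherPrompt,
  memory: new BufferMemory({ returnMessages: true, memoryKey: "history" }),
});

await llmChain.call({ input: "Summarize the following text: ..." });
```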
The context length shouldn't be the issue; it's roughly 3,000 tokens in total (requests and responses combined) being fed into gpt-3.5-turbo.