conversation with NeptuneOpenCypherQAChain #24974
Unanswered
aradhanachaturvedi asked this question in Q&A
Replies: 1 comment 12 replies
Yes, it is possible to maintain the history of the conversation with NeptuneOpenCypherQAChain. You can attach a `ConversationBufferMemory` to the chain and expose it to the Cypher-generation prompt through a `ReadOnlySharedMemory`, for example:

```python
from langchain.memory import ConversationBufferMemory, ReadOnlySharedMemory
from langchain_core.prompts import PromptTemplate
from langchain_community.chains.graph_qa.cypher import NeptuneOpenCypherQAChain
# FakeLLM and FakeGraphStore are test doubles from the LangChain test
# suite; substitute a real LLM and Neptune graph connection in practice.
from tests.unit_tests.llms.fake_llm import FakeLLM

# Shared memory: the outer chain writes each turn to it, while the
# Cypher-generation prompt only reads from it via ReadOnlySharedMemory.
memory = ConversationBufferMemory(memory_key="chat_history")
readonlymemory = ReadOnlySharedMemory(memory=memory)

prompt = PromptTemplate(
    input_variables=["schema", "question", "chat_history"],
    template="""You are a nice chatbot having a conversation with a human.

Schema:
{schema}

Previous conversation:
{chat_history}

New human question: {question}
Response:""",
)

llm = FakeLLM(queries={})

chain = NeptuneOpenCypherQAChain.from_llm(
    cypher_llm=llm,
    qa_llm=FakeLLM(),
    graph=FakeGraphStore(),
    verbose=True,
    return_intermediate_steps=False,
    cypher_llm_kwargs={"prompt": prompt, "memory": readonlymemory},
    memory=memory,
)

chain.run("Test question")
chain.run("Test new question")
```

This setup ensures that the bot can understand and respond to follow-up questions appropriately by maintaining the conversation history [1][2][3][4].
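To make the mechanism concrete, here is a minimal, dependency-free sketch of what `ConversationBufferMemory` does behind the scenes: it accumulates each (human, AI) turn and exposes the full transcript under `memory_key`, which the prompt template then substitutes into `{chat_history}`. The class and method names below mirror the LangChain interface but this is an illustrative stand-in, not the library's actual implementation.

```python
# Illustrative sketch of a buffer memory: each call to save_context
# appends a turn, and load_memory_variables returns the transcript
# under the configured memory_key (here "chat_history").
class BufferMemorySketch:
    def __init__(self, memory_key="chat_history"):
        self.memory_key = memory_key
        self.turns = []  # list of (role, text) pairs, in order

    def save_context(self, inputs, outputs):
        # Record one human/AI exchange.
        self.turns.append(("Human", inputs["question"]))
        self.turns.append(("AI", outputs["answer"]))

    def load_memory_variables(self):
        # Render the transcript exactly as it would be injected
        # into the {chat_history} slot of the prompt.
        transcript = "\n".join(f"{role}: {text}" for role, text in self.turns)
        return {self.memory_key: transcript}


memory = BufferMemorySketch()
memory.save_context({"question": "Who directed Inception?"},
                    {"answer": "Christopher Nolan"})
print(memory.load_memory_variables()["chat_history"])
# Human: Who directed Inception?
# AI: Christopher Nolan
```

Because the same transcript is prepended on every call, a follow-up question like "What else did he direct?" arrives at the LLM together with the earlier exchange, which is what lets the chain resolve "he".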
Description
Hi, I am trying to use NeptuneOpenCypherQAChain, but we want our bot to be conversational in nature. Is it possible with this QA chain to maintain a history of the conversation?
System Info
langchain==0.1.20
langchain-community==0.0.38
langchain-core==0.1.52
langchain-experimental==0.0.57
langchain-openai==0.1.3
langchain-text-splitters==0.0.1