How to store chat history with query engine? #15146
Unanswered
d3buggerdan
asked this question in
Q&A
Replies: 2 comments 11 replies
-
To store chat history with your query engine setup, you can use the CondenseQuestionChatEngine together with a ChatMemoryBuffer. Here's an example of how you can achieve this:

from llama_index.legacy.core.llms.types import ChatMessage, MessageRole
from llama_index.legacy.memory import ChatMemoryBuffer
from llama_index.legacy.chat_engine import CondenseQuestionChatEngine
from llama_index.legacy.prompts import PromptTemplate

# A minimal condense prompt; the chat engine also ships with a stock default
DEFAULT_TEMPLATE = (
    "Given the following conversation and a follow-up question, "
    "rephrase the follow-up into a standalone question.\n\n"
    "Chat history:\n{chat_history}\n"
    "Follow-up question: {question}\n"
    "Standalone question:"
)
# Initialize the memory buffer
memory_buffer = ChatMemoryBuffer.from_defaults(token_limit=3000)

# Plug in your real components here; these are placeholders
query_engine = ...  # your actual query engine (a BaseQueryEngine subclass)
llm = ...           # your actual LLM

# Initialize the CondenseQuestionChatEngine
chat_engine = CondenseQuestionChatEngine.from_defaults(
    query_engine=query_engine,
    condense_question_prompt=PromptTemplate(DEFAULT_TEMPLATE),
    memory=memory_buffer,
    llm=llm,
    verbose=True,
)
# Example usage
user_message = "What is the weather like today?"
response = chat_engine.chat(user_message)

# Retrieve the chat history
chat_history = chat_engine.chat_history
for message in chat_history:
    print(f"{message.role}: {message.content}")

In this example, the ChatMemoryBuffer records each user/assistant turn, and the chat engine condenses every follow-up into a standalone question before passing it to your query engine. This setup ensures that the conversation history is saved and can be referenced for maintaining the state of the chat [1].
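To make the mechanism concrete, here is a minimal, framework-free sketch of the same condense-question pattern. Everything in it is hypothetical illustration, not LlamaIndex API: query_fn stands in for your real query_engine.query, and the "condensing" step just prepends prior turns as plain text where LlamaIndex would use an LLM rewrite.

```python
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class ChatMessage:
    role: str
    content: str


@dataclass
class CondenseChatSketch:
    """Framework-free sketch of the condense-question chat pattern."""

    query_fn: Callable[[str], str]  # stand-in for query_engine.query
    history: List[ChatMessage] = field(default_factory=list)

    def _condense(self, message: str) -> str:
        # LlamaIndex uses an LLM to rewrite the question; here we just
        # prepend the prior turns as plain-text context.
        context = " ".join(f"{m.role}: {m.content}" for m in self.history)
        return f"{context} user: {message}" if context else message

    def chat(self, message: str) -> str:
        # Query with history folded in, then record both sides of the turn.
        answer = self.query_fn(self._condense(message))
        self.history.append(ChatMessage("user", message))
        self.history.append(ChatMessage("assistant", answer))
        return answer
```

The key design point is the same as in the real engine: the query engine itself stays stateless, and all conversational state lives in the memory object wrapped around it.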
11 replies
-
Hi, did you have any luck resolving this problem?
0 replies
-
@dosu
I have the following query engine:

# imports assumed for llama_index >= 0.10
from llama_index.core import StorageContext, VectorStoreIndex, Settings, get_response_synthesizer
from llama_index.core.vector_stores.types import VectorStoreQueryMode
from llama_index.core.query_engine import RetrieverQueryEngine

storage_context = StorageContext.from_defaults(vector_store=vector_store)
Settings.llm = llm
Settings.embed_model = embed_model
index = VectorStoreIndex.from_documents(
    [], storage_context=storage_context
)

default_retriever = index.as_retriever(
    vector_store_query_mode=VectorStoreQueryMode.HYBRID,
    similarity_top_k=10,
)

response_synthesizer = get_response_synthesizer(
    response_mode="tree_summarize",
    streaming=True,
)

query_engine = RetrieverQueryEngine(
    retriever=default_retriever,
    response_synthesizer=response_synthesizer,
)
response = query_engine.query(user_question)  # user_question is the incoming message

and I want to save the conversation history similar to how the chat engine works. So, for example, I can ask a question, it will be saved to memory and referenced to maintain the state of the chat.
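One way to get the behavior being asked about, sketched without any LlamaIndex API: wrap the query callable in a small class that keeps a chat history and persists it to disk between runs. MemoryQueryEngine and chat_history.json are made-up names for illustration; query_fn stands in for the real query_engine.query above.

```python
import json
from pathlib import Path


class MemoryQueryEngine:
    """Hypothetical wrapper: adds a persistent chat history to any
    query callable (e.g. query_engine.query). A sketch, not LlamaIndex API."""

    def __init__(self, query_fn, path="chat_history.json"):
        self.query_fn = query_fn
        self.path = Path(path)
        # Reload any history saved by a previous session
        self.history = (
            json.loads(self.path.read_text()) if self.path.exists() else []
        )

    def query(self, question):
        # Prepend prior turns so the engine can resolve follow-up questions
        context = "\n".join(
            f"{t['role']}: {t['content']}" for t in self.history
        )
        prompt = f"{context}\nuser: {question}" if context else question
        answer = str(self.query_fn(prompt))
        # Record both sides of the turn and persist to disk
        self.history += [
            {"role": "user", "content": question},
            {"role": "assistant", "content": answer},
        ]
        self.path.write_text(json.dumps(self.history, indent=2))
        return answer
```

Usage would look like wrapped = MemoryQueryEngine(query_engine.query), then calling wrapped.query(...) per turn; the alternative shown in the accepted answer, wrapping the query engine in CondenseQuestionChatEngine, gives you the same effect with an LLM-based question rewrite instead of naive prompt concatenation.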