Replies: 1 comment 1 reply
-
So a workaround I have been trying out is to use Python's `time` module to get the current time and concatenate it into the prompt:

```python
import time

current_timestamp = time.time()
formatted_timestamp = time.strftime("%Y-%m-%d %H:%M:%S", time.localtime(current_timestamp))

prompt_template = """
Have a conversation with a human, answering the following questions as best you can.
This is the CURRENT DATE AND TIME (which you otherwise don't know): """ + formatted_timestamp + """
# rest of the prompt ...
{chat_history}
# rest of the prompt ...
{question}
"""
```

So I am just using simple concatenation as a quick and easy workaround to tell the bot the current date and time.
-
Hi!
I'm working with `ConversationalRetrievalChain` and trying to make the LLM aware of the current date and time, so that when it is fed documents retrieved from the vector database it knows whether the events they describe are happening now, in the past, or in the future.
The chat is constructed like this:

```python
qa = ConversationalRetrievalChain.from_llm(
    ChatOpenAI(model_name=CHAT_GPT_MODEL, temperature=0),
    vector_store.as_retriever(search_kwargs={"k": 14}),
    combine_docs_chain_kwargs={"prompt": prompt_template},
    return_source_documents=True,
)
```
And the questions are then sent like this:

```python
response = qa({"question": user_message, "chat_history": chat_history})
```
What is the best strategy to add the current date and time to every question sent to the LLM? Is there a way to update the prompt template with the current date and time dynamically between questions, without having to create a new `qa` object every time? Or is there some other way to sneak the date and time in to the LLM, for example by putting it in the context?
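One lightweight option that avoids rebuilding the `qa` object (a sketch, not tested against any particular LangChain version; `with_timestamp` is a hypothetical helper name) is to prepend the timestamp to the question string itself on every call, since the question is already passed fresh each time:

```python
import time


def with_timestamp(user_message: str) -> str:
    """Prefix the user's question with the current date and time."""
    now = time.strftime("%Y-%m-%d %H:%M:%S", time.localtime())
    return f"Current date and time: {now}\n\n{user_message}"


# Each call keeps the same shape as before; only the question text changes:
# response = qa({"question": with_timestamp(user_message), "chat_history": chat_history})
```

The trade-off is that the timestamp ends up in `chat_history` along with the question, which slightly inflates the history but keeps the prompt template itself static.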