custom prompt template with a retriever pipeline and agent with memory #5866
Closed
HGamalElDin started this conversation in General
Replies: 1 comment 2 replies
-
Hey @HGamalElDin, you basically want a conversational chat agent that uses a RAG pipeline, right? If so, I think this is possible if you use the RAG pipeline as a conversational agent tool. Let me check whether there are tutorials covering exactly this use case.
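A minimal, framework-agnostic sketch of that pattern (a RAG pipeline exposed as a tool to a conversational agent that keeps memory). All class and function names below are illustrative, not Haystack's actual API:

```python
# Illustrative sketch: RAG pipeline as a conversational-agent tool.
# Names (retrieve, rag_tool, ConversationalAgent) are hypothetical.

DOCS = [
    "Paris is the capital of France.",
    "During Ramadan, working hours are 10:30 AM to 4:00 PM, Sunday to Thursday.",
]

def retrieve(query, top_k=1):
    """Toy retriever: rank documents by word overlap with the query."""
    overlap = lambda d: len(set(query.lower().split()) & set(d.lower().split()))
    return sorted(DOCS, key=overlap, reverse=True)[:top_k]

def rag_tool(query):
    """The 'RAG pipeline as a tool': retrieve context, then answer from it.
    A real pipeline would feed `context` and `query` into an LLM prompt."""
    context = retrieve(query)
    return context[0]

class ConversationalAgent:
    """Routes every user turn through the RAG tool and records memory."""
    def __init__(self, tool):
        self.tool = tool
        self.memory = []  # list of (speaker, text) turns

    def run(self, user_input):
        answer = self.tool(user_input)
        # Store the *resolved* user text, not a template placeholder.
        self.memory.append(("Human", user_input))
        self.memory.append(("AI", answer))
        return answer

agent = ConversationalAgent(rag_tool)
print(agent.run("What is the capital of France?"))
# -> Paris is the capital of France.
```

In the real framework the tool would wrap the full retriever + prompt-node pipeline; the point of the sketch is only the wiring: the agent owns the memory, the tool owns retrieval and answering.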
-
Hello!
Can you confirm whether I can implement the RAG pipeline with chat memory without using agents?
Some notes that might clarify my case: here I have two variables.
Question 1: how do I provide the documents variable to the agent you described in your comment?
I tried the following snippet; however, the default memory injected into the agent stores the literal string "input" for every Human turn instead of my actual question.
When I call
conversational_agent.memory.load()
I get """Human: input\nAI: Paris is the capital of France.\nHuman: input\nAI: During Ramadan, working hours are 10:30 AM to 4:00 PM, Sunday to Thursday.\nHuman: input\nAI: France's capital is Paris.\n"""
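The transcript above suggests the memory received the placeholder string "input" (the template variable's name) rather than the resolved user query. A minimal sketch of that failure mode, assuming a memory that saves a dict of turn data keyed by an input key (all names hypothetical):

```python
class ConversationMemory:
    """Toy memory that joins turns into the transcript format shown above."""
    def __init__(self, input_key="input", output_key="output"):
        self.input_key = input_key
        self.output_key = output_key
        self.turns = []

    def save(self, data):
        self.turns.append((data[self.input_key], data[self.output_key]))

    def load(self):
        return "".join(f"Human: {q}\nAI: {a}\n" for q, a in self.turns)

memory = ConversationMemory()

# Buggy call: the placeholder string is passed instead of the resolved query,
# producing the "Human: input" turns seen in the transcript.
memory.save({"input": "input", "output": "Paris is the capital of France."})

# Correct call: the resolved user query reaches the memory.
memory.save({"input": "What is the capital of France?",
             "output": "France's capital is Paris."})

print(memory.load())
```

So the thing to check is which value is bound to the memory's input key when the agent runs: the raw template variable or the user's actual question.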
Also, the prompt template I provided should constrain the responses: no answer should be generated outside the given context, and otherwise a specific fallback message should be returned. This works well when I use only the prompt node, but with the agent it's as if the prompt template is ignored. What could be the cause of this?
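For reference, a sketch of such a context-grounded template with a fixed fallback message (the template text and function names are illustrative, not a specific PromptNode API):

```python
# Sketch of a context-grounded prompt template with a fixed fallback answer.
# TEMPLATE and build_prompt are illustrative names, not a real library API.

TEMPLATE = (
    "Answer ONLY from the context below. If the answer is not in the context, "
    "reply exactly: 'Sorry, I can only answer from the provided documents.'\n"
    "Context: {documents}\nQuestion: {query}\nAnswer:"
)

FALLBACK = "Sorry, I can only answer from the provided documents."

def build_prompt(documents, query):
    """Fill both template variables (documents, query) before the LLM call."""
    return TEMPLATE.format(documents=" ".join(documents), query=query)

prompt = build_prompt(["Paris is the capital of France."],
                      "What is the capital of France?")
assert "{documents}" not in prompt  # both variables must be resolved
```

One plausible cause of the template being ignored: when a pipeline is wrapped as an agent tool, the agent drives the final answer with its own prompt, so a grounding template applied only at the agent level may never see the documents. Keeping the grounding template inside the tool's prompt node (where the documents variable is resolved) is worth trying.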