How do I override PromptTemplate for ConversationalRetrievalChain.from_llm method? #6345
-
I am trying to create a support chatbot with our indexed knowledge base stored as embeddings in Pinecone. The chain is functional when I don't override the prompt template, but I want better control over its responses, so I want to override it.
The first time I run the code, it correctly returns a summarized answer from the indexed knowledge-base documents. On the second run, however, it fails because the `context` input key is null. I assumed the Pinecone retriever was setting the `context` variable from the documents most similar to the user's question; that appears to happen the first time, but not the second. Any ideas? The following is the error and the output from the first successful run:
Please ask a question: When was Userpilot founded?
Please ask a question: What is Userpilot used for?
Replies: 4 comments 2 replies
-
After further digging into this, it looks like even the successful run isn't using my custom prompt template:
This also doesn't look like the default template, so does anyone know where this is coming from?
-
Thanks for this question; I would also like to know how to do this! I'm waiting for an answer too, so please tell me if you solved it.
-
@hodgesz Let me save you some headache :)
Here you are setting `condense_question_prompt`, which is used to generate a standalone question from the previous conversation history. What you want to do instead is:
-
@ShantanuNair I really appreciate your help! I didn't understand what was happening, but your response makes perfect sense now that I see it.