chat keep repeating #9652
Replies: 2 comments
-
🤖 Hi Ken,

Based on the information you've provided, it seems like you're trying to modify the prompt used in the ConversationalRetrievalChain. If you want to change this prompt, you can pass it in via `ConversationalRetrievalChain.from_llm(combine_docs_chain_kwargs={"prompt": your_prompt})`.

Alternatively, you can use `load_qa_chain`:

```python
from langchain.chains.question_answering import load_qa_chain
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory
from langchain.prompts import PromptTemplate

# Replace {Your_Prompt} below with your own instruction text.
template = """
{Your_Prompt}

CONTEXT:
{context}

QUESTION:
{query}

CHAT HISTORY:
{chat_history}

ANSWER:
"""

prompt = PromptTemplate(input_variables=["chat_history", "query", "context"], template=template)
memory = ConversationBufferMemory(memory_key="chat_history", input_key="query")
chain = load_qa_chain(ChatOpenAI(temperature=0), chain_type="stuff", memory=memory, prompt=prompt)
```

In this example, the custom template (with `context`, `query`, and `chat_history` input variables) is passed directly to `load_qa_chain`.

These solutions were suggested in a similar issue in the LangChain repository: Issue: Changing Prompt (from Default) when Using ConversationalRetrievalChain?

If you're still encountering issues after trying these solutions, could you please provide more details about the problem? Specifically, it would be helpful to know the exact error message you're seeing and the steps you're taking that lead to it.

Best,
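To see what the final "stuffed" prompt handed to the model looks like, the template's variable substitution can be sketched with plain Python string formatting. This is only an illustration with hypothetical placeholder values, not actual LangChain output, and no LangChain installation is required:

```python
# Sketch of how the template's input variables get filled in.
# All values below are hypothetical placeholders.
template = """{your_prompt}

CONTEXT:
{context}

QUESTION:
{query}

CHAT HISTORY:
{chat_history}

ANSWER:
"""

filled = template.format(
    your_prompt="Answer using only the context below.",
    context="LangChain supports custom prompts via combine_docs_chain_kwargs.",
    query="How do I change the default prompt?",
    chat_history="",
)

print(filled)
```

The chain's "stuff" document combiner does essentially this: it substitutes the retrieved documents into `context`, the user input into `query`, and the memory buffer into `chat_history`.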
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
-
Thanks Dosu, it works now!
-
Hi,

I'm trying to use the Hugging Face model TheBloke/OpenOrca-Platypus2-13B-GGML. I managed to create a Chroma database for external knowledge ingestion, and I use ConversationalRetrievalChain. It works, but I often encounter the model repeating its own questions and answers.

I suspect the prompt may be incorrect. The model's template is:

```
Instruction:
{prompt}
Response:
```

I've tried different combinations of prompt templates, but I still have the same problem. One of the prompt templates I use is:

```python
prompt_template = """Answer based on context
{chat_history}
Question: {question}"""
```
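One thing worth checking with repetition issues is whether the QA template actually ends up inside the Instruction/Response framing the model was trained on. Here is a minimal plain-Python sketch of combining the two; the helper name and sample values are hypothetical, purely for illustration:

```python
# Hypothetical sketch: nest the QA template inside the model's
# Instruction/Response framing so the final string matches what the
# model expects. Names and values here are illustrative only.
MODEL_TEMPLATE = "Instruction:\n{prompt}\nResponse:\n"

QA_TEMPLATE = """Answer based on context
{chat_history}
Question: {question}"""

def build_prompt(chat_history: str, question: str) -> str:
    inner = QA_TEMPLATE.format(chat_history=chat_history, question=question)
    return MODEL_TEMPLATE.format(prompt=inner)

final_prompt = build_prompt(
    chat_history="Human: hello\nAI: hi there",
    question="What is in the knowledge base?",
)
print(final_prompt)
```

If the chain sends only the bare QA template, the model never sees the `Response:` marker it was fine-tuned to stop on, which is one common cause of it continuing with invented follow-up questions and answers.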
Can anyone help?

Regards,
Ken