@hodgesz Let me save you some headache :)

qa = ConversationalRetrievalChain.from_llm(
    llm=llm,
    chain_type="stuff",
    retriever=doc_db.as_retriever(),
    memory=memory,
    verbose=True,
    condense_question_prompt=prompt,
    max_tokens_limit=4097,
)

Here you are setting condense_question_prompt, which is only used to rewrite the follow-up question into a standalone question based on the previous conversation history. It has no effect on how the final answer is generated, which is why your custom prompt isn't doing what you expect.
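For reference, you can inspect the default standalone-question prompt that from_llm falls back to when condense_question_prompt is not overridden (a quick sketch, assuming the classic langchain package layout):

# Print the default condense prompt shipped with LangChain.
from langchain.chains.conversational_retrieval.prompts import CONDENSE_QUESTION_PROMPT

print(CONDENSE_QUESTION_PROMPT.template)
# Roughly: "Given the following conversation and a follow up question,
# rephrase the follow up question to be a standalone question ..."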

What you want to do is:

from langchain.chains import ConversationalRetrievalChain

qa = ConversationalRetrievalChain.from_llm(
    llm=llm,
    chain_type="stuff",
    retriever=doc_db.as_retriever(),
    memory=memory,
    verbose=True,
    # condense_question_prompt=prompt,  # drop this; the default prompt already condenses the chat history
    max_tokens_limit=4097,
)
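
If what you actually wanted was to customize the prompt that answers the question (rather than the one that condenses the chat history), that goes through combine_docs_chain_kwargs instead. A minimal sketch, assuming llm, doc_db, and memory are set up as in your code, and that the qa_prompt template below is a hypothetical placeholder for your own:

from langchain.chains import ConversationalRetrievalChain
from langchain.prompts import PromptTemplate

# Hypothetical answering prompt for the "stuff" chain; {context} and
# {question} are the input variables the stuff chain fills in.
qa_prompt = PromptTemplate(
    input_variables=["context", "question"],
    template=(
        "Use the following context to answer the question.\n\n"
        "{context}\n\nQuestion: {question}\nAnswer:"
    ),
)

qa = ConversationalRetrievalChain.from_llm(
    llm=llm,
    chain_type="stuff",
    retriever=doc_db.as_retriever(),
    memory=memory,
    verbose=True,
    # The answering prompt belongs to the combine-docs step, not to
    # condense_question_prompt (which only rewrites the follow-up question).
    combine_docs_chain_kwargs={"prompt": qa_prompt},
    max_tokens_limit=4097,
)

# With memory attached, chat_history is supplied automatically.
result = qa({"question": "What does the document say about X?"})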
