Replies: 2 comments 2 replies
-
🤖 Hello! To customize different prompts for different parts of the map-reduce chain in the `RetrievalQA` chain, you can create a separate `PromptTemplate` for each part:

```python
from langchain.prompts import PromptTemplate

# Original prompt
prompt_template = """Use the following pieces of context to answer the question at the end. If you don't know the answer, just say that you don't know, don't try to make up an answer.
{context}
Question: {question}
Helpful Answer:"""
PROMPT = PromptTemplate(
    template=prompt_template, input_variables=["context", "question"]
)

# Customized prompt for a specific part of the map-reduce chain
custom_prompt_template = """Use the following pieces of context to answer the question at the end. If you don't know the answer, just say that you don't know, don't try to make up an answer.
{context}
Additional Context: {additional_context}
Question: {question}
Helpful Answer:"""
CUSTOM_PROMPT = PromptTemplate(
    template=custom_prompt_template, input_variables=["context", "additional_context", "question"]
)
```

In this example, `CUSTOM_PROMPT` extends the original template with an extra `additional_context` input variable.

To pass a custom prompt to the `RetrievalQA` chain, here's an example of how you can create one and hand it to the chain:

```python
from langchain.llms import OpenAI
from langchain.chains import RetrievalQA
from langchain.prompts import PromptTemplate

custom_prompt_template = """Here is some context:
{context}
Please answer the following question: {question}
"""
CUSTOM_PROMPT = PromptTemplate(
    template=custom_prompt_template, input_variables=["context", "question"]
)
retrievalQA = RetrievalQA.from_llm(llm=OpenAI(), prompt=CUSTOM_PROMPT)
```

In this example, the custom prompt is passed to `RetrievalQA.from_llm` via the `prompt` argument. I hope this helps! Let me know if you have any other questions.
-
Probably not the cleanest way, but I used the following workaround to add customization to the `RetrievalQA` chain with map-reduce:
-
I'm trying to figure out how to pass custom instructions while using pre-built abstractions such as `RetrievalQA`. More specifically, I want the map-reduce or refine chain to analyze the documents while keeping some context in mind. I know this should use `chain_type_kwargs` along with `PromptTemplate.from_template(template)`, but I'm unsure how to incorporate different custom prompts for different parts of the map-reduce chain. For example, I want to change
"Use the following portion of a long document to see if any of the text is relevant to answer the question.
Return any relevant text verbatim."
by appending some context about the documents. Then, I want to change
"Given the following extracted parts of a long document and a question, create a final answer.
If you don't know the answer, just say that you don't know. Don't try to make up an answer."
by appending a different variation of that context.