Replies: 1 comment
Answer generated by a 🤖

Hello Perryn,

Thank you for reaching out with your questions. I understand that you're having issues with the MapReduce chain in LangChain, specifically with it getting overwhelmed by repeated 'no relevant info' sentences and not correctly identifying the relevant answer. You're also looking for a way to override the prompt when using a `ConversationalRetrievalChain`.

To address your first question: it seems like this could be a common problem with the current implementation of MapReduce. However, without more information about your specific use case, it's hard to say for sure. I did find a similar issue in the LangChain repository titled "Good summaries were ignored", but it was closed without any comments, so it's unclear whether it was ever resolved. To prevent the MapReduce chain from getting overwhelmed by repeated 'no relevant info' sentences, you could modify the prompt used in the map step so that chunks with no relevant information produce an empty string rather than a 'no relevant information' sentence.

As for your second question, to override the prompt in a `MapReduceChain` you can pass your own prompt to the `from_params` method. Here is the modified `from_params`:

```python
@classmethod
def from_params(
    cls,
    llm: BaseLanguageModel,
    prompt: BasePromptTemplate,  # Pass your new prompt here
    text_splitter: TextSplitter,
    callbacks: Callbacks = None,
    combine_chain_kwargs: Optional[Mapping[str, Any]] = None,
    reduce_chain_kwargs: Optional[Mapping[str, Any]] = None,
    **kwargs: Any,
) -> MapReduceChain:
    """Construct a map-reduce chain that uses the chain for map and reduce."""
    llm_chain = LLMChain(llm=llm, prompt=prompt, callbacks=callbacks)  # The prompt is used here
    # ... rest of the code
```
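If you are building the chain through `ConversationalRetrievalChain` rather than constructing a `MapReduceChain` yourself, you may not need to modify the library code at all. Assuming the legacy chain API, `from_llm` accepts a `combine_docs_chain_kwargs` dictionary that is forwarded to `load_qa_chain`, so a custom `question_prompt` for the map step can be passed through it. A rough sketch, where `llm` and `retriever` are placeholders for your own chat model and retriever:

```python
from langchain.chains import ConversationalRetrievalChain
from langchain.prompts import PromptTemplate

# Map-step prompt: return an empty string instead of a "no relevant
# information" sentence when a chunk does not help answer the question.
question_prompt = PromptTemplate(
    input_variables=["context", "question"],
    template=(
        "Use the following portion of a long document to see whether any of "
        "the text is relevant to the question. Return any relevant text "
        "verbatim. If there is no relevant information, return an empty "
        "string.\n\n{context}\n\nQuestion: {question}\nRelevant text, if any:"
    ),
)

# `llm` and `retriever` below are placeholders for your own chat model and
# vector store retriever (not defined in this sketch).
qa_chain = ConversationalRetrievalChain.from_llm(
    llm=llm,
    retriever=retriever,
    chain_type="map_reduce",
    combine_docs_chain_kwargs={"question_prompt": question_prompt},
)
```

The reduce-step prompt can be overridden the same way by adding a `combine_prompt` entry to that dictionary.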
Please note that the provided context does not include the rest of the `from_params` implementation, so you may need to adapt the remaining chain construction yourself.

I hope this helps! If you have any further questions or need more clarification, please don't hesitate to ask.

Best,

This response is meant to be useful, save you time, and share context. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.
Greetings,
I am trying to do RAG, using Map Reduce over the retrieved content to present a final question to ChatGPT.
For some questions, my retrieval evidently finds the right answer but also retrieves some irrelevant stuff, and the final question looks like this:
"prompts": [ "System: Given the following extracted parts of a long document and a question, create a final answer. \nIf you don't know the answer, just say that you don't know. Don't try to make up an answer.\n______________________\nUDJT9JTPV: Team avocado is currently working on modernized onboarding via QR codes that will be generated in the app.\n\nThere is no relevant information in the given portion of the document to answer the question about what team avocado is working on.\n\nThere is no mention of \"team avocado\" or their current projects in the provided conversation.\n\nThere is no information in the given portion of the document about what team avocado is working on.\n\nThere is no relevant information in the given portion of the document to answer the question about what team avocado is working on.\nHuman: <@U05KFTB89PC> what is team avocado working on?" ]
You will note that the answer is in there, but ChatGPT seems to get overwhelmed by the repeated 'there is no relevant info' sentences and responds with:

> I'm sorry, but there is no relevant information in the given portion of the document to answer the question about what team avocado is working on.
I have two questions:
1. This seems like it would be a common problem with the current implementation of MapReduce? Do other people have this issue, or am I doing something wrong?
2. One tack I'm trying to address this is by adjusting this prompt (linked below) to include something like 'if there is no relevant information then reply with an empty string', but I can't seem to figure out how to override it when I am using a ConversationalRetrievalChain with a chain_type of 'map_reduce' (a rough sketch of my setup follows the link):
langchain/libs/langchain/langchain/chains/qa_with_sources/map_reduce_prompt.py
Line 4 in 3f64b8a
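For context, this is roughly how I'm constructing the chain at the moment (simplified; `llm` and `retriever` stand in for my actual chat model and vector store retriever):

```python
from langchain.chains import ConversationalRetrievalChain

# Simplified version of my setup; `llm` and `retriever` are placeholders for
# my actual chat model and vector store retriever.
chain = ConversationalRetrievalChain.from_llm(
    llm=llm,
    retriever=retriever,
    chain_type="map_reduce",
)
```

I don't see an obvious place in that call to hand in a replacement for the map-step prompt.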
Thanks for any advice
perryn