RunnableWithMessageHistory to output source documents in RAG #20041
Replies: 3 comments 3 replies
-
Hey @alexmondaini, great to see you diving deeper into LangChain! How's everything going on your end? To modify your implementation so that the source document is returned alongside the answer when using `RunnableWithMessageHistory`, here's a strategy to achieve this:
Here is an illustrative example based on the provided context:

```python
from typing import Any, Optional

from langchain_core.runnables import Runnable, RunnableConfig
from langchain_core.runnables.history import RunnableWithMessageHistory


# Step 1: Modify the wrapped Runnable to return a dictionary with both the answer
# and the source document
class CustomRunnable(Runnable):
    def invoke(self, input: Any, config: Optional[RunnableConfig] = None) -> dict:
        # Your logic here to generate the answer and source document
        answer = "This is the answer."
        source_document = "This is the source document."
        return {"answer": answer, "source_document": source_document}


# Initialize the CustomRunnable
custom_runnable = CustomRunnable()

# Step 2: Wrap the CustomRunnable with RunnableWithMessageHistory.
# Specify 'answer' as the output_messages_key since it contains the answer part of the output.
runnable_with_history = RunnableWithMessageHistory(
    custom_runnable,
    get_session_history=get_by_session_id,  # assuming get_by_session_id is defined as in the context
    output_messages_key="answer",  # key in the output dict that contains the answer
)

# Invoke the runnable_with_history and access both the answer and the source document
output = runnable_with_history.invoke(input_data, config)
answer = output["answer"]  # this is what _get_output_messages extracts and stores in the history
source_document = output["source_document"]  # directly access the source document from the output
```

In this example, `output_messages_key="answer"` tells `RunnableWithMessageHistory` which part of the output dictionary to store in the chat history; the rest of the dictionary, including `source_document`, is still returned to the caller. For more details, you can refer to the RunnableWithMessageHistory class in the LangChain codebase.
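The example above assumes a `get_by_session_id` helper is already defined. A minimal sketch of such a helper, keeping per-session histories in an in-process dict (the store, session id, and sample question are illustrative, not from the original answer):

```python
from langchain_community.chat_message_histories import ChatMessageHistory
from langchain_core.chat_history import BaseChatMessageHistory

# In-memory store of chat histories keyed by session id (fine for demos, not production).
store: dict[str, BaseChatMessageHistory] = {}


def get_by_session_id(session_id: str) -> BaseChatMessageHistory:
    # Lazily create a history object for each new session.
    if session_id not in store:
        store[session_id] = ChatMessageHistory()
    return store[session_id]


# The session id is supplied at call time through the config:
output = runnable_with_history.invoke(
    "What does the report say about Q3?",
    config={"configurable": {"session_id": "user-42"}},
)
```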
-
@alexmondaini Were you able to generate sources? I don't think the above implementation from dosubot helps.
-
Hi @DhavalThkkar, I was able to generate sources by concatenating strings from the dictionary lookup entries, as in:
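A minimal sketch of what that concatenation could look like, assuming the chain's output dict carries the retrieved Document objects under a `"context"` key and the answer under `"answer"` (these key names are an assumption, not taken from the reply above):

```python
def answer_with_sources(output: dict) -> str:
    # Join the "source" metadata of each retrieved document into a bulleted list.
    sources = "\n".join(
        f"- {doc.metadata.get('source', 'unknown')}" for doc in output.get("context", [])
    )
    # Append the list to the answer; return the answer unchanged if nothing was retrieved.
    return f"{output['answer']}\n\nSources:\n{sources}" if sources else output["answer"]
```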
However, the only catch here is that this will also return documents for a greeting message in a chatbot, which looks odd when the user just says hello.
-
Checked other resources
Commit to Help
Example Code
Description
I would like to output the source document as part of the answer when using RunnableWithMessageHistory. From the class signature I noticed only one key, `output_messages_key`, can be defined, so how would I be able to also return the source document in this case?
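For concreteness, a rough sketch of the kind of setup in question, using `create_retrieval_chain` so the output dict carries both `"answer"` and the retrieved `"context"` documents; the sample text, model, and session-store helper are illustrative, not the actual example code from this question:

```python
from langchain.chains import create_retrieval_chain
from langchain.chains.combine_documents import create_stuff_documents_chain
from langchain_community.chat_message_histories import ChatMessageHistory
from langchain_community.vectorstores import FAISS
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_openai import ChatOpenAI, OpenAIEmbeddings

# Toy retriever over a single sample document.
retriever = FAISS.from_texts(
    ["Refunds are issued within 30 days of purchase."], OpenAIEmbeddings()
).as_retriever()

prompt = ChatPromptTemplate.from_messages([
    ("system", "Answer using the following context:\n\n{context}"),
    MessagesPlaceholder(variable_name="chat_history"),
    ("human", "{input}"),
])
rag_chain = create_retrieval_chain(
    retriever, create_stuff_documents_chain(ChatOpenAI(), prompt)
)

store = {}  # per-session chat histories, kept in memory

chain_with_history = RunnableWithMessageHistory(
    rag_chain,
    lambda session_id: store.setdefault(session_id, ChatMessageHistory()),
    input_messages_key="input",
    history_messages_key="chat_history",
    output_messages_key="answer",  # only this key is written to the chat history
)

result = chain_with_history.invoke(
    {"input": "What does the policy say about refunds?"},
    config={"configurable": {"session_id": "abc"}},
)
result["answer"]   # the generated answer (stored in history)
result["context"]  # the retrieved source Documents, still present in the output
```

Here `output_messages_key` only controls what gets written to the chat history; the full output dict, `context` included, is still what `invoke` returns.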
System Info
langchain @ file:///home/conda/feedstock_root/build_artifacts/langchain_1711012871747/work
langchain-cli==0.0.21
langchain-community @ file:///home/conda/feedstock_root/build_artifacts/langchain-community_1711008261310/work
langchain-core @ file:///home/conda/feedstock_root/build_artifacts/langchain-core_1711659762823/work
langchain-openai @ file:///home/conda/feedstock_root/build_artifacts/langchain-openai_1710403027641/work
langchain-text-splitters @ file:///home/conda/feedstock_root/build_artifacts/langchain-text-splitters_1709389732771/work
langchainhub==0.1.15