Hey there, @DaX-523! 👋 I'm here to help you with bugs, answer questions, and even assist you in becoming a contributor. Let's tackle this issue together!

The issue you're encountering is related to how the `context` value is produced inside your `ragChain`. Here's a detailed explanation of what's happening and how to fix it:

### Explanation

The error `Missing value for input variable context` means that `qaPrompt` was formatted before `context` had been resolved to a string. The tutorial's version of the `context` function returns another chain and relies on LCEL to invoke it; if that resolution does not produce a string by the time the prompt is formatted, the variable is reported as missing. The fix below invokes the retrieval chain explicitly and `await`s its result.
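Here is a minimal, self-contained sketch of the LCEL behavior involved (the `route` and `shout` names are purely illustrative, not from the tutorial): when a function step in an LCEL chain returns another runnable, that runnable is itself invoked with the same input.

```js
import { RunnableLambda } from "@langchain/core/runnables";

// A plain runnable step.
const shout = RunnableLambda.from((s) => s.toUpperCase());

// A function step that *returns a runnable*: LCEL invokes the returned
// runnable with the same input instead of passing the runnable object along.
const route = RunnableLambda.from((s) => (s.length > 3 ? shout : s));

const demo = async () => {
  console.log(await route.invoke("hello")); // "HELLO" (shout was invoked)
  console.log(await route.invoke("hi"));    // "hi" (plain value passed through)
};
demo();
```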
### Solution

To fix the issue, ensure that the `context` function resolves to a string by invoking the chain explicitly and awaiting the result:

```js
const ragChain = RunnableSequence.from([
  RunnablePassthrough.assign({
    context: async (input) => {
      if ("chat_history" in input) {
        const chain = contextualizedQuestion(input);
        // Invoke the chain and await the formatted document string, rather
        // than returning the chain itself.
        const context = await chain
          .pipe(retriever)
          .pipe(formatDocumentsAsString)
          .invoke(input);
        return context;
      }
      return "";
    },
  }),
  qaPrompt,
  llm,
]);
```

### Key Changes

- The `context` function now invokes the condensed-question chain itself and `await`s the formatted document string, instead of returning the chain and relying on LCEL to invoke it. This guarantees that `context` is a plain string by the time `qaPrompt` is formatted.
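For comparison, here is the change in isolation (the surrounding `RunnablePassthrough.assign` stays the same):

```js
// Before (tutorial): return the chain and rely on LCEL to invoke it.
return chain.pipe(retriever).pipe(formatDocumentsAsString);

// After (fix): invoke the chain yourself and return the resolved string.
return await chain.pipe(retriever).pipe(formatDocumentsAsString).invoke(input);
```

Both variants are meant to produce the same result, since LCEL invokes returned runnables; awaiting explicitly just removes any ambiguity about when the value becomes a string.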
### Full Example

Here's the full corrected example:

```js
import "cheerio";
import { CheerioWebBaseLoader } from "@langchain/community/document_loaders/web/cheerio";
import { RecursiveCharacterTextSplitter } from "langchain/text_splitter";
import { MemoryVectorStore } from "langchain/vectorstores/memory";
import { CohereEmbeddings, ChatCohere } from "@langchain/cohere";
import { formatDocumentsAsString } from "langchain/util/document";
import {
  ChatPromptTemplate,
  MessagesPlaceholder,
} from "@langchain/core/prompts";
import {
  RunnableSequence,
  RunnablePassthrough,
} from "@langchain/core/runnables";
import { StringOutputParser } from "@langchain/core/output_parsers";
import { HumanMessage } from "@langchain/core/messages";
const loader = new CheerioWebBaseLoader(
  "https://lilianweng.github.io/posts/2023-06-23-agent/"
);
const main = async () => {
  const docs = await loader.load();
  const textSplitter = new RecursiveCharacterTextSplitter({
    chunkSize: 1000,
    chunkOverlap: 200,
  });
  const splits = await textSplitter.splitDocuments(docs);
  const store = await MemoryVectorStore.fromDocuments(
    splits,
    new CohereEmbeddings({
      apiKey: process.env.COHERE_API_KEY,
      model: "embed-english-v2.0",
    })
  );
  const retriever = store.asRetriever();
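  // Note: asRetriever() defaults to fetching the top 4 most similar chunks.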
  const llm = new ChatCohere({
    apiKey: process.env.COHERE_API_KEY,
  });
  const qaSystemPrompt = `You are an assistant for question-answering tasks.
Use the following pieces of retrieved context to answer the question.
If you don't know the answer, just say that you don't know.
Use three sentences maximum and keep the answer concise.
{context}`;
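  // The {context} placeholder above is exactly the input variable the error
  // ("Missing value for input variable context") complains about; it must be
  // resolved to a string before this prompt is formatted.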
  const qaPrompt = ChatPromptTemplate.fromMessages([
    ["system", qaSystemPrompt],
    new MessagesPlaceholder("chat_history"),
    ["human", "{question}"],
  ]);
  // Prompt for condensing a follow-up question into a standalone question.
  // Reusing qaSystemPrompt for this (as the original snippet did) would itself
  // require a `context` value and re-trigger the same error, so a dedicated
  // prompt is used instead.
  const contextualizeQSystemPrompt = `Given a chat history and the latest user question
which might reference context in the chat history, formulate a standalone question
which can be understood without the chat history. Do NOT answer the question,
just reformulate it if needed and otherwise return it as is.`;
  const contextualizeQPrompt = ChatPromptTemplate.fromMessages([
    ["system", contextualizeQSystemPrompt],
    new MessagesPlaceholder("chat_history"),
    ["human", "{question}"],
  ]);
  const contextualizeQChain = contextualizeQPrompt
    .pipe(llm)
    .pipe(new StringOutputParser());
  // Route: condense the question only when chat history is present; otherwise
  // pass the raw question string through.
  const contextualizedQuestion = (input) => {
    if ("chat_history" in input) {
      return contextualizeQChain;
    }
    return input.question;
  };
  const ragChain = RunnableSequence.from([
    RunnablePassthrough.assign({
      context: async (input) => {
        if ("chat_history" in input) {
          const chain = contextualizedQuestion(input);
          // Invoke explicitly and await the string result (the fix).
          const context = await chain
            .pipe(retriever)
            .pipe(formatDocumentsAsString)
            .invoke(input);
          return context;
        }
        return "";
      },
    }),
    qaPrompt,
    llm,
  ]);
  // Keep the history as messages: record both the human turn and the model's
  // reply so the MessagesPlaceholder receives a coherent conversation.
  let chat_history = [];
  const question1 = "What is task decomposition?";
  const q1 = await ragChain.invoke({ chat_history, question: question1 });
  console.log(1, q1);
  chat_history.push(new HumanMessage(question1), q1);
  const question2 = "What are common ways of doing it? And doing what exactly?";
  const q2 = await ragChain.invoke({ chat_history, question: question2 });
  console.log(q2);
  chat_history.push(new HumanMessage(question2), q2);
};
main();
```

This should resolve the issue and ensure that the `context` input variable is populated with the retrieved, formatted documents before the prompt is rendered.
---
### Checked other resources

### Commit to Help

### Example Code

### Description
I am learning LangChain and was following the tutorial on conversational RAG. Partway through the code I ran into this error (even when running the exact code from the tutorial).
After some debugging and experimenting, it worked perfectly fine once I removed the `return` when providing the context to the model. For reference, this is the line I changed:

```js
return chain.pipe(retriever).pipe(formatDocumentsAsString);
```

I removed the `return` from this line.
But the documentation says: "Notice we add some routing functionality to only run the 'condense question chain' when our chat history isn't empty. Here we're taking advantage of the fact that if a function in an LCEL chain returns another chain, that chain will itself be invoked." It lays emphasis on returning the chain, yet the code works when the chain is not returned.
I wanted to know what is going on behind the scenes...
The error:

```
node:internal/process/promises:289
    triggerUncaughtException(err, true /* fromPromise */);
    ^

Error: Missing value for input variable `context`
```
### System Info

- linux
- node: v20.9.0
- npm: 10.1.0