Replies: 2 comments 8 replies
-
🤖 Hello @minnahu23! 👋 I'm Dosu, a friendly bot here to assist while we await a human maintainer. I'm equipped to help solve bugs, answer questions, and guide you to contribute to this project. Let's get your issue sorted out!

The error you're encountering, "ValueError: Prompt must accept context as an input variable", means that the prompt you passed in does not declare "context" as an input variable. In LangChain's document chains, "context" is the variable through which the retrieved documents are passed to the language model. Your prompt only declares "chat_history" and "input", which is the cause of the error.

To resolve this, add "context" to the prompt's input variables by referencing {context} from a message template:

prompt = ChatPromptTemplate.from_messages(
    [
        SystemMessagePromptTemplate.from_template(
            "Answer the user's questions based on the below context:\n\n{context}"
        ),  # The persistent system prompt; {context} registers "context" as an input variable
        MessagesPlaceholder(
            variable_name="chat_history"
        ),  # Where the conversation history will be injected
        HumanMessagePromptTemplate.from_template(
            "{input}"
        ),  # Where the human input will be injected
    ]
)

In this configuration, "context" is one of the prompt's input variables and is used in the system template. Note that a plain SystemMessage holds literal text, so a {context} inside it would not be registered as an input variable; a SystemMessagePromptTemplate is needed. The prompt now accepts the retrieved context, the chat history, and the user input.

Please try this modification and let me know if it resolves your issue. If you encounter any other problems, feel free to ask.

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
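The check behind this error can be illustrated with a short, self-contained sketch: a template only "accepts" the variables that appear as placeholders in it. The helper names `input_variables` and `validate_prompt` below are illustrative, not LangChain's actual internals.

```python
import re

def input_variables(template: str) -> set:
    """Collect the {placeholder} names a prompt template exposes."""
    return set(re.findall(r"\{(\w+)\}", template))

def validate_prompt(template: str) -> None:
    """Raise the same kind of error when 'context' is missing."""
    found = input_variables(template)
    if "context" not in found:
        raise ValueError(
            "Prompt must accept context as an input variable. "
            f"Received prompt with input variables: {sorted(found)}"
        )

# The original prompt only exposed chat_history and input, so it fails:
try:
    validate_prompt("{chat_history}\n{input}")
except ValueError as err:
    print(err)

# Once {context} appears in a template, the validation passes:
validate_prompt("Answer based on this context:\n\n{context}\n{input}")
```

This is why moving {context} into a real message template (rather than a literal message) makes the error go away: it changes the set of input variables the prompt reports.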
-
@dosu-bot thank you, here is the code for retriever_chain:

loader = UnstructuredFileLoader("./knowledge.txt")
prompt_retriever = ChatPromptTemplate.from_messages([
-
I am trying to run the history-aware retriever example from the Quickstart page, https://python.langchain.com/docs/get_started/quickstart, but get an error: ValueError: Prompt must accept context as an input variable. Received prompt with input variables: ['chat_history', 'input']
How do I fix it? Thank you!
Here is my code:
# Imports assumed from the Quickstart (module paths vary across LangChain versions)
from langchain_community.document_loaders import UnstructuredFileLoader
from langchain_text_splitters import RecursiveCharacterTextSplitter
from langchain_community.vectorstores import FAISS
from langchain_openai import OpenAIEmbeddings

loader = UnstructuredFileLoader("./knowledge.txt")
docs = loader.load()
print(docs)
documents = RecursiveCharacterTextSplitter(
    chunk_size=1000, chunk_overlap=200
).split_documents(docs)
vector = FAISS.from_documents(documents, OpenAIEmbeddings())
retriever = vector.as_retriever()
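For background on what the chunk_size/chunk_overlap arguments mean, here is a rough sliding-window sketch. The real RecursiveCharacterTextSplitter splits recursively on separators such as paragraphs and sentences rather than at fixed offsets, so this only approximates the overlap behaviour; `sliding_chunks` is a hypothetical helper, not a LangChain function.

```python
def sliding_chunks(text: str, chunk_size: int, chunk_overlap: int) -> list:
    """Naive fixed-offset approximation of chunking with overlap."""
    if chunk_overlap >= chunk_size:
        raise ValueError("chunk_overlap must be smaller than chunk_size")
    step = chunk_size - chunk_overlap
    chunks = []
    for start in range(0, len(text), step):
        chunks.append(text[start:start + chunk_size])
        if start + chunk_size >= len(text):
            break
    return chunks

# Each chunk repeats the last 2 characters of the previous one:
print(sliding_chunks("abcdefghij", chunk_size=4, chunk_overlap=2))
# ['abcd', 'cdef', 'efgh', 'ghij']
```

The overlap exists so that a sentence cut at a chunk boundary still appears whole in the neighbouring chunk, which helps retrieval quality.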