Replies: 1 comment
🤖 Hello,

The error message "document_variable_name context was not found in llm_chain input_variables" means that the `StuffDocumentsChain` that `RetrievalQA` builds internally expects your prompt to contain a variable named `context` (the default `document_variable_name`, i.e. the slot the retrieved documents are stuffed into), but that name is missing from your prompt's `input_variables`.

To resolve this, make sure the prompt template declares both `context` and `question`:

```python
QA_CHAIN_PROMPT = PromptTemplate(
    input_variables=["context", "question"],
    template=template,
)
```

This way, the chain can fill `{context}` with the retrieved documents and `{question}` with your query. Also, make sure that the `template` string itself actually contains the `{context}` and `{question}` placeholders.
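For a complete picture, here is a minimal sketch of how the corrected prompt plugs into `RetrievalQA` via `chain_type_kwargs` (assumptions not in the original thread: an OpenAI model, a tiny in-memory FAISS index, and an `OPENAI_API_KEY` in the environment — everything except the prompt wiring is illustrative):

```python
from langchain.chains import RetrievalQA
from langchain.embeddings import OpenAIEmbeddings
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate
from langchain.vectorstores import FAISS

# The prompt must expose both variables the "stuff" chain fills in:
# {context} receives the retrieved documents, {question} the user query.
template = """Use the following pieces of context to answer the question at the end.
If you don't know the answer, just say that you don't know.

{context}

Question: {question}
Helpful Answer:"""

QA_CHAIN_PROMPT = PromptTemplate(
    input_variables=["context", "question"],
    template=template,
)

# Illustrative index built on the spot; in practice this would be your own
# vector store and retriever.
vectorstore = FAISS.from_texts(
    ["LangChain is a framework for building applications with LLMs."],
    embedding=OpenAIEmbeddings(),
)

qa_chain = RetrievalQA.from_chain_type(
    llm=OpenAI(temperature=0),
    chain_type="stuff",
    retriever=vectorstore.as_retriever(),
    # This is how a custom prompt reaches the underlying stuff chain.
    chain_type_kwargs={"prompt": QA_CHAIN_PROMPT},
)

result = qa_chain({"query": "What is LangChain?"})
print(result["result"])
```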
I hope this helps! Let me know if you have any other questions.
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
---
I'm trying to add a specific prompt template to my QA chain (`RetrievalQA`) so I can control how the model phrases its answers. This is my current code:

This is the error I'm currently getting:

Can't I define the prompt template with the variables I specify? Do I need to include a `context` variable, or something else? What is currently failing?
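To connect this to the answer above: yes, you can define the prompt with variables you choose, but the set must include the one the stuff chain injects documents into, which is `context` by default. Purely for illustration — this is a hypothetical snippet, not the original code — a prompt like the following triggers the error the moment the chain is constructed:

```python
from langchain.prompts import PromptTemplate

# No {context} variable: RetrievalQA's stuff chain has nowhere to put the
# retrieved documents, so building a chain with this prompt raises
# "document_variable_name context was not found in llm_chain input_variables".
bad_prompt = PromptTemplate(
    input_variables=["question"],
    template="Answer concisely: {question}",
)
```

Adding `{context}` to the template (and to `input_variables`) fixes it; alternatively, assuming the stuff-chain loader still accepts it as a keyword, a differently named slot can be requested via `chain_type_kwargs={"prompt": ..., "document_variable_name": "my_docs"}`.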