RetrievalQA while using multiple input parameters in PromptTemplate #10269
Replies: 2 comments 1 reply
-
🤖 Hello,
From your code, it seems like you're trying to use the RetrievalQA chain with a custom PromptTemplate. To use multiple input variables with the PromptTemplate, you can simply list them in its input_variables. In your case, if you want to include both the output of vector stores and the standard API response, you can add these as new variables in the prompt template.
Here's an example of how you can do this:
prompt_template = """Use the following pieces of context to answer the question at the end. If you don't know the answer, just say that you don't know, don't try to make up an answer.
{context}
Vector Store Output: {vector_store_output}
API Response: {api_response}
Question: {question}
Helpful Answer:"""
PROMPT = PromptTemplate(
template=prompt_template, input_variables=["context", "vector_store_output", "api_response", "question"]
)
In this example, vector_store_output and api_response are new input variables alongside the standard context and question; you would fill them with the vector store output and the API response when the prompt is formatted. You can find more information about how prompt templates work in the LangChain documentation. I hope this helps! Let me know if you have any other questions.
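The reply above shows the prompt but not how it gets wired into a chain, so here is a minimal, hedged sketch (not part of the original answer) of one way to do it: pre-fill the extra variables with PromptTemplate.partial, so that RetrievalQA itself only has to supply context and question. The LLM, embeddings, example document, and example values below are all placeholders.

from langchain.chains import RetrievalQA
from langchain.embeddings import OpenAIEmbeddings
from langchain.llms import OpenAI
from langchain.vectorstores import FAISS

# Placeholder corpus and models; swap in your own retriever and LLM.
vectorstore = FAISS.from_texts(["example document text"], OpenAIEmbeddings())
llm = OpenAI(temperature=0)

# PROMPT is the PromptTemplate defined in the example above.
# RetrievalQA only supplies "context" and "question", so fill the extra
# variables up front; the values here are placeholders.
filled_prompt = PROMPT.partial(
    vector_store_output="<output from another vector store>",
    api_response="<standard API response>",
)

qa_chain = RetrievalQA.from_chain_type(
    llm=llm,
    chain_type="stuff",
    retriever=vectorstore.as_retriever(),
    chain_type_kwargs={"prompt": filled_prompt},
)

result = qa_chain({"query": "What does the API response say?"})
print(result["result"])

If the extra values change from query to query, the partial prompt can simply be rebuilt right before each call.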
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
-
@dosu_bot how would I pass in the "vector_store_output" and "api_response" variables into the RetrievalQA chain called qa_chain?
-
Hi, I am having trouble using multiple input variables with the RetrievalQA chain.
My code is as below.
The idea was that, along with the output of the vector stores, I wanted the LLM to also take a standard API response into account, and maybe later extend this to include web-search responses, with different weights given to each kind of source.
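(The code block referenced above did not load on this page. For context only, here is a rough, hypothetical sketch, not the original poster's code, of the idea described: gather evidence from several sources, label each with a weight, and pass the combined text to the LLM through a single prompt variable. The weights, source texts, and model below are made up.)

from langchain.chains import LLMChain
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate

weighted_prompt = PromptTemplate(
    template=(
        "Answer the question using the sources below. When sources disagree, "
        "prefer the ones with higher weight.\n\n"
        "{sources}\n\n"
        "Question: {question}\n"
        "Helpful Answer:"
    ),
    input_variables=["sources", "question"],
)

def build_sources(vector_store_output: str, api_response: str, web_search: str = "") -> str:
    # Hypothetical weights for each evidence source; tune as needed.
    parts = [
        ("Vector store (weight 0.6)", vector_store_output),
        ("Standard API (weight 0.3)", api_response),
        ("Web search (weight 0.1)", web_search),
    ]
    return "\n\n".join(f"{name}:\n{text}" for name, text in parts if text)

chain = LLMChain(llm=OpenAI(temperature=0), prompt=weighted_prompt)
answer = chain.run(
    sources=build_sources("retrieved passages...", "API says the service is up"),
    question="Is the service currently available?",
)
print(answer)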