Bedrock Retriever / Synthesizer error #15608
-
Hey there, @brandon-orkestus! I'm here to help you out with any bugs or questions you have while waiting for a human maintainer. The error you're encountering looks like it's related to the format of the query being passed to the LLM. Here's how you can modify your code to pass the query in a chat-style message format.
Here's the updated code:

```python
from llama_index.core import get_response_synthesizer
from llama_index.core.response_synthesizers import ResponseMode
from llama_index.llms.bedrock import Bedrock
from llama_index.retrievers.bedrock import AmazonKnowledgeBasesRetriever

LLM_NAME = "anthropic.claude-3-sonnet-20240229-v1:0"
NUM_RESULTS = 10
MAX_TOKENS = 4_096

# Retrieve relevant nodes from the Bedrock knowledge base
# (aws_knowledge_base_id, query and temperature come from your existing setup)
retriever = AmazonKnowledgeBasesRetriever(
    knowledge_base_id=aws_knowledge_base_id,
    retrieval_config={"vectorSearchConfiguration": {"numberOfResults": NUM_RESULTS}},
)
retrieved_results = retriever.retrieve(query)

# Build the Bedrock LLM and a compact response synthesizer
llm = Bedrock(model=LLM_NAME, temperature=temperature, max_tokens=MAX_TOKENS)
response_synthesizer = get_response_synthesizer(
    response_mode=ResponseMode.COMPACT, llm=llm, verbose=True
)

# Format the query as a list of chat-style message dictionaries
formatted_query = [{"role": "user", "content": query}]
response = str(response_synthesizer.synthesize(formatted_query, retrieved_results))
```

This should resolve the error.
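If the chat-style list still triggers an error, note that `synthesize` also accepts the query as a plain string or a `QueryBundle`. A minimal sketch of that variant, reusing the objects from the snippet above:

```python
from llama_index.core.schema import QueryBundle

# Alternative: hand the synthesizer the raw query (wrapped in a QueryBundle)
# instead of a list of chat-style dictionaries.
query_bundle = QueryBundle(query_str=query)
response = str(response_synthesizer.synthesize(query_bundle, retrieved_results))
```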
-
I'm following the example here https://docs.llamaindex.ai/en/stable/examples/retrievers/bedrock_retriever/
This is my code
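It's essentially the retriever-plus-synthesizer setup from that example, condensed here into a sketch with placeholder values for the knowledge base ID, query, and temperature:

```python
from llama_index.core import get_response_synthesizer
from llama_index.core.response_synthesizers import ResponseMode
from llama_index.llms.bedrock import Bedrock
from llama_index.retrievers.bedrock import AmazonKnowledgeBasesRetriever

aws_knowledge_base_id = "<my-knowledge-base-id>"  # placeholder
query = "example question"                        # placeholder
temperature = 0.0

retriever = AmazonKnowledgeBasesRetriever(
    knowledge_base_id=aws_knowledge_base_id,
    retrieval_config={"vectorSearchConfiguration": {"numberOfResults": 10}},
)
retrieved_results = retriever.retrieve(query)

llm = Bedrock(
    model="anthropic.claude-3-sonnet-20240229-v1:0",
    temperature=temperature,
    max_tokens=4_096,
)
response_synthesizer = get_response_synthesizer(
    response_mode=ResponseMode.COMPACT, llm=llm, verbose=True
)

# The last line is the one that raises the error:
response = str(response_synthesizer.synthesize(query, retrieved_results))
```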
The last line gives me this error
Is anybody else facing this issue?
I'm using these dependencies: