Commit 66d236c
fix: is_async not passed for llm generation in context recall (#777)
In `_context_recall.py` the `is_async` flag is not passed to `llm.generate`, unlike in the other metrics. This causes failures when using APIs other than OpenAI. Example:

```python
import os

from datasets import Dataset
from ragas import evaluate
from ragas.metrics import context_recall

os.environ["OPENAI_API_KEY"] = ""
os.environ["RAGAS_DO_NOT_TRACK"] = "true"

questions = [
    "Where is France and what is its capital?",
]
ground_truths = [
    "France is in Western Europe and its capital is Paris.",
]
contexts = [
    [
        "The country is also renowned for its wines and sophisticated cuisine. Lascaux’s ancient cave drawings, Lyon’s Roman theater and",
        "The country is also renowned for its wines and sophisticated cuisine. Lascaux’s ancient cave drawings, Lyon’s Roman theater and",
    ]
]

data = {
    "question": questions,
    "contexts": contexts,
    "ground_truth": ground_truths,
}

# Convert dict to dataset
dataset = Dataset.from_dict(data)

# OpenAI works
result = evaluate(
    dataset=dataset,
    metrics=[context_recall],
)
print(result.to_pandas())

# Bedrock does not work
os.environ["AWS_ACCESS_KEY_ID"] = ""
os.environ["AWS_SECRET_ACCESS_KEY"] = ""

from langchain_community.llms.bedrock import Bedrock

llm = Bedrock(model_id="anthropic.claude-v2:1")

result = evaluate(
    dataset=dataset,
    metrics=[context_recall],
    llm=llm,
)
# Not working; times out with the error:
# "Streaming must be set to True for async operations."
print(result.to_pandas())
```
1 parent da5e981 commit 66d236c

File tree

1 file changed: +1 −1 lines changed
src/ragas/metrics/_context_recall.py (1 addition, 1 deletion)

```diff
@@ -121,7 +121,7 @@ async def _ascore(self, row: t.Dict, callbacks: Callbacks, is_async: bool) -> fl
         assert self.llm is not None, "set LLM before use"

         result = await self.llm.generate(
-            self._create_context_recall_prompt(row), callbacks=callbacks
+            self._create_context_recall_prompt(row), callbacks=callbacks, is_async=is_async
         )
         response = await json_loader.safe_load(
             result.generations[0][0].text, self.llm, is_async=is_async
```
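The shape of the bug can be illustrated with a minimal, hypothetical sketch (the class and function names below are simplified stand-ins, not ragas's real API): when a caller forgets to forward `is_async`, the wrapper silently falls back to its default path, which breaks backends that only support one execution mode.

```python
import asyncio


class LLMWrapper:
    """Hypothetical stand-in for an LLM wrapper; not the real ragas API."""

    async def generate(self, prompt: str, is_async: bool = True) -> str:
        # Dispatch on the flag: use the native async path when allowed,
        # otherwise run the sync backend in a worker thread.
        if is_async:
            return f"async:{prompt}"
        return await asyncio.to_thread(lambda: f"sync:{prompt}")


async def ascore(llm: LLMWrapper, prompt: str, is_async: bool) -> str:
    # The fix in this commit amounts to forwarding is_async explicitly
    # instead of letting generate() fall back to its default.
    return await llm.generate(prompt, is_async=is_async)


result = asyncio.run(ascore(LLMWrapper(), "hello", is_async=False))
print(result)  # sync:hello
```

Without the forwarded flag, `generate` would always take the async branch, which is exactly the failure mode seen with Bedrock ("Streaming must be set to True for async operations").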

0 commit comments
