
AI Evaluation SDK Ignores OpenTelemetry Parent Trace ID #43613

@BenConstable9

Description

  • Package Name: azure-ai-evaluation
  • Package Version: 1.11.0
  • Operating System: Linux / Windows
  • Python Version: 3.12

Describe the bug
When using the built-in evaluators together with the OpenTelemetry OpenAI instrumentation, the parent trace ID is not recorded on the spans emitted by the built-in evaluators.

To Reproduce
Steps to reproduce the behavior:

import os

from opentelemetry import trace
from opentelemetry.instrumentation.openai import OpenAIInstrumentor

from azure.ai.evaluation import RelevanceEvaluator, evaluate

# Instrument the OpenAI client so its calls emit OpenTelemetry spans.
OpenAIInstrumentor().instrument()

model_config = {
    "azure_endpoint": os.environ.get("AZURE_OPENAI_ENDPOINT"),
    "api_key": os.environ.get("AZURE_OPENAI_API_KEY"),
    "azure_deployment": os.environ.get("AZURE_OPENAI_DEPLOYMENT"),
}

relevance_evaluator = RelevanceEvaluator(model_config)

tracer = trace.get_tracer("Evaluations")
with tracer.start_as_current_span("TestEvaluations"):
    result = evaluate(
        data="data.jsonl",  # provide your data here
        evaluators={
            "relevance": relevance_evaluator,
        },
        # column mapping
        evaluator_config={
            "relevance": {
                "column_mapping": {
                    "query": "${data.queries}",
                    "ground_truth": "${data.ground_truth}",
                    "response": "${outputs.response}",
                },
            },
        },
        # Optionally provide your AI Foundry project information to track your evaluation results in your Azure AI Foundry project
        azure_ai_project=azure_ai_project,
        # Optionally provide an output path to dump a json of metric summary, row level data and metric and AI Foundry URL
        output_path="./evaluation_results.json",
    )
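
To make the missing parent visible without Application Insights, a minimal sketch along these lines can be used: it swaps in the OpenTelemetry SDK's InMemorySpanExporter and inspects the finished spans after the run. The evaluate() call is elided and should be filled in as in the repro above; this is an illustration, not part of the original report.

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor
from opentelemetry.sdk.trace.export.in_memory_span_exporter import InMemorySpanExporter

# Collect finished spans in memory so their trace and parent IDs can be inspected.
exporter = InMemorySpanExporter()
provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(exporter))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("Evaluations")
with tracer.start_as_current_span("TestEvaluations") as root:
    root_context = root.get_span_context()
    # ... run evaluate(...) here exactly as in the repro above ...

# Every span produced inside the block should share the root trace ID and carry a
# parent; per this report, the spans emitted during evaluate() do neither.
for span in exporter.get_finished_spans():
    print(
        span.name,
        "same trace:", span.context.trace_id == root_context.trace_id,
        "has parent:", span.parent is not None,
    )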

Expected behavior
The TestEvaluations span should be attached as the parent of any spans emitted by the OpenAI client inside evaluate.

Instead, the emitted spans contain no parent IDs, so they do not show up correctly in the Application Insights timeline.
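
For reference, the sketch below shows how an OpenTelemetry parent is normally carried across a worker-thread boundary with context.attach/detach. Whether azure-ai-evaluation drops the parent at such a boundary is only an assumption here, and run_with_parent is a hypothetical helper, not part of the SDK.

from concurrent.futures import ThreadPoolExecutor

from opentelemetry import context, trace

tracer = trace.get_tracer("Evaluations")

def run_with_parent(parent_ctx):
    # Re-attach the captured context so spans started on this worker thread become
    # children of TestEvaluations instead of starting a new, parentless trace.
    token = context.attach(parent_ctx)
    try:
        with tracer.start_as_current_span("child-on-worker-thread"):
            pass  # the real code would call the OpenAI client here
    finally:
        context.detach(token)

with tracer.start_as_current_span("TestEvaluations"):
    captured = context.get_current()  # snapshot the ambient context, including the active span
    with ThreadPoolExecutor() as pool:
        pool.submit(run_with_parent, captured).result()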


Labels

  • Client: This issue points to a problem in the data-plane of the library.
  • Evaluation: Issues related to the client library for Azure AI Evaluation.
  • Service Attention: This issue is the responsibility of the Azure service team.
  • customer-reported: Issues that are reported by GitHub users external to the Azure organization.
  • needs-team-attention: This issue needs attention from the Azure service team or SDK team.
  • question: The issue doesn't require a change to the product in order to be resolved.
