opentelemetry dependency of pydantic-ai makes AWS lambda crash at import time #2985

Description

When running pydantic-ai in an AWS Lambda (Python 3.12 or 3.13), pydantic-ai imports opentelemetry:
from opentelemetry.trace import NoOpTracer, use_span
which runs an initialization step, _load_runtime_context, at import time
in /opentelemetry/context/__init__.py

and this runtime context function double-fails 😅

[ERROR] StopIteration
Traceback (most recent call last):
  File "/var/lang/lib/python3.13/importlib/__init__.py", line 88, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1331, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 935, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 1026, in exec_module
  File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
  File "/var/task/handlers/dummy_lambda.py", line 1, in <module>
    import pydantic_ai  # noqa: F401
  File "/opt/python/pydantic_ai/__init__.py", line 3, in <module>
    from .agent import (
  File "/opt/python/pydantic_ai/agent/__init__.py", line 13, in <module>
    from opentelemetry.trace import NoOpTracer, use_span
  File "/opt/python/opentelemetry/trace/__init__.py", line 85, in <module>
    from opentelemetry import context as context_api
  File "/opt/python/opentelemetry/context/__init__.py", line 70, in <module>
    _RUNTIME_CONTEXT = _load_runtime_context()
  File "/opt/python/opentelemetry/context/__init__.py", line 60, in _load_runtime_context
    return next(  # type: ignore
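
For context, the failing call resolves the runtime context implementation through Python entry points. A simplified sketch of what I believe /opentelemetry/context/__init__.py does at import time (reconstructed from the traceback, not the exact source):

from importlib.metadata import entry_points
from os import environ

def _load_runtime_context():
    # The implementation can be overridden via OTEL_PYTHON_CONTEXT;
    # "contextvars_context" is the default.
    configured = environ.get("OTEL_PYTHON_CONTEXT", "contextvars_context")
    # next() raises StopIteration when no matching entry point exists,
    # e.g. when the package's *.dist-info metadata is missing from the
    # deployment artifact, so importing opentelemetry itself crashes.
    return next(
        iter(entry_points(group="opentelemetry_context", name=configured))
    ).load()()

If the StopIteration really comes from that next() call, it would mean the opentelemetry_context entry point group is empty inside the Lambda environment.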

I tried looking into how to fix opentelemetry for Lambda by tinkering with the environment variables and by adding the opentelemetry Lambda layers, but I was not able to fix the problem. Technically this bug is not in pydantic-ai, since the failure happens inside opentelemetry (or it may be a packaging bug in the Serverless Framework). Still, I was wondering: is this opentelemetry feature necessary, or is it something optional that could be deactivated?

Example Code

lambda handler

import pydantic_ai  # noqa: F401
from aws_lambda_powertools import Logger
from aws_lambda_powertools.event_handler.exceptions import InternalServerError
from aws_lambda_powertools.utilities.typing import LambdaContext

LOGGER = Logger()


def handler(event: dict, context: LambdaContext):
    LOGGER.info("Received event: %s", event)
    try:
        return {"statusCode": 200, "body": "Hello from Dummy Lambda!"}
    except Exception as err:
        LOGGER.exception("Lambda encountered an error.")
        raise InternalServerError("Lambda encountered an error.") from err
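
To check whether the entry point metadata survived packaging, here is a quick diagnostic that could be dropped into the handler (hypothetical check, I have not run it in Lambda):

from importlib.metadata import entry_points

# Assumption: an empty list here means the opentelemetry_context entry
# points were stripped during packaging, which would explain the
# StopIteration raised at import time.
print(list(entry_points(group="opentelemetry_context")))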

I'm using the Serverless Framework to deploy the Lambda and package the dependencies. Serverless config:

  service: test
  frameworkVersion: "3"

  provider:
    name: aws
    runtime: python3.13
    stage: ${opt:stage, 'dev'}
    region: ${opt:region, 'ap-northeast-1'}

  functions:
    dummy-lambda:
      handler: handlers.dummy_lambda.handler
      layers:
        - !Ref PythonRequirementsLambdaLayer

  plugins:
    - serverless-python-requirements

  custom:
    pythonRequirements:
      dockerizePip: true
      layer: true
      slim: true

Python, Pydantic AI & LLM client version

python3.13
pydantic-ai-slim[openai]==1.0.8
(using the slim package because full pydantic-ai is too big for a base Lambda)
