3.16.0
Estimated end-of-life date, accurate to within three months: 08-2026
See the support level definitions for more information.
Upgrade Notes
- This change updates library injection logic to work under Python 3.14.
- This change adds support and tests for Python 3.14 to much of the library's functionality. The following products and integrations still do not work with Python 3.14:
- Profiling
- IAST
- datastreams
- ci_visibility
- pytest
- django - django version 6.1, which will be compatible with Python 3.14, is not yet released
- django_hosts - django version 6.1, which will be compatible with Python 3.14, is not yet released
- djangorestframework - django version 6.1, which will be compatible with Python 3.14, is not yet released
- django:celery - django version 6.1, which will be compatible with Python 3.14, is not yet released
- dramatiq - dramatiq doesn't yet have a release supporting 3.14
- grpc_aio - some tests in the suite don't work with pytest-asyncio >= 1.0
- rq - rq doesn't work with Python 3.14
- sqlite3 - pysqlite3-binary doesn't yet support Python 3.14
- opentelemetry - opentelemetry-exporter-otlp doesn't yet work with Python 3.14
- openai - tiktoken doesn't yet work with Python 3.14
- ai_guard_langchain - tiktoken doesn't yet work with Python 3.14
- openai_agents
- langchain
- langgraph - tiktoken doesn't yet work with Python 3.14
- litellm - tiktoken doesn't yet work with Python 3.14
- google_generativeai - protobuf doesn't yet work with Python 3.14
- vertexai
- crewai - tiktoken doesn't yet work with Python 3.14
- ray - ray doesn't yet work with Python 3.14
- kafka - confluent-kafka doesn't yet work with Python 3.14
- aws_lambda - datadog-lambda doesn't yet work with Python 3.14
- llmobs - ragas doesn't yet work with Python 3.14
- appsec_integrations_fastapi
Deprecation Notes
- vertica: The vertica integration is deprecated and will be removed in a future version, around the same time that ddtrace drops support for Python 3.9.
New Features
- opentelemetry: Adds default configurations for the OpenTelemetry Metrics API implementation to improve the Datadog user experience. This includes the following configurations:
  - `OTEL_EXPORTER_OTLP_METRICS_ENDPOINT` is set to the default Datadog Agent endpoint, or localhost if not found
  - `OTEL_EXPORTER_OTLP_METRICS_TEMPORALITY_PREFERENCE` is set to `delta`
  - `OTEL_METRIC_EXPORT_INTERVAL` is set to `10000`
  - `OTEL_METRIC_EXPORT_TIMEOUT` is set to `7500`
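As a concrete illustration, the defaults above are equivalent to exporting these variables yourself. This is a sketch, not the library's internal mechanism: the endpoint shown assumes a local Datadog Agent on the standard OTLP/HTTP port, and ddtrace applies its defaults only when the variables are unset.

```shell
# Equivalent explicit configuration (sketch; endpoint value is an assumption
# for a local Agent exposing OTLP over HTTP):
export OTEL_EXPORTER_OTLP_METRICS_ENDPOINT="http://localhost:4318"
export OTEL_EXPORTER_OTLP_METRICS_TEMPORALITY_PREFERENCE="delta"
export OTEL_METRIC_EXPORT_INTERVAL="10000"  # export interval, in milliseconds
export OTEL_METRIC_EXPORT_TIMEOUT="7500"    # export timeout, in milliseconds
```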
- LLM Observability: MCP integration also traces `ClientSession` contexts, `ClientSession.initialize`, and `ClientSession.list_tools`.
- ray: This introduces a Ray core integration that traces Ray jobs, remote tasks, and actor method calls. Supported for Ray >= 2.46.0. To enable tracing, start the Ray head with `--tracing-startup-hook=ddtrace.contrib.ray:setup_tracing`, then submit jobs as usual.
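A minimal end-to-end sketch of the workflow described above. The dashboard address and the `my_job.py` script name are placeholders, not values from this release; only the `--tracing-startup-hook` flag value comes from the note itself.

```shell
# Start the Ray head with ddtrace's tracing hook enabled
ray start --head --tracing-startup-hook=ddtrace.contrib.ray:setup_tracing

# Submit a job as usual (address and script are hypothetical placeholders)
ray job submit --address http://127.0.0.1:8265 -- python my_job.py
```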
Bug Fixes
- AAP: This fix resolves an issue where stream endpoints with daphne/django were unresponsive due to an asyncio error.
- CI Visibility: This fix resolves an issue where code imported at module level but not executed during a test would not be considered by Test Impact Analysis as impacting the test. For example, a test using a constant imported from another module would not count the constant definition among its impacting lines, because the constant definition is executed when the module is imported, not during the test. With this change, code executed at import time is also included among the impacted lines of a test.
- google-adk: Fixes an `AttributeError` that could occur when tracing Google ADK agent runs, due to the agent `model` attribute not being defined for the `SequentialAgent` class.
- opentelemetry: Fixes the parsing of OTLP metrics exporter configurations and the logic that automatically appends the `v1/metrics` path to HTTP OTLP endpoints.
- langchain: Resolves an issue where langchain patching would throw an `ImportError` when using `langchain_core>=0.3.76`.
- LLM Observability:
  - Ensures APM is disabled when `DD_APM_TRACING_ENABLED=0` is set while using LLM Observability.
  - Resolves an issue where model IDs were not parsed correctly if the model ID was an inference profile ID in the bedrock integration.
  - Enables the backend to differentiate AI Obs spans from other DJM spans, so that customers are not billed for AI Observability spans as part of their APM bill.
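To illustrate the first fix above, a hypothetical invocation that enables LLM Observability while disabling APM tracing; the `DD_LLMOBS_ENABLED` variable and `app.py` script are assumptions for the sketch, not part of this release note.

```shell
# Run an app with LLM Observability on and APM tracing off (sketch)
export DD_LLMOBS_ENABLED=1        # assumed LLM Observability toggle
export DD_APM_TRACING_ENABLED=0   # the setting named in the note above
ddtrace-run python app.py         # app.py is a placeholder
```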
Other Changes
- sampling: Adds more debug logs to help diagnose sampling issues.