Releases: DataDog/dd-trace-py
2.14.0
Deprecation Notes
- Tracing
  - Deprecates the `DD_TRACE_SPAN_AGGREGATOR_RLOCK` environment variable. It will be removed in v3.0.0.
  - Deprecates support for APM Legacy App Analytics. This feature and its associated configuration options are deprecated and will be removed in v3.0.0.
  - The `DD_HTTP_CLIENT_TAG_QUERY_STRING` configuration is deprecated and will be removed in v3.0.0. Use `DD_TRACE_HTTP_CLIENT_TAG_QUERY_STRING` instead.
New Features
- DSM
  - Introduces new tracing and datastreams monitoring functionality for Avro Schemas.
  - Introduces new tracing and datastreams monitoring functionality for Google Protobuf.
- LLM Observability
  - Adds support to automatically submit Gemini Python SDK calls to LLM Observability.
  - The OpenAI integration now captures tool calls returned from streamed responses when making calls to the chat completions endpoint.
  - The LangChain integration now submits tool spans to LLM Observability.
  - LLM Observability spans generated by the OpenAI integration now have updated span name and `model_provider` values. Span names are now prefixed with the OpenAI client name (possible values: `OpenAI`/`AzureOpenAI`) instead of the default `openai` prefix to better differentiate whether the request was made to Azure OpenAI or OpenAI. The `model_provider` field also now corresponds to `openai` or `azure_openai` based on the OpenAI client.
  - The OpenAI integration now ensures accurate token data from streamed OpenAI completions and chat completions, if provided in the streamed response. To ensure accurate token data in the traced streamed operation, ensure that `stream_options={"include_usage": True}` is set on the completion or chat completion call.
  - Introduces the `LLMObs.annotation_context()` context manager method, which allows modifying the tags of integration-generated LLM Observability spans created while the context manager is active.
  - Introduces prompt template annotation, which can be passed as an argument to `LLMObs.annotate(prompt={...})` for LLM span kinds. For more information on prompt annotations, see the docs.
  - google_generativeai: Introduces tracing support for Google Gemini API `generate_content` calls. See the docs for more information.
  - openai: The OpenAI integration now includes a new `openai.request.client` tag with the possible values `OpenAI`/`AzureOpenAI` to help differentiate whether the request was made to Azure OpenAI or OpenAI.
  - openai: The OpenAI integration now captures token data from streamed completions and chat completions, if provided in the streamed response. To ensure accurate token data in the traced streamed operation, ensure that `stream_options={"include_usage": True}` is set on the completion or chat completion call.
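The `include_usage` notes above can be pictured with a small simulation: when the option is set, the final streamed chunk carries a `usage` object that the integration can read for exact token counts. The chunk dicts below are illustrative stand-ins for the SDK's response objects, not the OpenAI client itself:

```python
# Sketch of why stream_options={"include_usage": True} matters: with it set,
# the last chunk of a streamed chat completion carries a usage object, so a
# tracer can report exact token counts instead of estimating them.

def accumulate_stream(chunks):
    """Concatenate streamed content and pick up usage from the final chunk."""
    text, usage = [], None
    for chunk in chunks:
        for choice in chunk.get("choices", []):
            text.append(choice.get("delta", {}).get("content") or "")
        if chunk.get("usage") is not None:  # only present on the last chunk
            usage = chunk["usage"]
    return "".join(text), usage

stream = [
    {"choices": [{"delta": {"content": "Hel"}}], "usage": None},
    {"choices": [{"delta": {"content": "lo"}}], "usage": None},
    {"choices": [], "usage": {"prompt_tokens": 5, "completion_tokens": 2, "total_tokens": 7}},
]
content, usage = accumulate_stream(stream)  # "Hello", total_tokens == 7
```

On a real call, this corresponds to passing `stream_options={"include_usage": True}` alongside `stream=True`; without it, the streamed response carries no usage data and the integration has nothing exact to record.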
- Profiling
  - Captures `asyncio.Lock` usages with `with` context managers.
- Other
  - botocore: Adds span pointers to some successful AWS botocore spans. Currently only supports S3 PutObject.
  - pymongo: Adds support for pymongo>=4.9.0.
Bug Fixes
- Code Security (ASM)
  - Fixes a bug in the IAST patching process where `AttributeError` exceptions were being caught, interfering with the proper application cycle.
  - Resolves an issue where exploit prevention was not properly blocking requests with custom redirection actions.
- LLM Observability
  - Fixes an issue where the OpenAI and LangChain integrations would still submit integration metrics even in agentless mode. Integration metrics are now disabled if using agentless mode via `LLMObs.enable(agentless_enabled=True)` or setting `DD_LLMOBS_AGENTLESS_ENABLED=1`.
  - Resolves an issue in the `LLMObs.annotate()` method where non-JSON-serializable arguments were discarded entirely. Now, the `LLMObs.annotate()` method safely handles non-JSON-serializable arguments by defaulting to a placeholder text.
  - Resolves an issue where attempting to tag non-JSON-serializable request/response parameters resulted in a `TypeError` in the OpenAI, LangChain, Bedrock, and Anthropic integrations.
  - anthropic: Resolves an issue where attempting to tag non-JSON-serializable request arguments caused a `TypeError`. The Anthropic integration now safely tags non-JSON-serializable arguments with a default placeholder text.
  - langchain: Resolves an issue where attempting to tag non-JSON-serializable tool config arguments resulted in a `TypeError`. The LangChain integration now safely tags non-JSON-serializable arguments with a default placeholder text.
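The non-JSON-serializable fixes above all converge on the same defensive pattern: attempt serialization, and fall back to a placeholder rather than raising a `TypeError` or discarding the value. A minimal sketch of that pattern (the `safe_json` helper and placeholder text are illustrative, not ddtrace's internals):

```python
import json

# Illustrative placeholder format; ddtrace's actual placeholder text may differ.
PLACEHOLDER = "[Unserializable object: %s]"

def safe_json(value):
    """Serialize to JSON, substituting a placeholder string for anything the
    json module cannot handle, instead of raising or dropping the value."""
    return json.dumps(value, default=lambda o: PLACEHOLDER % type(o).__name__)

class ToolConfig:  # stand-in for a non-JSON-serializable SDK object
    pass

ok = safe_json({"temperature": 0.2})            # normal path: valid JSON
fallback = safe_json({"tool_config": ToolConfig()})  # placeholder path
```

The `default=` hook fires only for values `json.dumps` cannot encode, so ordinary arguments serialize unchanged while opaque SDK objects degrade gracefully to a labeled placeholder.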
- Other
  - SSI: Ensures the injection denylist is included in the published OCI package.
  - postgres: Fixes circular imports raised when psycopg automatic instrumentation is enabled.
  - pymongo: Ensures instances of `pymongo.MongoClient` can be patched after pymongo is imported.
2.14.0rc1
Deprecation Notes
- tracing: Deprecates the `DD_TRACE_SPAN_AGGREGATOR_RLOCK` environment variable. It will be removed in 3.0.0.
- tracing: Deprecates support for APM Legacy App Analytics. This feature and its associated configuration options are deprecated and will be removed in a future release.
- tracing: The `DD_HTTP_CLIENT_TAG_QUERY_STRING` configuration is deprecated and will be removed in v3.0.0. Use `DD_TRACE_HTTP_CLIENT_TAG_QUERY_STRING` instead.
New Features
- google_generativeai: Introduces tracing support for Google Gemini API `generate_content` calls. See the docs for more information.
- LLM Observability: Adds support to automatically submit Gemini Python SDK calls to LLM Observability.
- LLM Observability: The OpenAI integration now captures tool calls returned from streamed responses when making calls to the chat completions endpoint.
- LLM Observability: The LangChain integration now submits tool spans to LLM Observability.
- openai: The OpenAI integration now includes a new `openai.request.client` tag with the possible values `OpenAI`/`AzureOpenAI` to help differentiate whether the request was made to Azure OpenAI or OpenAI.
- LLM Observability: LLM Observability spans generated by the OpenAI integration now have updated span name and `model_provider` values. Span names are now prefixed with the OpenAI client name (possible values: `OpenAI`/`AzureOpenAI`) instead of the default `openai` prefix to better differentiate whether the request was made to Azure OpenAI or OpenAI. The `model_provider` field also now corresponds to `openai` or `azure_openai` based on the OpenAI client.
- openai: The OpenAI integration now captures token data from streamed completions and chat completions, if provided in the streamed response. To ensure accurate token data in the traced streamed operation, ensure that `stream_options={"include_usage": True}` is set on the completion or chat completion call.
- LLM Observability: The OpenAI integration now ensures accurate token data from streamed OpenAI completions and chat completions, if provided in the streamed response. To ensure accurate token data in the traced streamed operation, ensure that `stream_options={"include_usage": True}` is set on the completion or chat completion call.
- DSM: Introduces new tracing and datastreams monitoring functionality for Avro Schemas.
- DSM: Introduces new tracing and datastreams monitoring functionality for Google Protobuf.
- LLM Observability: Introduces the `LLMObs.annotation_context()` context manager method, which allows modifying the tags of integration-generated LLM Observability spans created while the context manager is active.
- profiling: Captures `asyncio.Lock` usages with `with` context managers.
- pymongo: Adds support for pymongo>=4.9.0.
- botocore: Adds span pointers to some successful AWS botocore spans. Currently only supports S3 PutObject.
- LLM Observability: Introduces prompt template annotation, which can be passed as an argument to `LLMObs.annotate(prompt={...})` for LLM span kinds. For more information on prompt annotations, see Annotating a Span.
Bug Fixes
- library injection: Resolves an issue where the version of `attrs` installed by default on some Ubuntu installations was treated as incompatible with library injection.
- anthropic: Resolves an issue where attempting to tag non-JSON-serializable request arguments caused a `TypeError`. The Anthropic integration now safely tags non-JSON-serializable arguments with a default placeholder text.
- LLM Observability: Fixes an issue where the OpenAI and LangChain integrations would still submit integration metrics even in agentless mode. Integration metrics are now disabled if using agentless mode via `LLMObs.enable(agentless_enabled=True)` or setting `DD_LLMOBS_AGENTLESS_ENABLED=1`.
- LLM Observability: Resolves an issue in the `LLMObs.annotate()` method where non-JSON-serializable arguments were discarded entirely. Now, the `LLMObs.annotate()` method safely handles non-JSON-serializable arguments by defaulting to a placeholder text.
- LLM Observability: Resolves an issue where attempting to tag non-JSON-serializable request/response parameters resulted in a `TypeError` in the OpenAI, LangChain, Bedrock, and Anthropic integrations.
- langchain: Resolves an issue where attempting to tag non-JSON-serializable tool config arguments resulted in a `TypeError`. The LangChain integration now safely tags non-JSON-serializable arguments with a default placeholder text.
- SSI: Ensures the injection denylist is included in the published OCI package.
- postgres: Fixes circular imports raised when psycopg automatic instrumentation is enabled.
- ASM: Resolves an issue where exploit prevention was not properly blocking requests with custom redirection actions.
- Code Security: Fixes a bug in the IAST patching process where `AttributeError` exceptions were being caught, interfering with the proper application cycle.
- kafka: Fixes an issue where a `TypeError` exception would be raised if the first message's `topic()` returned `None` during consumption.
- kinesis: Resolves an issue where unparsable data in a Kinesis record would cause a `NoneType` error.
- pymongo: Ensures instances of `pymongo.MongoClient` can be patched after pymongo is imported.
2.12.2
Bug Fixes
- library injection: Resolves an issue where the version of `attrs` installed by default on some Ubuntu installations was treated as incompatible with library injection.
- Code Security: Fixes a bug in the IAST patching process where `AttributeError` exceptions were being caught, interfering with the proper application cycle.
2.11.6
Bug Fixes
- library injection: Resolves an issue where the version of `attrs` installed by default on some Ubuntu installations was treated as incompatible with library injection.
- Code Security: Fixes a bug in the IAST patching process where `AttributeError` exceptions were being caught, interfering with the proper application cycle.
2.12.1
Bug Fixes
- SSI: Ensures the injection denylist is included in the published OCI package.
2.11.5
Bug Fixes
- SSI: Ensures the injection denylist is included in the published OCI package.
2.13.0rc1
Deprecation Notes
- tracing: All public patch modules are deprecated. The non-deprecated methods are included in the `__all__` attribute.
- yaaredis: The yaaredis integration is deprecated and will be removed in a future version. As an alternative to the yaaredis integration, the redis integration should be used.
- tracing: Deprecates the `priority_sampling` argument in `ddtrace.tracer.Tracer.configure(...)`.
New Features
- Datastreams Monitoring (DSM): Adds support for schema tracking.
- debugging: Exception Replay will capture any exceptions that are manually attached to a span with a call to `set_exc_info`.
- LLM Observability: The LangChain integration now submits vectorstore `similarity_search` spans to LLM Observability as retrieval spans.
- langchain: Adds support for tracing tool invocations.
- LLM Observability: Adds support for capturing tool calls returned from LangChain chat completions.
- LLM Observability: Introduces the ability to set `ml_app` and `timestamp_ms` fields in `LLMObs.submit_evaluation`.
- openai: Introduces the `model` tag for openai integration metrics for consistency with the OpenAI SaaS Integration. It has the same value as `openai.request.model`.
Bug Fixes
- CI Visibility: Resolves an issue where exceptions other than timeouts and connection errors raised while fetching the list of skippable tests for ITR were not being handled correctly and caused the tracer to crash.
- CI Visibility: Fixes a bug where `.git` was incorrectly being stripped from repository URLs when extracting service names, resulting in `g`, `i`, or `t` being removed (e.g. `test-environment.git` incorrectly becoming `test-environmen`).
- botocore: Resolves a regression where trace context was not being injected into the input of Step Functions `start_execution` commands. This re-enables distributed tracing when a Python service invokes a properly instrumented Step Function.
- LLM Observability: Resolves an issue where custom trace filters were being overwritten in forked processes.
- LLM Observability: Resolves an issue where LLM Observability spans were not being submitted in forked processes, such as when using `celery` or `gunicorn` workers. The LLM Observability writer thread now automatically restarts when a forked process is detected.
- tracing: Fixes a side-effect issue with module import callbacks that could cause a runtime exception.
- tracing: Fixes an issue with some module imports with native specs that don't support attribute assignments, resulting in a `TypeError` exception at runtime.
- tracing: Improves the accuracy of the `X-Datadog-Trace-Count` payload header.
- tracing: Resolves an issue where `ddtrace` package files were published with incorrect file attributes.
- tracing: Resolves an issue where Django DB instrumentation could fail.
- LLM Observability: Resolves an issue where `session_id` was being defaulted to `trace_id`, which was causing unexpected UI behavior.
- openai: Fixes a bug where `asyncio.TimeoutError`s were not being propagated correctly from canceled OpenAI API requests.
- profiling: Propagates tags in `DD_PROFILING_TAGS` and `DD_TAGS` to the libdatadog exporter, the new exporter codepath, which is enabled when any of `DD_PROFILING_STACK_V2_ENABLED`, `DD_PROFILING_EXPORT_LIBDD_ENABLED`, or `DD_PROFILING_TIMELINE_ENABLED` is set, or when dd-trace-py is running in an injected environment.
- Security: Fixes a memory leak in the native slice aspect.
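The `.git`-stripping bug above is a classic `str.rstrip` pitfall: `rstrip(".git")` removes any trailing run of the characters `.`, `g`, `i`, `t` rather than the literal suffix. A sketch of the failure mode and the suffix-safe fix (helper names are illustrative, not dd-trace-py's actual code):

```python
def service_name_buggy(repo_url: str) -> str:
    # rstrip(".git") strips ANY trailing '.', 'g', 'i', or 't' characters,
    # so repository names that merely end in those letters get mangled.
    return repo_url.rsplit("/", 1)[-1].rstrip(".git")

def service_name_fixed(repo_url: str) -> str:
    # removesuffix (Python 3.9+) removes only the literal ".git" suffix.
    return repo_url.rsplit("/", 1)[-1].removesuffix(".git")

url = "https://example.com/org/test-environment.git"
buggy = service_name_buggy(url)  # "test-environmen": trailing 't' wrongly eaten
fixed = service_name_fixed(url)  # "test-environment"
```

On older Pythons without `removesuffix`, the equivalent guard is `name[:-4] if name.endswith(".git") else name`.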
Other Changes
- tracing: Removes the `DD_PRIORITY_SAMPLING` configuration option. This option is not used in any `ddtrace>=2.0` releases.
2.12.0
New Features
- openai: Introduces the `model` tag for openai integration metrics for consistency with the OpenAI SaaS Integration. It has the same value as `openai.request.model`.
- database_clients: Adds the `server.address` tag to all `<database>.query` spans (e.g. `postgres.query`). This tag stores the name of the database host.
- LLM Observability: Flushes the buffer of spans to be sent when the payload size would otherwise exceed the payload size limit for the event platform.
- LLM Observability: Span events that exceed the event platform event size limit (1 MB) will now have their inputs and outputs dropped.
- tracing: Adds `ddtrace.trace.Context` to the public API. This class can now be used to propagate context across execution boundaries (e.g. threads).
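The `ddtrace.trace.Context` addition above exists because trace state does not automatically follow work onto a new thread; it must be captured in the parent and handed over explicitly. A generic stdlib illustration of that hand-off using `contextvars` (this is the underlying pattern, not ddtrace's API):

```python
import contextvars
import threading

# Stand-in for trace state; in ddtrace this role is played by the active Context.
trace_id = contextvars.ContextVar("trace_id", default=None)
results = {}

def handler(name):
    results[name] = trace_id.get()  # read whatever context this thread sees

trace_id.set("abc123")

# A new thread starts with a fresh contextvars context, so the value set
# above is invisible to it: the handler sees the default (None).
t1 = threading.Thread(target=handler, args=("bare",))
t1.start(); t1.join()

# Capturing the current context and running the handler inside it carries the
# value across the thread boundary; analogous to passing a Context object.
ctx = contextvars.copy_context()
t2 = threading.Thread(target=lambda: ctx.run(handler, "propagated"))
t2.start(); t2.join()
```

With ddtrace, the release note's intent is the same: capture the active `ddtrace.trace.Context` in the parent and activate it in the worker; the exact calls are documented in the ddtrace API reference.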
Deprecation Notes
- config: `DD_TRACE_128_BIT_TRACEID_LOGGING_ENABLED` is deprecated. Trace id logging format is now configured automatically.
- tracing: Deprecates all modules in the `ddtrace.contrib.[integration_name]` package. Use attributes exposed in `ddtrace.contrib.[integration_name].__all__` instead. The following are impacted: `aioredis`, `algoliasearch`, `anthropic`, `aredis`, `asgi`, `asyncpg`, `aws_lambda`, `boto`, `botocore`, `bottle`, `cassandra`, `celery`, `cherrypy`, `consul`, `coverage`, `django`, `dogpile_cache`, `dramatiq`, `elasticsearch`, `falcon`, `fastapi`, `flask`, `flask_cache`, `futures`, `gevent`, `graphql`, `grpc`, `httplib`, `httpx`, `jinja2`, `kafka`, `kombu`, `langchain`, `logbook`, `logging`, `loguru`, `mako`, `mariadb`, `molten`, `mongoengine`, `mysql`, `mysqldb`, `openai`, `psycopg`, `pylibmc`, `pymemcache`, `pymongo`, `pymysql`, `pynamodb`, `pyodbc`, `pyramid`, `redis`, `rediscluster`, `requests`, `sanic`, `snowflake`, `sqlalchemy`, `sqlite3`, `starlette`, `structlog`, `subprocess`, `tornado`, `urllib`, `urllib3`, `vertica`, `webbrowser`, `wsgi`, `yaaredis`
Bug Fixes
- CI Visibility: Resolves an issue where exceptions other than timeouts and connection errors raised while fetching the list of skippable tests for ITR were not being handled correctly and caused the tracer to crash.
- CI Visibility: Fixes a bug where `.git` was incorrectly being stripped from repository URLs when extracting service names, resulting in `g`, `i`, or `t` being removed (e.g. `test-environment.git` incorrectly becoming `test-environmen`).
- LLM Observability: Resolves an issue where custom trace filters were being overwritten in forked processes.
- tracing: Fixes a side-effect issue with module import callbacks that could cause a runtime exception.
- LLM Observability: Resolves an issue where `session_id` was being defaulted to `trace_id`, which was causing unexpected UI behavior.
- LLM Observability: Resolves an issue where LLM Observability spans were not being submitted in forked processes, such as when using `celery` or `gunicorn` workers. The LLM Observability writer thread now automatically restarts when a forked process is detected.
- tracing: Fixes an issue with some module imports with native specs that don't support attribute assignments, resulting in a `TypeError` exception at runtime.
- tracing: Resolves an issue where `ddtrace` package files were published with incorrect file attributes.
- tracing: Resolves an issue where Django DB instrumentation could fail.
- openai: Fixes a bug where `asyncio.TimeoutError`s were not being propagated correctly from canceled OpenAI API requests.
- aiobotocore: Fixes an issue where the `_make_api_call` arguments were not captured correctly when using keyword arguments.
- tracing(django): Resolves a bug where ddtrace was exhausting a Django stream response before returning it to the user.
- LLM Observability: Fixes an issue in the OpenAI integration where integration metrics would still be submitted even if `LLMObs.enable(agentless_enabled=True)` was set.
- internal: Fixes the `Already mutably borrowed` error when the rate limiter is accessed across threads.
- internal: Fixes the `Already mutably borrowed` error by reverting to the pure-Python rate limiter.
- Code Security: Adds null pointer checks when creating new object ids.
- profiling: Fixes an issue where the profiler could erroneously try to load protobuf in auto-injected environments, where it is not available.
- crashtracking: Fixes an issue where crashtracking environment variables for Python were inconsistent with those used by other runtimes.
- profiling: Fixes endpoint profiling for stack v2 when `DD_PROFILING_STACK_V2_ENABLED` is set.
- profiling: Turns on the new native exporter when `DD_PROFILING_TIMELINE_ENABLED=True` is set.
2.11.4
Bug Fixes
- CI Visibility: Resolves an issue where exceptions other than timeouts and connection errors raised while fetching the list of skippable tests for ITR were not being handled correctly and caused the tracer to crash.
- CI Visibility: Fixes a bug where `.git` was incorrectly being stripped from repository URLs when extracting service names, resulting in `g`, `i`, or `t` being removed (e.g. `test-environment.git` incorrectly becoming `test-environmen`).
- LLM Observability: Resolves an issue where custom trace filters were being overwritten in forked processes.
- tracing: Fixes a side-effect issue with module import callbacks that could cause a runtime exception.
- LLM Observability: Resolves an issue where `session_id` was being defaulted to `trace_id`, which was causing unexpected UI behavior.
2.10.7
Bug Fixes
- CI Visibility: Resolves an issue where exceptions other than timeouts and connection errors raised while fetching the list of skippable tests for ITR were not being handled correctly and caused the tracer to crash.
- CI Visibility: Fixes a bug where `.git` was incorrectly being stripped from repository URLs when extracting service names, resulting in `g`, `i`, or `t` being removed (e.g. `test-environment.git` incorrectly becoming `test-environmen`).
- openai: Fixes a bug where `asyncio.TimeoutError`s were not being propagated correctly from canceled OpenAI API requests.
- profiling: Fixes endpoint profiling for stack v2 when `DD_PROFILING_STACK_V2_ENABLED` is set.