## 2.40.0

### Various fixes & improvements
- Add LiteLLM integration (#4864) by @constantinius

  Once you've enabled the new LiteLLM integration, you can use the Sentry AI Agents Monitoring, a Sentry dashboard that helps you understand what's going on with your AI requests:

  ```python
  import sentry_sdk
  from sentry_sdk.integrations.litellm import LiteLLMIntegration

  sentry_sdk.init(
      dsn="<your-dsn>",
      # Set traces_sample_rate to 1.0 to capture 100%
      # of transactions for tracing.
      traces_sample_rate=1.0,
      # Add data like inputs and responses;
      # see https://docs.sentry.io/platforms/python/data-management/data-collected/ for more info
      send_default_pii=True,
      integrations=[
          LiteLLMIntegration(),
      ],
  )
  ```
- Litestar: Copy request info to prevent cookies mutation (#4883) by @alexander-alderman-webb
- Also emit spans for MCP tool calls done by the LLM (#4875) by @constantinius
- Option to not trace HTTP requests based on status codes (#4869) by @alexander-alderman-webb

  You can now disable transactions for incoming requests with specific HTTP status codes. The new `trace_ignore_status_codes` option accepts a `set` of status codes as integers. If a transaction wraps a request that results in one of the provided status codes, the transaction will be unsampled.

  ```python
  import sentry_sdk

  sentry_sdk.init(
      trace_ignore_status_codes={301, 302, 303, *range(305, 400), 404},
  )
  ```
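To illustrate the semantics of the status-code set above, here is a small standalone sketch. The `should_unsample` helper is hypothetical and exists only to demonstrate the membership check; it is not part of the SDK's API.

```python
# Hypothetical helper mirroring the check described above: a transaction
# is unsampled when its response status code is in the configured set.
IGNORED_STATUS_CODES = {301, 302, 303, *range(305, 400), 404}

def should_unsample(status_code: int) -> bool:
    """Return True if a transaction with this status code would be dropped."""
    return status_code in IGNORED_STATUS_CODES

# 307 falls inside range(305, 400); 304 and 200 match no entry in the set.
assert should_unsample(307)
assert not should_unsample(304)
assert not should_unsample(200)
```

Note that `*range(305, 400)` unpacks every code from 305 through 399 into the set literal, so 304 (Not Modified) is deliberately excluded while the surrounding redirect codes are covered.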
- Move `_set_agent_data` call to `ai_client_span` function (#4876) by @constantinius
- Add script to determine lowest supported versions (#4867) by @sentrivana
- Update `CONTRIBUTING.md` (#4870) by @sentrivana