
Use context for logging extra #4694


Open · wants to merge 4 commits into `potel-base`
3 changes: 2 additions & 1 deletion MIGRATION_GUIDE.md
@@ -25,7 +25,7 @@ Looking to upgrade from Sentry SDK 2.x to 3.x? Here's a comprehensive list of wh
- The default of `traces_sample_rate` changed to `0`, meaning incoming traces are continued by default. For example, if your frontend sends a `sentry-trace`/`baggage` header pair, your SDK will create spans and send them to Sentry. (The default used to be `None`, meaning that by default no spans were created, no matter what headers the frontend sent to your project.) See also: https://docs.sentry.io/platforms/python/configuration/options/#traces_sample_rate
- `sentry_sdk.start_span` now only takes keyword arguments.
- `sentry_sdk.start_transaction`/`sentry_sdk.start_span` no longer takes the following arguments: `span`, `parent_sampled`, `trace_id`, `span_id` or `parent_span_id`.
- `sentry_sdk.continue_trace` no longer returns a `Transaction` and is now a context manager.

- Use it to continue an upstream trace with the `sentry-trace` and `baggage` headers.

@@ -65,6 +65,7 @@ Looking to upgrade from Sentry SDK 2.x to 3.x? Here's a comprehensive list of wh
- Redis: Redis pipeline spans no longer contain `span["data"]["redis.commands"]` with a dict `{"count": 3, "first_ten": ["cmd1", "cmd2", ...]}`. Instead, there are `span["data"]["redis.commands.count"]` (containing `3`) and `span["data"]["redis.commands.first_ten"]` (containing `["cmd1", "cmd2", ...]`).
- clickhouse-driver: The query is now available under the `db.query.text` span attribute (only if `send_default_pii` is `True`).
- Logging: By default, the SDK won't capture Sentry issues anymore when calling `logging.error()`, `logging.critical()` or `logging.exception()`. If you want to preserve the old behavior use `sentry_sdk.init(integrations=[LoggingIntegration(event_level="ERROR")])`.
- Logging: Logger `extra` is now added to events in `event["contexts"]["logging"]` instead of `event["extra"]`.
- The integration-specific content of the `sampling_context` argument of `traces_sampler` and `profiles_sampler` now looks different.

- The Celery integration doesn't add the `celery_job` dictionary anymore. Instead, the individual keys are now available as:
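The logging change in this file can be illustrated with a plain-Python sketch of where logger `extra` data ends up in the event payload before and after this PR (the event dicts below are simplified assumptions, not the SDK's full event schema):

```python
extra = {"bar": 69}

# 2.x behavior: logger `extra` was attached at the event's top level.
old_event = {"level": "error", "extra": extra}

# 3.x behavior (this PR): `extra` moves under contexts["logging"].
new_event = {"level": "error"}
new_event.setdefault("contexts", {})
new_event["contexts"]["logging"] = extra

print(new_event["contexts"]["logging"])  # → {'bar': 69}
```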
3 changes: 2 additions & 1 deletion sentry_sdk/integrations/logging.py
@@ -279,7 +279,8 @@ def _emit(self, record: LogRecord) -> None:
"params": params,
}

-event["extra"] = self._extra_from_record(record)
+event.setdefault("contexts", {})
+event["contexts"]["logging"] = self._extra_from_record(record)

sentry_sdk.capture_event(event, hint=hint)

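`_extra_from_record` itself is outside this hunk; a minimal stand-in, assuming it simply filters out the attributes every `LogRecord` carries, could look like this (the reserved-set trick is an assumption for illustration, not the SDK's actual implementation):

```python
import logging

# Attributes present on every LogRecord; anything beyond these was
# injected via the `extra=` argument of a logging call.
_RESERVED = set(vars(logging.LogRecord("x", logging.INFO, "x.py", 1, "m", None, None)))

def extra_from_record(record: logging.LogRecord) -> dict:
    """Return only the user-supplied `extra` attributes of a record."""
    return {k: v for k, v in vars(record).items() if k not in _RESERVED}

record = logging.LogRecord("app", logging.ERROR, "app.py", 10, "boom", None, None)
record.bar = 69  # what logger.error("boom", extra={"bar": 69}) would set
```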
4 changes: 2 additions & 2 deletions tests/integrations/logging/test_logging.py
@@ -92,7 +92,7 @@ def test_logging_extra_data(sentry_init, capture_events):
(event,) = events

assert event["level"] == "fatal"
-assert event["extra"] == {"bar": 69}
+assert event["contexts"]["logging"] == {"bar": 69}
assert any(
crumb["message"] == "bread" and crumb["data"] == {"foo": 42}
for crumb in event["breadcrumbs"]["values"]
@@ -110,7 +110,7 @@ def test_logging_extra_data_integer_keys(sentry_init, capture_events):

(event,) = events

-assert event["extra"] == {"1": 1}
+assert event["contexts"]["logging"] == {"1": 1}


@pytest.mark.parametrize(
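The two assertion changes above cover the event path; the breadcrumb path (`crumb["data"]`) is unchanged. A toy router, under the assumption of a five-level severity ladder and simplified payload shapes, shows the split these tests exercise:

```python
LEVELS = ["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"]

def route_log(level: str, message: str, extra: dict, event_level: str = "ERROR") -> dict:
    # At or above event_level: emit an event, with extra under contexts["logging"].
    if LEVELS.index(level) >= LEVELS.index(event_level):
        return {"type": "event", "contexts": {"logging": extra}}
    # Below event_level: emit a breadcrumb, with extra under data.
    return {"type": "breadcrumb", "message": message, "data": extra}

crumb = route_log("INFO", "bread", {"foo": 42})
event = route_log("CRITICAL", "fatal", {"bar": 69})
```

Note `route_log` is a hypothetical helper for illustration only; the real routing lives in the integration's handlers.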
32 changes: 9 additions & 23 deletions tests/test_scrubber.py
@@ -1,7 +1,7 @@
import sys
import logging

-from sentry_sdk import capture_exception, capture_event, start_span
+from sentry_sdk import capture_exception, capture_event, start_span, set_extra
from sentry_sdk.integrations.logging import LoggingIntegration
from sentry_sdk.utils import event_from_exception
from sentry_sdk.scrubber import EventScrubber
@@ -119,38 +119,24 @@ def test_stack_var_scrubbing(sentry_init, capture_events):
}


-def test_breadcrumb_extra_scrubbing(sentry_init, capture_events):
+def test_extra_scrubbing(sentry_init, capture_events):
sentry_init(
max_breadcrumbs=2,
integrations=[LoggingIntegration(event_level="ERROR")],
)
events = capture_events()
logger.info("breadcrumb 1", extra=dict(foo=1, password="secret"))
logger.info("breadcrumb 2", extra=dict(bar=2, auth="secret"))
logger.info("breadcrumb 3", extra=dict(foobar=3, password="secret"))
logger.critical("whoops", extra=dict(bar=69, auth="secret"))

set_extra("bar", 69)
set_extra("auth", "secret")
try:
1 / 0
except ZeroDivisionError as e:
capture_exception(e)
Review comment:

Bug: Test Coverage Loss for Sensitive Data Scrubbing

The test_extra_scrubbing test (formerly test_breadcrumb_extra_scrubbing) lost coverage for sensitive-data scrubbing: it no longer verifies that logging `extra` data is scrubbed, which matters now that this data lands in event["contexts"]["logging"]. The gap could let sensitive values such as passwords or auth tokens leak. The test also dropped coverage for breadcrumb scrubbing and the _meta assertions.

(event,) = events

assert event["extra"]["bar"] == 69
assert event["extra"]["auth"] == "[Filtered]"
assert event["breadcrumbs"]["values"][0]["data"] == {
"bar": 2,
"auth": "[Filtered]",
}
assert event["breadcrumbs"]["values"][1]["data"] == {
"foobar": 3,
"password": "[Filtered]",
}

assert event["_meta"]["extra"]["auth"] == {"": {"rem": [["!config", "s"]]}}
assert event["_meta"]["breadcrumbs"] == {
"": {"len": 3},
"values": {
"0": {"data": {"auth": {"": {"rem": [["!config", "s"]]}}}},
"1": {"data": {"password": {"": {"rem": [["!config", "s"]]}}}},
},
}


def test_span_data_scrubbing(sentry_init, capture_events):
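The review comment above notes that scrubbing coverage should follow the data to its new location. A minimal denylist scrubber, sketched here as an assumption about how matching keys are filtered (the SDK's real `EventScrubber` has a larger denylist and recurses into nested structures), would need to reach into `contexts["logging"]` as well:

```python
DENYLIST = {"password", "auth", "secret"}  # assumed subset of the SDK's denylist

def scrub(data: dict) -> dict:
    """Replace values of denylisted keys with the "[Filtered]" marker."""
    return {k: ("[Filtered]" if k.lower() in DENYLIST else v) for k, v in data.items()}

event = {"contexts": {"logging": {"bar": 69, "auth": "secret"}}}
event["contexts"]["logging"] = scrub(event["contexts"]["logging"])
print(event["contexts"]["logging"])  # → {'bar': 69, 'auth': '[Filtered]'}
```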