Commit 54f2635: Merge branch 'potel-base' into antonpirker/potel/openai
2 parents: 841dd4e + fdb5cdc
35 files changed, +626 and -194 lines

CHANGELOG.md
Lines changed: 25 additions & 0 deletions

@@ -1,5 +1,30 @@
 # Changelog
+
+## 2.19.2
+
+### Various fixes & improvements
+
+- Deepcopy and ensure get_all function always terminates (#3861) by @cmanallen
+- Cleanup chalice test environment (#3858) by @antonpirker
+
+## 2.19.1
+
+### Various fixes & improvements
+
+- Fix errors when instrumenting Django cache (#3855) by @BYK
+- Copy `scope.client` reference as well (#3857) by @sl0thentr0py
+- Don't give up on Spotlight on 3 errors (#3856) by @BYK
+- Add missing stack frames (#3673) by @antonpirker
+- Fix wrong metadata type in async gRPC interceptor (#3205) by @fdellekart
+- Rename launch darkly hook to match JS SDK (#3743) by @aliu39
+- Script for checking if our instrumented libs are Python 3.13 compatible (#3425) by @antonpirker
+- Improve Ray tests (#3846) by @antonpirker
+- Test with Celery `5.5.0rc3` (#3842) by @sentrivana
+- Fix asyncio testing setup (#3832) by @sl0thentr0py
+- Bump `codecov/codecov-action` from `5.0.2` to `5.0.7` (#3821) by @dependabot
+- Fix CI (#3834) by @sentrivana
+- Use new ClickHouse GH action (#3826) by @antonpirker
+
 ## 2.19.0
 
 ### Various fixes & improvements
MIGRATION_GUIDE.md
Lines changed: 103 additions & 96 deletions

@@ -20,102 +20,109 @@ Looking to upgrade from Sentry SDK 2.x to 3.x? Here's a comprehensive list of wh
 - Redis integration: In Redis pipeline spans there is no `span["data"]["redis.commands"]` that contains a dict `{"count": 3, "first_ten": ["cmd1", "cmd2", ...]}` but instead `span["data"]["redis.commands.count"]` (containing `3`) and `span["data"]["redis.commands.first_ten"]` (containing `["cmd1", "cmd2", ...]`).
 - clickhouse-driver integration: The query is now available under the `db.query.text` span attribute (only if `send_default_pii` is `True`).
 - `sentry_sdk.init` now returns `None` instead of a context manager.
-- The `sampling_context` argument of `traces_sampler` now additionally contains all span attributes known at span start.
-- If you're using the Celery integration, the `sampling_context` argument of `traces_sampler` doesn't contain the `celery_job` dictionary anymore. Instead, the individual keys are now available as:
-
-  | Dictionary keys        | Sampling context key |
-  | ---------------------- | -------------------- |
-  | `celery_job["args"]`   | `celery.job.args`    |
-  | `celery_job["kwargs"]` | `celery.job.kwargs`  |
-  | `celery_job["task"]`   | `celery.job.task`    |
-
-  Note that all of these are serialized, i.e., not the original `args` and `kwargs` but rather OpenTelemetry-friendly span attributes.
-
-- If you're using the AIOHTTP integration, the `sampling_context` argument of `traces_sampler` doesn't contain the `aiohttp_request` object anymore. Instead, some of the individual properties of the request are accessible, if available, as follows:
-
-  | Request property | Sampling context key(s)         |
-  | ---------------- | ------------------------------- |
-  | `path`           | `url.path`                      |
-  | `query_string`   | `url.query`                     |
-  | `method`         | `http.request.method`           |
-  | `host`           | `server.address`, `server.port` |
-  | `scheme`         | `url.scheme`                    |
-  | full URL         | `url.full`                      |
-
-- If you're using the Tornado integration, the `sampling_context` argument of `traces_sampler` doesn't contain the `tornado_request` object anymore. Instead, some of the individual properties of the request are accessible, if available, as follows:
-
-  | Request property | Sampling context key(s)                             |
-  | ---------------- | --------------------------------------------------- |
-  | `path`           | `url.path`                                          |
-  | `query`          | `url.query`                                         |
-  | `protocol`       | `url.scheme`                                        |
-  | `method`         | `http.request.method`                               |
-  | `host`           | `server.address`, `server.port`                     |
-  | `version`        | `network.protocol.name`, `network.protocol.version` |
-  | full URL         | `url.full`                                          |
-
-- If you're using the generic WSGI integration, the `sampling_context` argument of `traces_sampler` doesn't contain the `wsgi_environ` object anymore. Instead, the individual properties of the environment are accessible, if available, as follows:
-
-  | Env property      | Sampling context key(s)                           |
-  | ----------------- | ------------------------------------------------- |
-  | `PATH_INFO`       | `url.path`                                        |
-  | `QUERY_STRING`    | `url.query`                                       |
-  | `REQUEST_METHOD`  | `http.request.method`                             |
-  | `SERVER_NAME`     | `server.address`                                  |
-  | `SERVER_PORT`     | `server.port`                                     |
-  | `SERVER_PROTOCOL` | `server.protocol.name`, `server.protocol.version` |
-  | `wsgi.url_scheme` | `url.scheme`                                      |
-  | full URL          | `url.full`                                        |
-
-- If you're using the generic ASGI integration, the `sampling_context` argument of `traces_sampler` doesn't contain the `asgi_scope` object anymore. Instead, the individual properties of the scope, if available, are accessible as follows:
-
-  | Scope property | Sampling context key(s)         |
-  | -------------- | ------------------------------- |
-  | `type`         | `network.protocol.name`         |
-  | `scheme`       | `url.scheme`                    |
-  | `path`         | `url.path`                      |
-  | `query`        | `url.query`                     |
-  | `http_version` | `network.protocol.version`      |
-  | `method`       | `http.request.method`           |
-  | `server`       | `server.address`, `server.port` |
-  | `client`       | `client.address`, `client.port` |
-  | full URL       | `url.full`                      |
-
-- If you're using the RQ integration, the `sampling_context` argument of `traces_sampler` doesn't contain the `rq_job` object anymore. Instead, the individual properties of the job and the queue, if available, are accessible as follows:
-
-  | RQ property     | Sampling context key(s)      |
-  | --------------- | ---------------------------- |
-  | `rq_job.args`   | `rq.job.args`                |
-  | `rq_job.kwargs` | `rq.job.kwargs`              |
-  | `rq_job.func`   | `rq.job.func`                |
-  | `queue.name`    | `messaging.destination.name` |
-  | `rq_job.id`     | `messaging.message.id`       |
-
-  Note that `rq.job.args`, `rq.job.kwargs`, and `rq.job.func` are serialized and not the actual objects on the job.
-
-- If you're using the AWS Lambda integration, the `sampling_context` argument of `traces_sampler` doesn't contain the `aws_event` and `aws_context` objects anymore. Instead, the following, if available, is accessible:
-
-  | AWS property                                | Sampling context key(s) |
-  | ------------------------------------------- | ----------------------- |
-  | `aws_event["httpMethod"]`                   | `http.request.method`   |
-  | `aws_event["queryStringParameters"]`        | `url.query`             |
-  | `aws_event["path"]`                         | `url.path`              |
-  | full URL                                    | `url.full`              |
-  | `aws_event["headers"]["X-Forwarded-Proto"]` | `network.protocol.name` |
-  | `aws_event["headers"]["Host"]`              | `server.address`        |
-  | `aws_context["function_name"]`              | `faas.name`             |
-
-- If you're using the GCP integration, the `sampling_context` argument of `traces_sampler` doesn't contain the `gcp_env` and `gcp_event` keys anymore. Instead, the following, if available, is accessible:
-
-  | Old sampling context key          | New sampling context key   |
-  | --------------------------------- | -------------------------- |
-  | `gcp_env["function_name"]`        | `faas.name`                |
-  | `gcp_env["function_region"]`      | `faas.region`              |
-  | `gcp_env["function_project"]`     | `gcp.function.project`     |
-  | `gcp_env["function_identity"]`    | `gcp.function.identity`    |
-  | `gcp_env["function_entry_point"]` | `gcp.function.entry_point` |
-  | `gcp_event.method`                | `http.request.method`      |
-  | `gcp_event.query_string`          | `url.query`                |
+- The `sampling_context` argument of `traces_sampler` and `profiles_sampler` now additionally contains all span attributes known at span start.
+- The integration-specific content of the `sampling_context` argument of `traces_sampler` and `profiles_sampler` now looks different.
+- The Celery integration doesn't add the `celery_job` dictionary anymore. Instead, the individual keys are now available as:
+
+  | Dictionary keys        | Sampling context key        | Example                        |
+  | ---------------------- | --------------------------- | ------------------------------ |
+  | `celery_job["args"]`   | `celery.job.args.{index}`   | `celery.job.args.0`            |
+  | `celery_job["kwargs"]` | `celery.job.kwargs.{kwarg}` | `celery.job.kwargs.kwarg_name` |
+  | `celery_job["task"]`   | `celery.job.task`           |                                |
+
+  Note that all of these are serialized, i.e., not the original `args` and `kwargs` but rather OpenTelemetry-friendly span attributes.
+
+- The AIOHTTP integration doesn't add the `aiohttp_request` object anymore. Instead, some of the individual properties of the request are accessible, if available, as follows:
+
+  | Request property  | Sampling context key(s)         |
+  | ----------------- | ------------------------------- |
+  | `path`            | `url.path`                      |
+  | `query_string`    | `url.query`                     |
+  | `method`          | `http.request.method`           |
+  | `host`            | `server.address`, `server.port` |
+  | `scheme`          | `url.scheme`                    |
+  | full URL          | `url.full`                      |
+  | `request.headers` | `http.request.header.{header}`  |
+
+- The Tornado integration doesn't add the `tornado_request` object anymore. Instead, some of the individual properties of the request are accessible, if available, as follows:
+
+  | Request property  | Sampling context key(s)                             |
+  | ----------------- | --------------------------------------------------- |
+  | `path`            | `url.path`                                          |
+  | `query`           | `url.query`                                         |
+  | `protocol`        | `url.scheme`                                        |
+  | `method`          | `http.request.method`                               |
+  | `host`            | `server.address`, `server.port`                     |
+  | `version`         | `network.protocol.name`, `network.protocol.version` |
+  | full URL          | `url.full`                                          |
+  | `request.headers` | `http.request.header.{header}`                      |
+
+- The WSGI integration doesn't add the `wsgi_environ` object anymore. Instead, the individual properties of the environment are accessible, if available, as follows:
+
+  | Env property      | Sampling context key(s)                           |
+  | ----------------- | ------------------------------------------------- |
+  | `PATH_INFO`       | `url.path`                                        |
+  | `QUERY_STRING`    | `url.query`                                       |
+  | `REQUEST_METHOD`  | `http.request.method`                             |
+  | `SERVER_NAME`     | `server.address`                                  |
+  | `SERVER_PORT`     | `server.port`                                     |
+  | `SERVER_PROTOCOL` | `server.protocol.name`, `server.protocol.version` |
+  | `wsgi.url_scheme` | `url.scheme`                                      |
+  | full URL          | `url.full`                                        |
+  | `HTTP_*`          | `http.request.header.{header}`                    |
+
+- The ASGI integration doesn't add the `asgi_scope` object anymore. Instead, the individual properties of the scope, if available, are accessible as follows:
+
+  | Scope property | Sampling context key(s)         |
+  | -------------- | ------------------------------- |
+  | `type`         | `network.protocol.name`         |
+  | `scheme`       | `url.scheme`                    |
+  | `path`         | `url.path`                      |
+  | `query`        | `url.query`                     |
+  | `http_version` | `network.protocol.version`      |
+  | `method`       | `http.request.method`           |
+  | `server`       | `server.address`, `server.port` |
+  | `client`       | `client.address`, `client.port` |
+  | full URL       | `url.full`                      |
+  | `headers`      | `http.request.header.{header}`  |
+
+- The RQ integration doesn't add the `rq_job` object anymore. Instead, the individual properties of the job and the queue, if available, are accessible as follows:
+
+  | RQ property     | Sampling context key         | Example                  |
+  | --------------- | ---------------------------- | ------------------------ |
+  | `rq_job.args`   | `rq.job.args.{index}`        | `rq.job.args.0`          |
+  | `rq_job.kwargs` | `rq.job.kwargs.{kwarg}`      | `rq.job.kwargs.my_kwarg` |
+  | `rq_job.func`   | `rq.job.func`                |                          |
+  | `queue.name`    | `messaging.destination.name` |                          |
+  | `rq_job.id`     | `messaging.message.id`       |                          |
+
+  Note that `rq.job.args`, `rq.job.kwargs`, and `rq.job.func` are serialized and not the actual objects on the job.
+
+- The AWS Lambda integration doesn't add the `aws_event` and `aws_context` objects anymore. Instead, the following, if available, is accessible:
+
+  | AWS property                                | Sampling context key(s)        |
+  | ------------------------------------------- | ------------------------------ |
+  | `aws_event["httpMethod"]`                   | `http.request.method`          |
+  | `aws_event["queryStringParameters"]`        | `url.query`                    |
+  | `aws_event["path"]`                         | `url.path`                     |
+  | full URL                                    | `url.full`                     |
+  | `aws_event["headers"]["X-Forwarded-Proto"]` | `network.protocol.name`        |
+  | `aws_event["headers"]["Host"]`              | `server.address`               |
+  | `aws_context["function_name"]`              | `faas.name`                    |
+  | `aws_event["headers"]`                      | `http.request.header.{header}` |
+
+- The GCP integration doesn't add the `gcp_env` and `gcp_event` keys anymore. Instead, the following, if available, is accessible:
+
+  | Old sampling context key          | New sampling context key       |
+  | --------------------------------- | ------------------------------ |
+  | `gcp_env["function_name"]`        | `faas.name`                    |
+  | `gcp_env["function_region"]`      | `faas.region`                  |
+  | `gcp_env["function_project"]`     | `gcp.function.project`         |
+  | `gcp_env["function_identity"]`    | `gcp.function.identity`        |
+  | `gcp_env["function_entry_point"]` | `gcp.function.entry_point`     |
+  | `gcp_event.method`                | `http.request.method`          |
+  | `gcp_event.query_string`          | `url.query`                    |
+  | `gcp_event.headers`               | `http.request.header.{header}` |
 
 
 ### Removed
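To illustrate the new flat sampling context, here is a hypothetical `traces_sampler` sketch. The attribute keys (`celery.job.task`, `url.path`) come from the tables above; the task name `app.tasks.cleanup` and the `/healthz` path are invented examples, not part of the guide:

```python
def traces_sampler(sampling_context):
    # In SDK 3.x, integration data arrives as flat, OpenTelemetry-style
    # attributes instead of raw objects such as `celery_job` or `wsgi_environ`.
    if sampling_context.get("celery.job.task") == "app.tasks.cleanup":
        return 0.0  # drop traces for this (hypothetical) noisy task
    if sampling_context.get("url.path") == "/healthz":
        return 0.01  # sample health checks lightly
    return 1.0  # keep everything else
```

The same flat keys apply to `profiles_sampler`, so one helper can serve both hooks.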

docs/conf.py
Lines changed: 1 addition & 1 deletion

@@ -31,7 +31,7 @@
 copyright = "2019-{}, Sentry Team and Contributors".format(datetime.now().year)
 author = "Sentry Team and Contributors"
 
-release = "2.19.0"
+release = "2.19.2"
 version = ".".join(release.split(".")[:2])  # The short X.Y version.
 
 
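The `version` line below the bump derives the short version from `release`, so the change propagates automatically; a quick sketch of that computation:

```python
release = "2.19.2"
# Keep only the first two dot-separated components: the short X.Y version.
version = ".".join(release.split(".")[:2])
print(version)  # "2.19"
```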

scripts/ready_yet/main.py (new file)
Lines changed: 124 additions & 0 deletions

import time
import re
import sys

import requests

from collections import defaultdict
from pathlib import Path

from tox.config.cli.parse import get_options
from tox.session.state import State
from tox.config.sets import CoreConfigSet
from tox.config.source.tox_ini import ToxIni

PYTHON_VERSION = "3.13"

MATCH_LIB_SENTRY_REGEX = r"py[\d\.]*-(.*)-.*"

PYPI_PROJECT_URL = "https://pypi.python.org/pypi/{project}/json"
PYPI_VERSION_URL = "https://pypi.python.org/pypi/{project}/{version}/json"


def get_tox_envs(tox_ini_path: Path) -> list:
    tox_ini = ToxIni(tox_ini_path)
    conf = State(get_options(), []).conf
    tox_section = next(tox_ini.sections())
    core_config_set = CoreConfigSet(
        conf, tox_section, tox_ini_path.parent, tox_ini_path
    )
    core_config_set.loaders.extend(
        tox_ini.get_loaders(
            tox_section,
            base=[],
            override_map=defaultdict(list, {}),
            conf=core_config_set,
        )
    )
    return core_config_set.load("env_list")


def get_libs(tox_ini: Path, regex: str) -> list:
    libs = set()
    for env in get_tox_envs(tox_ini):
        match = re.match(regex, env)
        if match:
            libs.add(match.group(1))

    return sorted(libs)


def main():
    """
    Check if libraries in our tox.ini are ready for the Python version defined in `PYTHON_VERSION`.
    """
    print(f"Checking libs from tox.ini for Python {PYTHON_VERSION} compatibility:")

    ready = set()
    not_ready = set()
    not_found = set()

    tox_ini = Path(__file__).parent.parent.parent.joinpath("tox.ini")

    libs = get_libs(tox_ini, MATCH_LIB_SENTRY_REGEX)

    for lib in libs:
        print(".", end="")
        sys.stdout.flush()

        # Get the latest version of the lib
        url = PYPI_PROJECT_URL.format(project=lib)
        pypi_data = requests.get(url)

        if pypi_data.status_code != 200:
            not_found.add(lib)
            continue

        latest_version = pypi_data.json()["info"]["version"]

        # Get the supported Python versions of the latest version of the lib
        url = PYPI_VERSION_URL.format(project=lib, version=latest_version)
        pypi_data = requests.get(url)

        if pypi_data.status_code != 200:
            continue

        classifiers = pypi_data.json()["info"]["classifiers"]

        if f"Programming Language :: Python :: {PYTHON_VERSION}" in classifiers:
            ready.add(lib)
        else:
            not_ready.add(lib)

        # Cut PyPI some slack
        time.sleep(0.1)

    # Print report
    print("\n")
    print(f"\nReady for Python {PYTHON_VERSION}:")
    if len(ready) == 0:
        print("- None")

    for x in sorted(ready):
        print(f"- {x}")

    print(f"\nNOT ready for Python {PYTHON_VERSION}:")
    if len(not_ready) == 0:
        print("- None")

    for x in sorted(not_ready):
        print(f"- {x}")

    print("\nNot found on PyPI:")
    if len(not_found) == 0:
        print("- None")

    for x in sorted(not_found):
        print(f"- {x}")


if __name__ == "__main__":
    main()
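The `MATCH_LIB_SENTRY_REGEX` pattern in the script extracts the library name from tox environment names shaped like `py<python>-<lib>-<libversion>`; a quick sketch of how it behaves (the environment names below are invented examples, not taken from the actual tox.ini):

```python
import re

MATCH_LIB_SENTRY_REGEX = r"py[\d\.]*-(.*)-.*"

# Invented env names in the py<python>-<lib>-<libversion> shape;
# names without that shape simply don't match.
for env in ["py3.12-django-v5.0", "py3.13-celery-v5.5.0rc3", "common"]:
    match = re.match(MATCH_LIB_SENTRY_REGEX, env)
    print(env, "->", match.group(1) if match else None)
```

Because `(.*)` is greedy, the library name runs up to the last `-` in the env name, which is what separates it from the trailing version token.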

scripts/ready_yet/requirements.txt (new file)
Lines changed: 2 additions & 0 deletions

requests
tox

scripts/ready_yet/run.sh (new file)
Lines changed: 16 additions & 0 deletions

#!/usr/bin/env bash

# Print each command and exit on first error
set -xe

reset

# Create and activate virtual environment
python -m venv .venv
source .venv/bin/activate

# Install (or update) requirements
python -m pip install -r requirements.txt

# Run the script
python main.py
