Merged
51 changes: 23 additions & 28 deletions docs/platforms/python/integrations/celery/crons.mdx
@@ -10,17 +10,16 @@ Sentry Crons allows you to monitor the uptime and performance of any scheduled,
Use the Celery integration to monitor your [Celery periodic tasks](https://docs.celeryq.dev/en/stable/userguide/periodic-tasks.html) and get notified when a task is missed (or doesn't start when expected), if it fails due to a problem in the runtime (such as an error), or if it fails by exceeding its maximum runtime.

<Note>
Please note that monitors will only be created on the task's first run.
Please note that a cron monitor will only be created the first time your task runs.
</Note>

First, set up your Celery beat schedule:
Get started by setting up your Celery beat schedule:

```python
```python {filename:tasks.py}
# tasks.py
from celery import Celery
from celery.schedules import crontab


app = Celery('tasks', broker='...')
app.conf.beat_schedule = {
'set-in-beat-schedule': {
@@ -30,30 +29,28 @@ app.conf.beat_schedule = {
},
}
```

<Note>
Please note that only crontab parseable schedules will be successfully upserted.
Please note that only schedules that can be parsed by crontab will be successfully
updated or inserted.
</Note>
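For illustration, only the first entry below can be expressed as a crontab entry and so would be upserted; a plain-seconds interval has no crontab equivalent. A config sketch against the `app` defined above (hypothetical task names):

```python
from celery.schedules import crontab

app.conf.beat_schedule = {
    "nightly-report": {
        "task": "tasks.report",
        # crontab-parseable: can be upserted as a cron monitor
        "schedule": crontab(hour=8, minute=0),
    },
    "fast-poll": {
        "task": "tasks.poll",
        # a plain 30-second interval cannot be expressed in crontab syntax
        "schedule": 30.0,
    },
}
```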

Next, we need to initialize Sentry. Where to do this depends on how you run beat:
Next, initialize Sentry. Where to do this depends on how you run beat:

- If beat is running in your worker process (that is, you're running your worker with the `-B`/`--beat` option), initialize Sentry in either the `celeryd_init` or `beat_init` signal.
- If beat is running in a separate process, you need to initialize Sentry in *both* the `celeryd_init` and `beat_init` signal.
- If beat is running in a separate process, you need to initialize Sentry in _both_ the `celeryd_init` and `beat_init` signal.

Make sure to also set `monitor_beat_tasks=True` in `CeleryIntegration`.


In addition to capturing errors, you can monitor interactions between multiple services or applications by [enabling tracing](/concepts/key-terms/tracing/). You can also collect and analyze performance profiles from real users with [profiling](/product/explore/profiling/).

Select which Sentry features you'd like to install in addition to Error Monitoring to get the corresponding installation and configuration instructions below.

<OnboardingOptionButtons
options={[
'error-monitoring',
'performance',
'profiling',
]}
options={["error-monitoring", "performance", "profiling"]}
/>

```python {"onboardingOptions": {"performance": "12-14", "profiling": "15-18"}}
```python {diff} {filename:tasks.py} {"onboardingOptions": {"performance": "12-14", "profiling": "15-18"}}
# tasks.py
from celery import signals

@@ -72,11 +69,11 @@ def init_sentry(**kwargs):
# of sampled transactions.
# We recommend adjusting this value in production.
profiles_sample_rate=1.0,
integrations=[
CeleryIntegration(
monitor_beat_tasks=True
)
],
+ integrations=[
+ CeleryIntegration(
+ monitor_beat_tasks=True
+ )
+ ],
environment="local.dev.grace",
release="v1.0",
)
@@ -98,17 +95,16 @@ You don't need to create Cron Monitors for your tasks on Sentry.io, we'll do it

You can exclude Celery Beat tasks from being auto-instrumented. To do this, pass a list of the tasks you want to exclude as the `exclude_beat_tasks` option when creating `CeleryIntegration`. The list can contain simple strings with the full task name, as specified in the Celery Beat schedule, or regular expressions to match multiple tasks.


```python
```python {diff}
sentry_sdk.init(
# ...
integrations=[
CeleryIntegration(
monitor_beat_tasks=True,
exclude_beat_tasks=[
"some-task-a",
"payment-check-.*",
]
+ exclude_beat_tasks=[
+ "some-task-a",
+ "payment-check-.*",
+ ]
),
],
)
@@ -128,7 +124,7 @@ Make sure the Sentry `@sentry_sdk.monitor` decorator is below Celery's `@app.task` decorator

</Note>

```python
```python {diff} {filename:tasks.py}
# tasks.py
from celery import Celery, signals

@@ -143,9 +139,8 @@ def init_sentry(**kwargs):
# same as above
)


@app.task
@sentry_sdk.monitor(monitor_slug='<monitor-slug>') # 👈 this is the new line.
+@sentry_sdk.monitor(monitor_slug='<monitor-slug>')
def tell_the_world(msg):
print(msg)
```
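The decorator order called out in the note above matters because Python applies decorators bottom-up: the one closest to the function wraps first. A pure-Python sketch with stand-in decorators (hypothetical `task` and `monitor` functions, not the real Celery or Sentry APIs):

```python
applied = []

def task(func):
    # Stands in for Celery's @app.task: registers whatever it receives.
    applied.append("task")
    return func

def monitor(func):
    # Stands in for @sentry_sdk.monitor: wraps the task body itself.
    applied.append("monitor")
    return func

@task
@monitor
def tell_the_world(msg):
    return msg

# monitor runs first (it is closest to the function), so task
# registers the already-monitored version.
print(applied)  # ['monitor', 'task']
```

With the order flipped, the stand-in `monitor` would wrap the object returned by `task` instead of the task body, which is the situation the note warns against.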
108 changes: 90 additions & 18 deletions docs/platforms/python/integrations/celery/index.mdx
@@ -17,36 +17,108 @@ pip install --upgrade 'sentry-sdk[celery]'

If you have the `celery` package in your dependencies, the Celery integration will be enabled automatically when you initialize the Sentry SDK.

Make sure that the **call to `init` is loaded on worker startup**, and not only in the module where your tasks are defined. Otherwise, the initialization happens too late and events might end up not being reported.
<Alert>
Make sure that the call to `sentry_sdk.init()` is loaded on worker startup and
not only in the module where your tasks are defined. Otherwise, the
initialization may happen too late and events might not get reported.
</Alert>

### Set up Celery Without Django

<PlatformContent includePath="getting-started-config" />
When using Celery without Django, you'll need to initialize the Sentry SDK in both your application and the Celery worker processes spawned by the Celery daemon.

### Standalone Setup
In addition to capturing errors, you can use Sentry for [distributed tracing](/concepts/key-terms/tracing/) and [profiling](/product/explore/profiling/). Select what you'd like to install to get the corresponding installation and configuration instructions below.

If you're using Celery standalone, there are two ways to set this up:
#### Set up Sentry in the Celery Daemon or Worker Processes

- Initializing the SDK in the configuration file loaded with Celery's `--config` parameter
- Initializing the SDK by hooking it to either the [`celeryd_init`](https://docs.celeryq.dev/en/stable/userguide/signals.html?#celeryd-init) or [`worker_init`](https://docs.celeryq.dev/en/stable/userguide/signals.html?#worker-init) signals
<OnboardingOptionButtons
options={["error-monitoring", "performance", "profiling"]}
/>

```python
import sentry_sdk
from celery import Celery, signals
```python {filename:tasks.py} {"onboardingOptions": {"performance": "12-14", "profiling": "15-18"}}
from celery import Celery, signals
import sentry_sdk

# Initializing Celery
app = Celery("tasks", broker="...")

# Initialize Sentry SDK on Celery startup
@signals.celeryd_init.connect
def init_sentry(**_kwargs):
sentry_sdk.init(
dsn="___PUBLIC_DSN___",
# Set traces_sample_rate to 1.0 to capture 100%
# of transactions for tracing.
traces_sample_rate=1.0,
# Set profiles_sample_rate to 1.0 to profile 100%
# of sampled transactions.
# We recommend adjusting this value in production.
profiles_sample_rate=1.0,
)

# Task definitions go here
@app.task
def add(x, y):
return x + y
```

app = Celery("myapp")
The [`celeryd_init`](https://docs.celeryq.dev/en/stable/userguide/signals.html?#celeryd-init) signal is triggered when the Celery daemon starts, before the worker processes are spawned. If you need to initialize Sentry for each individual worker process, use the [`worker_init`](https://docs.celeryq.dev/en/stable/userguide/signals.html?#worker-init) signal instead.

#@signals.worker_init.connect
@signals.celeryd_init.connect
def init_sentry(**_kwargs):
sentry_sdk.init(...) # same as above
```
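A sketch of the `worker_init` variant (same setup as above; the handler body is the `sentry_sdk.init()` call shown earlier):

```python
from celery import Celery, signals
import sentry_sdk

app = Celery("tasks", broker="...")

# worker_init fires in each worker process after it is spawned,
# so every worker initializes its own SDK instance.
@signals.worker_init.connect
def init_sentry(**_kwargs):
    sentry_sdk.init(dsn="___PUBLIC_DSN___")
```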
#### Set up Sentry in Your Application

### Setup With Django
<OnboardingOptionButtons
options={["error-monitoring", "performance", "profiling"]}
/>

```python {filename:main.py} {"onboardingOptions": {"performance": "8-10", "profiling": "11-14"}}
from tasks import add
import sentry_sdk

def main():
# Initializing Sentry SDK in our process
sentry_sdk.init(
dsn="___PUBLIC_DSN___",
# Set traces_sample_rate to 1.0 to capture 100%
# of transactions for tracing.
traces_sample_rate=1.0,
# Set profiles_sample_rate to 1.0 to profile 100%
# of sampled transactions.
# We recommend adjusting this value in production.
profiles_sample_rate=1.0,
)

# Enqueueing a task to be processed by Celery
with sentry_sdk.start_transaction(name="calling-a-celery-task"):
result = add.delay(4, 4)

if __name__ == "__main__":
main()
```
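If you also configure a result backend (an assumption; the snippets above only set a broker), you can wait for the enqueued task's outcome:

```python
from tasks import add  # the Celery app and task defined above

result = add.delay(4, 4)
# .get() blocks until a worker finishes the task; it requires a result
# backend (e.g. Celery("tasks", broker="...", backend="...")).
print(result.get(timeout=10))
```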

If you're using Celery with Django in a conventional setup, have already initialized the SDK in [your `settings.py` file](/platforms/python/integrations/django/#configure), and have Celery using the same settings with [`config_from_object`](https://docs.celeryq.dev/en/stable/django/first-steps-with-django.html), you don't need to initialize the SDK separately for Celery.
### Set up Celery With Django

If you're using Celery with Django in a typical setup, have initialized the SDK in your `settings.py` file (as described in the [Django integration documentation](/platforms/python/integrations/django/#configure)), and have configured Celery to use the same settings via [`config_from_object`](https://docs.celeryq.dev/en/stable/django/first-steps-with-django.html), there's no need to initialize the Sentry SDK separately for Celery.
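In that setup, the single `sentry_sdk.init()` in `settings.py` covers both Django and the Celery workers. A minimal sketch of the conventional wiring (hypothetical project name `myproject`):

```python
# myproject/celery.py
import os
from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")

app = Celery("myproject")
# Reads CELERY_-prefixed settings from settings.py -- the same module
# where sentry_sdk.init() already runs.
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()
```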

## Verify

To verify if your SDK is initialized on worker start, you can pass `debug=True` to `sentry_sdk.init()` to see extra output when the SDK is initialized. If the output appears during worker startup and not only after a task has started, then it's working properly.
To confirm that your SDK is initialized on worker start, pass `debug=True` to `sentry_sdk.init()`. This will add extra output to your Celery logs when the SDK is initialized. If you see the output during worker startup, and not just after a task has started, then it's working correctly.

The snippet below includes an intentional `ZeroDivisionError` in the Celery task that will be captured by Sentry. To trigger the error, call `debug_sentry.delay()`:

```python {filename:tasks.py} {"onboardingOptions": {"performance": "12-14", "profiling": "15-18"}}
from celery import Celery, signals
import sentry_sdk

app = Celery("tasks", broker="...")

@signals.celeryd_init.connect
def init_sentry(**_kwargs):
sentry_sdk.init(...) # same as above

@app.task
def debug_sentry():
1/0
```

<Alert level="info" title="Note on distributed tracing">
