25 changes: 10 additions & 15 deletions docs/platforms/python/integrations/celery/crons.mdx
Sentry Crons allows you to monitor the uptime and performance of any scheduled, recurring job.
Use the Celery integration to monitor your [Celery periodic tasks](https://docs.celeryq.dev/en/stable/userguide/periodic-tasks.html) and get notified when a task is missed (or doesn't start when expected), if it fails due to a problem in the runtime (such as an error), or if it fails by exceeding its maximum runtime.

<Note>
Monitors will only be created on the task's first run.
</Note>

First, set up your Celery beat schedule:

```python {filename:tasks.py}
# tasks.py
from celery import Celery
from celery.schedules import crontab


app = Celery('tasks', broker='...')
app.conf.beat_schedule = {
    'set-in-beat-schedule': {
        # ...
    },
}
```

<Note>
Only crontab-parseable schedules will be successfully upserted.
</Note>
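For example, in a beat schedule like the following sketch (entry names are illustrative), only the crontab-based entry can be upserted as a monitor; the `timedelta`-based entry still runs, but isn't synced to Sentry:

```python
from datetime import timedelta

from celery.schedules import crontab

beat_schedule = {
    # crontab schedule: parseable, so a monitor is upserted
    'monitored-task': {
        'task': 'tasks.tell_the_world',
        'schedule': crontab(hour='10', minute='15'),
    },
    # timedelta schedule: not crontab-parseable, so no monitor
    'unmonitored-task': {
        'task': 'tasks.tell_the_world',
        'schedule': timedelta(minutes=5),
    },
}
```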

Next, we need to initialize Sentry. Where to do this depends on how you run beat:

- If beat is running in your worker process (that is, you're running your worker with the `-B`/`--beat` option), initialize Sentry in either the `celeryd_init` or `beat_init` signal.
- If beat is running in a separate process, you need to initialize Sentry in _both_ the `celeryd_init` and `beat_init` signals.

Make sure to also set `monitor_beat_tasks=True` in `CeleryIntegration`.
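For the separate-beat-process case, a minimal sketch (assuming the same `sentry_sdk.init()` options everywhere; `monitor_beat_tasks=True` enables the beat instrumentation):

```python
import sentry_sdk
from celery import signals
from sentry_sdk.integrations.celery import CeleryIntegration

def init_sentry(**_kwargs):
    sentry_sdk.init(
        dsn="___PUBLIC_DSN___",
        integrations=[CeleryIntegration(monitor_beat_tasks=True)],
    )

# beat_init fires in the beat process, celeryd_init in the worker daemon,
# so connecting both covers beat running separately from the workers.
signals.beat_init.connect(init_sentry)
signals.celeryd_init.connect(init_sentry)
```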


In addition to capturing errors, you can monitor interactions between multiple services or applications by [enabling tracing](/concepts/key-terms/tracing/). You can also collect and analyze performance profiles from real users with [profiling](/product/explore/profiling/).

Select which Sentry features you'd like to install in addition to Error Monitoring to get the corresponding installation and configuration instructions below.

<OnboardingOptionButtons
  options={["error-monitoring", "performance", "profiling"]}
/>

```python {filename:tasks.py} {"onboardingOptions": {"performance": "12-14", "profiling": "15-18"}}
# tasks.py
from celery import signals

# ...
```

You don't need to create Cron Monitors for your tasks on Sentry.io; we'll do it for you.

You can exclude Celery Beat tasks from being auto-instrumented. To do this, add a list of tasks you want to exclude as option `exclude_beat_tasks` when creating `CeleryIntegration`. The list can contain simple strings with the full task name, as specified in the Celery Beat schedule, or regular expressions to match multiple tasks.
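As a rough illustration of how full names and regular expressions can be mixed (hypothetical helper, not the SDK's actual matching code):

```python
import re

# Hypothetical matcher: a task is excluded if any entry equals its
# full task name or matches it as a regular expression.
def is_excluded(task_name, exclude_beat_tasks):
    return any(
        pattern == task_name or re.match(pattern, task_name)
        for pattern in exclude_beat_tasks
    )

excludes = ["some-task-a", "payment-check-.*"]
print(is_excluded("payment-check-us", excludes))  # True (regex match)
print(is_excluded("some-task-b", excludes))       # False
```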


```python
import sentry_sdk
from sentry_sdk.integrations.celery import CeleryIntegration

sentry_sdk.init(
    # ...
    integrations=[
        CeleryIntegration(
            exclude_beat_tasks=[
                # full task names and/or regular expressions
            ],
        ),
    ],
)
```

<Note>

Make sure the Sentry `@sentry_sdk.monitor` decorator is below Celery's `@app.task` decorator.

</Note>

```python {filename:tasks.py}
# tasks.py
from celery import Celery, signals

import sentry_sdk

app = Celery('tasks', broker='...')


@signals.celeryd_init.connect
def init_sentry(**kwargs):
    sentry_sdk.init(
        # same as above
    )


@app.task
@sentry_sdk.monitor(monitor_slug='<monitor-slug>') # 👈 this is the new line.
def tell_the_world(msg):
    ...
```
89 changes: 71 additions & 18 deletions docs/platforms/python/integrations/celery/index.mdx
pip install --upgrade 'sentry-sdk[celery]'

If you have the `celery` package in your dependencies, the Celery integration will be enabled automatically when you initialize the Sentry SDK.

Make sure that the **call to `sentry_sdk.init()` is loaded on worker startup**, and not only in the module where your tasks are defined. Otherwise, the initialization happens too late and events might end up not being reported.

### Setup Celery (Without Django)

To get the most out of Sentry, initialize the Sentry SDK in your Celery worker processes as well as in the application that sends messages to Celery.

In addition to capturing errors, you can monitor interactions between multiple services or applications by [enabling tracing](/concepts/key-terms/tracing/). You can also collect and analyze performance profiles from real users with [profiling](/product/explore/profiling/).

Select which Sentry features you'd like to install in addition to Error Monitoring to get the corresponding installation and configuration instructions below.

#### Setup in Celery

<OnboardingOptionButtons
  options={["error-monitoring", "performance", "profiling"]}
/>

```python {filename:tasks.py} {"onboardingOptions": {"performance": "12-14", "profiling": "15-18"}}
from celery import Celery, signals
import sentry_sdk

# Initializing Celery
app = Celery("tasks", broker="...")

# Initialize Sentry SDK on Celery startup
@signals.celeryd_init.connect
def init_sentry(**_kwargs):
    sentry_sdk.init(
        dsn="___PUBLIC_DSN___",
        # Set traces_sample_rate to 1.0 to capture 100%
        # of transactions for tracing.
        traces_sample_rate=1.0,
        # Set profiles_sample_rate to 1.0 to profile 100%
        # of sampled transactions.
        # We recommend adjusting this value in production.
        profiles_sample_rate=1.0,
    )

# Task definitions go here
@app.task
def add(x, y):
    return x + y
```

The [`celeryd_init`](https://docs.celeryq.dev/en/stable/userguide/signals.html?#celeryd-init) signal is triggered when the Celery daemon is started, before the worker processes are spawned. You can use the [`worker_init`](https://docs.celeryq.dev/en/stable/userguide/signals.html?#worker-init) signal instead if you want to initialize Sentry at the start of each worker process.
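For instance, a sketch switching to per-worker-process initialization (same `sentry_sdk.init()` options as above):

```python
import sentry_sdk
from celery import signals

# worker_init runs at the start of each worker process,
# so every worker gets its own SDK initialization
@signals.worker_init.connect
def init_sentry(**_kwargs):
    sentry_sdk.init(
        dsn="___PUBLIC_DSN___",
    )
```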

#### Setup in Your Application

<OnboardingOptionButtons
options={["error-monitoring", "performance", "profiling"]}
/>

```python {filename:main.py} {"onboardingOptions": {"performance": "8-10", "profiling": "11-14"}}
from tasks import add
import sentry_sdk

def main():
    # Initializing Sentry SDK in our process
    sentry_sdk.init(
        dsn="___PUBLIC_DSN___",
        # Set traces_sample_rate to 1.0 to capture 100%
        # of transactions for tracing.
        traces_sample_rate=1.0,
        # Set profiles_sample_rate to 1.0 to profile 100%
        # of sampled transactions.
        # We recommend adjusting this value in production.
        profiles_sample_rate=1.0,
    )

    # Enqueueing a task to be processed by Celery
    with sentry_sdk.start_transaction(name="calling-a-celery-task"):
        result = add.delay(4, 4)

if __name__ == "__main__":
    main()
```
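With the two files above, you'd typically start a worker and then run the application from a second terminal, along these lines (a sketch; your broker must be running):

```shell
# start a worker that registers the tasks from tasks.py
celery -A tasks worker --loglevel=INFO

# in a second terminal: enqueue the task and send the transaction to Sentry
python main.py
```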

### Setup Celery With Django

If you're using Celery with Django in a conventional setup, have already initialized the SDK in your `settings.py` file as described in the [Django integration documentation](/platforms/python/integrations/django/#configure), and have Celery using the same settings with [`config_from_object`](https://docs.celeryq.dev/en/stable/django/first-steps-with-django.html), you don't need to initialize the SDK separately for Celery.

## Verify

To verify that your SDK is initialized on worker start, you can pass `debug=True` to `sentry_sdk.init()` to see extra output in your Celery logs when the SDK is initialized. If the output appears during worker startup, and not only after a task has started, it's working properly.
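For example (keep `debug` off in production; it only adds diagnostic logging):

```python
import sentry_sdk

sentry_sdk.init(
    dsn="___PUBLIC_DSN___",
    debug=True,  # prints SDK initialization and event submission logs
)
```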

<Alert level="info" title="Note on distributed tracing">

</Alert>