Observability

Temp edited this page Dec 5, 2025 · 1 revision

External Observability Integration

D1 Manager supports integration with external observability platforms via Cloudflare's native OpenTelemetry (OTel) export. This allows you to send traces and logs to services like Grafana Cloud, Datadog, Honeycomb, Sentry, and Axiom.

Overview

There are two approaches to observability in D1 Manager:

  1. Cloudflare Native OpenTelemetry - Export traces and logs directly from Cloudflare Workers to OTLP-compatible endpoints
  2. Application Webhooks - Send event notifications to HTTP endpoints (see Webhooks)

OpenTelemetry Export (Recommended)

Cloudflare Workers natively supports exporting OpenTelemetry-compliant traces and logs to any OTLP endpoint.

Step 1: Create a Destination

  1. Go to Workers Observability in the Cloudflare Dashboard
  2. Click Add destination
  3. Configure your provider's OTLP endpoint and authentication headers

Step 2: Update wrangler.toml

Add observability configuration to your wrangler.toml:

[observability]
enabled = true

[observability.traces]
enabled = true
destinations = ["your-traces-destination"]

[observability.logs]
enabled = true
destinations = ["your-logs-destination"]

Step 3: Redeploy

npm run build
npx wrangler deploy

Provider-Specific Setup

Grafana Cloud

OTLP Endpoints:

  • Traces: https://otlp-gateway-{region}.grafana.net/otlp/v1/traces
  • Logs: https://otlp-gateway-{region}.grafana.net/otlp/v1/logs

Authentication:

  • Header: Authorization: Basic <base64(instanceId:apiKey)>

Setup:

  1. Get your Grafana Cloud instance ID and API key
  2. Create a destination in Cloudflare Dashboard with the endpoint and auth header
  3. Enable observability in wrangler.toml
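The Basic auth value is the base64 encoding of instanceId:apiKey. A minimal sketch for generating the header value (the instance ID and key below are placeholders, not real credentials):

```typescript
// Build the Authorization header value for Grafana Cloud's OTLP gateway.
// Inputs are placeholders -- substitute your own instance ID and API key.
function grafanaBasicAuth(instanceId: string, apiKey: string): string {
  const encoded = Buffer.from(`${instanceId}:${apiKey}`).toString("base64");
  return `Basic ${encoded}`;
}

console.log(grafanaBasicAuth("123456", "glc_example_key"));
```

Paste the resulting value into the destination's Authorization header in the Cloudflare Dashboard.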

Sentry

OTLP Endpoints:

  • Traces: https://{HOST}/api/{PROJECT_ID}/integration/otlp/v1/traces
  • Logs: https://{HOST}/api/{PROJECT_ID}/integration/otlp/v1/logs

Authentication:

  • Header: X-Sentry-DSN: <your-dsn>

Setup:

  1. Get your Sentry project ID and DSN
  2. Create a destination in Cloudflare Dashboard
  3. Enable observability in wrangler.toml

Datadog

OTLP Endpoints:

  • Traces: Coming soon via OTLP
  • Logs: https://otlp.{SITE}.datadoghq.com/v1/logs

Authentication:

  • Header: DD-API-KEY: <your-api-key>

Note: Datadog traces via OTLP are not yet available. Use Datadog's native integration or logs only for now.

Honeycomb

OTLP Endpoints:

  • Traces: https://api.honeycomb.io/v1/traces
  • Logs: https://api.honeycomb.io/v1/logs

Authentication:

  • Header: X-Honeycomb-Team: <your-api-key>

Axiom

OTLP Endpoints:

  • Traces: https://api.axiom.co/v1/traces
  • Logs: https://api.axiom.co/v1/logs

Authentication:

  • Header: Authorization: Bearer <your-api-key>

Alternative Options

Workers Analytics Engine

For custom metrics and usage-based analytics, use Workers Analytics Engine:

Step 1: Add binding to wrangler.toml

[[analytics_engine_datasets]]
binding = "D1_METRICS"
dataset = "d1_manager_metrics"

Step 2: Query via SQL API or Grafana

SELECT
  blob1 AS database_id,
  SUM(double1) AS total_export_bytes,
  COUNT(*) AS operation_count
FROM d1_manager_metrics
WHERE timestamp > NOW() - INTERVAL '7' DAY
GROUP BY blob1
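Writing the data points queried above happens inside the Worker through the binding's writeDataPoint() method; the positional field names mirror the query (the first blob becomes blob1, the first double becomes double1). A sketch, with the dataset shape and the recordExport helper assumed from the query above rather than taken from D1 Manager's actual code:

```typescript
// Minimal shape of a Workers Analytics Engine binding (assumed for this sketch).
interface AnalyticsEngineDataset {
  writeDataPoint(event: {
    blobs?: string[];
    doubles?: number[];
    indexes?: string[];
  }): void;
}

// Record one export operation: blob1 = database ID, double1 = bytes exported.
function recordExport(
  metrics: AnalyticsEngineDataset,
  databaseId: string,
  bytes: number,
): void {
  metrics.writeDataPoint({
    blobs: [databaseId],   // queried as blob1
    doubles: [bytes],      // queried as double1
    indexes: [databaseId], // index/sampling key
  });
}
```

In a Worker handler you would call recordExport(env.D1_METRICS, id, size) using the binding declared above.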

Tail Workers

For real-time log processing, create a Tail Worker:

export default {
  // Tail handler: receives batches of trace events from the producer Worker.
  async tail(events: TraceItem[]) {
    // Forward events in parallel rather than awaiting each request in turn
    await Promise.all(
      events.map((event) =>
        fetch('https://your-logging-service.com/ingest', {
          method: 'POST',
          headers: { 'Content-Type': 'application/json' },
          body: JSON.stringify(event),
        }),
      ),
    );
  },
};

Configure in wrangler.toml:

[[tail_consumers]]
service = "my-tail-worker"

Logpush

For structured log export to storage destinations:

  1. Go to Cloudflare Dashboard > Analytics > Logs
  2. Create a Logpush job for Workers Trace Events
  3. Select your destination (R2, S3, Azure, GCS, etc.)

Structured Error Logging

D1 Manager includes a centralized error logging system that produces structured logs:

[ERROR] [databases] [DB_CREATE_FAILED] Failed to create database (db: abc-123)

Log fields include:

  • timestamp: ISO timestamp
  • level: error, warning, or info
  • code: Module-prefixed error code (e.g., DB_CREATE_FAILED, TBL_DELETE_FAILED)
  • message: Human-readable error message
  • context: Module, operation, database ID, user ID, metadata

These structured logs are automatically exported when OpenTelemetry is enabled.
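As an illustration, the log line shown above can be produced by a formatter along these lines (a sketch of the format, not D1 Manager's actual implementation):

```typescript
type Level = "error" | "warning" | "info";

interface LogContext {
  module: string; // e.g. "databases"
  dbId?: string;  // optional database ID appended as "(db: ...)"
}

// Format a structured log line: [LEVEL] [module] [CODE] message (db: id)
function formatLog(
  level: Level,
  code: string,
  message: string,
  ctx: LogContext,
): string {
  const suffix = ctx.dbId ? ` (db: ${ctx.dbId})` : "";
  return `[${level.toUpperCase()}] [${ctx.module}] [${code}] ${message}${suffix}`;
}

console.log(
  formatLog("error", "DB_CREATE_FAILED", "Failed to create database", {
    module: "databases",
    dbId: "abc-123",
  }),
);
// → [ERROR] [databases] [DB_CREATE_FAILED] Failed to create database (db: abc-123)
```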

Best Practices

  1. Start with logs - Enable log export first, then add traces as needed
  2. Use sampling - For high-traffic deployments, configure trace sampling
  3. Set retention - Configure appropriate data retention in your observability platform
  4. Alert on errors - Set up alerts for job_failed events via webhooks or log queries
  5. Monitor latency - Use traces to identify slow operations
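For item 2, Workers supports head-based sampling in wrangler.toml. A sketch (the 10% rate is an example value; check your provider's and Cloudflare's current documentation for which signals the rate applies to):

```toml
[observability]
enabled = true
# Keep roughly 10% of invocations (head sampling); adjust for your traffic volume.
head_sampling_rate = 0.1
```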

Troubleshooting

No logs appearing

  1. Verify destination is configured correctly in Cloudflare Dashboard
  2. Check that destination name matches wrangler.toml
  3. Redeploy after configuration changes
  4. Check Cloudflare Dashboard for delivery status

Authentication errors

  1. Verify API key/token is correct
  2. Check header format matches provider requirements
  3. Ensure endpoint URL is correct for your region

Missing traces

  1. Confirm traces are enabled in wrangler.toml
  2. Check trace sampling configuration
  3. Verify your observability platform supports OTLP traces
