39 changes: 39 additions & 0 deletions src/content/changelog/pipelines/2025-09-25-pipelines-sql.mdx
@@ -0,0 +1,39 @@
---
title: Pipelines now supports SQL transformations and Apache Iceberg
description: Transform streaming data with SQL and write to Apache Iceberg tables in R2
date: 2025-09-25T13:00:00
products:
- pipelines
hidden: true
---

import { LinkCard } from "~/components";

Today, we're launching the new [Cloudflare Pipelines](/pipelines/): a streaming data platform that ingests events, transforms them with [SQL](/pipelines/sql-reference/select-statements/), and writes to [R2](/r2/) as [Apache Iceberg](https://iceberg.apache.org/) tables or Parquet files.

Pipelines can receive events via [HTTP endpoints](/pipelines/streams/writing-to-streams/#send-via-http) or [Worker bindings](/pipelines/streams/writing-to-streams/#send-via-workers), transform them with SQL, and deliver to R2 with exactly-once guarantees. This makes it easy to build analytics-ready warehouses for server logs, mobile application events, IoT telemetry, or clickstream data without managing streaming infrastructure.
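
As an illustration, events can be posted to a stream's HTTP endpoint as JSON. The following is a minimal sketch, assuming a placeholder endpoint URL (use the HTTP endpoint shown for your stream when you create it) and that the endpoint accepts a JSON array of event objects; the field names mirror the query below:

```bash
# Placeholder URL -- substitute the HTTP endpoint reported for your stream
curl -X POST "https://<your-stream-http-endpoint>" \
  -H "Content-Type: application/json" \
  -d '[{
        "user_id": "u_123",
        "event": "page_view",
        "ts_us": 1758805200000000,
        "url": "https://example.com/docs",
        "referrer": "https://news.example.org/",
        "user_agent": "Mozilla/5.0"
      }]'
```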

For example, here's a pipeline that ingests clickstream events and filters out bot traffic while extracting domain information:

```sql
INSERT INTO events_table
SELECT
user_id,
lower(event) AS event_type,
to_timestamp_micros(ts_us) AS event_time,
regexp_match(url, '^https?://([^/]+)')[1] AS domain,
url,
referrer,
user_agent
FROM events_json
WHERE event = 'page_view'
AND NOT regexp_like(user_agent, '(?i)bot|spider');
```

Get started by creating a pipeline in the dashboard or running a single command in [Wrangler](/workers/wrangler/):

```bash
npx wrangler pipelines setup
```

Check out our [getting started guide](/pipelines/getting-started/) to learn how to create a pipeline that delivers events to an [Iceberg table](/r2/data-catalog/) you can query with R2 SQL. Read more about today's announcement in our [blog post](https://blog.cloudflare.com/cloudflare-data-platform).
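
Once data lands in the Iceberg table, you can query it with R2 SQL. The following is a minimal sketch, assuming a hypothetical warehouse name and namespace and the `events_table` created above; check the R2 SQL documentation for the exact command and supported syntax in your Wrangler version:

```bash
# Warehouse name and namespace are illustrative placeholders
npx wrangler r2 sql query "my-account_my-catalog" \
  "SELECT event_type, domain, event_time FROM default.events_table WHERE event_type = 'page_view' LIMIT 10"
```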
2 changes: 1 addition & 1 deletion src/content/dash-routes/index.json
@@ -261,7 +261,7 @@
   },
   {
     "name": "Pipelines",
-    "deeplink": "/?to=/:account/workers/pipelines",
+    "deeplink": "/?to=/:account/pipelines",
     "parent": ["Storage & Databases"]
   },
   {
8 changes: 0 additions & 8 deletions src/content/docs/pipelines/build-with-pipelines/index.mdx

This file was deleted.

103 changes: 0 additions & 103 deletions src/content/docs/pipelines/build-with-pipelines/output-settings.mdx

This file was deleted.

56 changes: 0 additions & 56 deletions src/content/docs/pipelines/build-with-pipelines/shards.mdx

This file was deleted.

This file was deleted.

This file was deleted.
