Commit 215e5b4

Add new Pipelines documentation + Wrangler docs (#25319)
* Added new Pipelines documentation + Wrangler docs
* Add limits to pipelines docs
* Updated title for pipelines overview page
* Address PR feedback, made resource pages more concise, other minor changes
* Update roll interval defaults
* Added pipelines changelog
* Fixed typo in pipelines getting started
1 parent f4fcfd3 commit 215e5b4


52 files changed (+5261 −1688 lines)
Lines changed: 39 additions & 0 deletions
@@ -0,0 +1,39 @@
---
title: Pipelines now supports SQL transformations and Apache Iceberg
description: Transform streaming data with SQL and write to Apache Iceberg tables in R2
date: 2025-09-25T13:00:00
products:
  - pipelines
hidden: true
---

import { LinkCard } from "~/components";

Today, we're launching the new [Cloudflare Pipelines](/pipelines/): a streaming data platform that ingests events, transforms them with [SQL](/pipelines/sql-reference/select-statements/), and writes to [R2](/r2/) as [Apache Iceberg](https://iceberg.apache.org/) tables or Parquet files.
Pipelines can receive events via [HTTP endpoints](/pipelines/streams/writing-to-streams/#send-via-http) or [Worker bindings](/pipelines/streams/writing-to-streams/#send-via-workers), transform them with SQL, and deliver to R2 with exactly-once guarantees. This makes it easy to build analytics-ready warehouses for server logs, mobile application events, IoT telemetry, or clickstream data without managing streaming infrastructure.
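To make the ingest path concrete, here is a minimal Python sketch of posting one clickstream event to a stream's HTTP endpoint. The endpoint URL and payload fields are illustrative assumptions, not the real API: actual ingest URLs are provided when you create a stream, and the field names here simply mirror the columns the SQL example in this post reads.

```python
import json
import time
from urllib import request

# Hypothetical ingest endpoint -- real URLs are shown when you create a stream.
ENDPOINT = "https://example.pipelines.cloudflare.com"

def make_event(user_id: str, event: str, url: str,
               referrer: str = "", user_agent: str = "") -> dict:
    """Build a clickstream event with the fields the SQL example reads."""
    return {
        "user_id": user_id,
        "event": event,
        "ts_us": int(time.time() * 1_000_000),  # microsecond timestamp
        "url": url,
        "referrer": referrer,
        "user_agent": user_agent,
    }

def post_event(evt: dict):
    """POST a single JSON event to the stream's HTTP endpoint (network call)."""
    req = request.Request(
        ENDPOINT,
        data=json.dumps(evt).encode(),
        headers={"Content-Type": "application/json"},
    )
    return request.urlopen(req)

evt = make_event("u123", "page_view", "https://shop.example/cart")
```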
For example, here's a pipeline that ingests clickstream events and filters out bot traffic while extracting domain information:

```sql
INSERT INTO events_table
SELECT
  user_id,
  lower(event) AS event_type,
  to_timestamp_micros(ts_us) AS event_time,
  regexp_match(url, '^https?://([^/]+)')[1] AS domain,
  url,
  referrer,
  user_agent
FROM events_json
WHERE event = 'page_view'
  AND NOT regexp_like(user_agent, '(?i)bot|spider');
```
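The two regular expressions in that query behave roughly like this Python sketch, which approximates SQL `regexp_match(...)[1]` and `regexp_like(...)` with Python's `re` module (an illustration of the matching logic, not the engine Pipelines uses):

```python
import re

def extract_domain(url: str):
    """Approximates regexp_match(url, '^https?://([^/]+)')[1]: the host part."""
    m = re.match(r"^https?://([^/]+)", url)
    return m.group(1) if m else None

def is_bot(user_agent: str) -> bool:
    """Approximates regexp_like(user_agent, '(?i)bot|spider')."""
    return re.search(r"(?i)bot|spider", user_agent) is not None

print(extract_domain("https://shop.example/cart?item=42"))  # shop.example
print(is_bot("Googlebot/2.1"), is_bot("Mozilla/5.0"))       # True False
```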
Get started by creating a pipeline in the dashboard or running a single command in [Wrangler](/workers/wrangler/):

```bash
npx wrangler pipelines setup
```

Check out our [getting started guide](/pipelines/getting-started/) to learn how to create a pipeline that delivers events to an [Iceberg table](/r2/data-catalog/) you can query with R2 SQL. Read more about today's announcement in our [blog post](https://blog.cloudflare.com/cloudflare-data-platform).

src/content/dash-routes/index.json

Lines changed: 1 addition & 1 deletion
@@ -261,7 +261,7 @@
```diff
   },
   {
     "name": "Pipelines",
-    "deeplink": "/?to=/:account/workers/pipelines",
+    "deeplink": "/?to=/:account/pipelines",
     "parent": ["Storage & Databases"]
   },
   {
```

src/content/docs/pipelines/build-with-pipelines/index.mdx

Lines changed: 0 additions & 8 deletions
This file was deleted.

src/content/docs/pipelines/build-with-pipelines/output-settings.mdx

Lines changed: 0 additions & 103 deletions
This file was deleted.

src/content/docs/pipelines/build-with-pipelines/shards.mdx

Lines changed: 0 additions & 56 deletions
This file was deleted.

src/content/docs/pipelines/build-with-pipelines/sources/http.mdx

Lines changed: 0 additions & 90 deletions
This file was deleted.

src/content/docs/pipelines/build-with-pipelines/sources/index.mdx

Lines changed: 0 additions & 27 deletions
This file was deleted.

0 commit comments
