diff --git a/public/__redirects b/public/__redirects
index 82ecccc6e16b91..f3438e8303a0ca 100644
--- a/public/__redirects
+++ b/public/__redirects
@@ -2165,6 +2165,7 @@
/cloudflare-one/policies/browser-isolation/agentless/* /cloudflare-one/policies/browser-isolation/setup/:splat 301
/cloudflare-one/policies/filtering/http-policies/data-loss-prevention/* /cloudflare-one/policies/data-loss-prevention/ 301
/cloudflare-one/policies/data-loss-prevention/configuration-guides/* /cloudflare-one/policies/data-loss-prevention/dlp-policies/common-policies/ 301
+/cloudflare-one/policies/data-loss-prevention/datasets/* /cloudflare-one/policies/data-loss-prevention/detection-entries/:splat 301
# Learning paths
diff --git a/src/content/changelog/dlp/2025-05-12-case-sensitive-cwl.mdx b/src/content/changelog/dlp/2025-05-12-case-sensitive-cwl.mdx
index 96c56ed78ca37b..14aa3e39e94dfa 100644
--- a/src/content/changelog/dlp/2025-05-12-case-sensitive-cwl.mdx
+++ b/src/content/changelog/dlp/2025-05-12-case-sensitive-cwl.mdx
@@ -4,6 +4,6 @@ description: Custom Word Lists can now be configured to enforce case sensitivity
date: 2025-05-12T11:00:00Z
---
-You can now configure [custom word lists](/cloudflare-one/policies/data-loss-prevention/datasets/#custom-wordlist) to enforce case sensitivity. This setting supports flexibility where needed and aims to reduce false positives where letter casing is critical.
+You can now configure [custom word lists](/cloudflare-one/policies/data-loss-prevention/detection-entries/#custom-wordlist) to enforce case sensitivity. This setting supports flexibility where needed and aims to reduce false positives where letter casing is critical.

diff --git a/src/content/docs/cloudflare-one/applications/casb/casb-dlp.mdx b/src/content/docs/cloudflare-one/applications/casb/casb-dlp.mdx
index 65d70c2d84bf6f..35e07829a1fe17 100644
--- a/src/content/docs/cloudflare-one/applications/casb/casb-dlp.mdx
+++ b/src/content/docs/cloudflare-one/applications/casb/casb-dlp.mdx
@@ -19,7 +19,7 @@ You can use [Cloudflare Data Loss Prevention (DLP)](/cloudflare-one/policies/dat
## Configure a DLP profile
-You may either use DLP profiles predefined by Cloudflare, or create your own custom profiles based on regex, predefined detection entries, and DLP datasets.
+You may either use DLP profiles predefined by Cloudflare, or create your own custom profiles based on regex, predefined detection entries, datasets, and document fingerprints.
### Configure a predefined profile
diff --git a/src/content/docs/cloudflare-one/changelog/dlp.mdx b/src/content/docs/cloudflare-one/changelog/dlp.mdx
index 28cfd3bb8b00d2..887048d91c7bc7 100644
--- a/src/content/docs/cloudflare-one/changelog/dlp.mdx
+++ b/src/content/docs/cloudflare-one/changelog/dlp.mdx
@@ -35,7 +35,7 @@ In addition to [logging the payload](/cloudflare-one/policies/data-loss-preventi
**Exact Data Match multi-entry upload support**
-You can now upload files with [multiple columns of data](/cloudflare-one/policies/data-loss-prevention/datasets/#upload-a-new-dataset) as Exact Data Match datasets. DLP can use each column as a separate existing detection entry.
+You can now upload files with [multiple columns of data](/cloudflare-one/policies/data-loss-prevention/detection-entries/#upload-a-new-dataset) as Exact Data Match datasets. DLP can use each column as a separate existing detection entry.
## 2024-05-23
diff --git a/src/content/docs/cloudflare-one/policies/data-loss-prevention/datasets.mdx b/src/content/docs/cloudflare-one/policies/data-loss-prevention/datasets.mdx
deleted file mode 100644
index 790f5b355953ae..00000000000000
--- a/src/content/docs/cloudflare-one/policies/data-loss-prevention/datasets.mdx
+++ /dev/null
@@ -1,96 +0,0 @@
----
-pcx_content_type: concept
-title: DLP datasets
-sidebar:
- order: 4
----
-
-import { Details } from "~/components";
-
-Cloudflare DLP can scan your web traffic and SaaS applications for specific data defined in a custom dataset. Sensitive data can be hashed before reaching Cloudflare and redacted from matches in [payload logs](/cloudflare-one/policies/data-loss-prevention/dlp-policies/logging-options/#log-the-payload-of-matched-rules).
-
-## DLP dataset types
-
-### Exact Data Match
-
-Exact Data Match (EDM) protects sensitive information, such as names, addresses, phone numbers, and credit card numbers.
-
-All data in uploaded EDM datasets is encrypted before reaching Cloudflare. To detect matches, Cloudflare hashes traffic and compares it to hashes from your dataset. Matched data will be redacted in payload logs.
-
-### Custom Wordlist
-
-Custom Wordlist (CWL) protects non-sensitive data, such as intellectual property and SKU numbers. Optionally, CWL can detect case-sensitive data.
-
-Cloudflare stores data from CWL datasets within DLP. Plaintext matches appear in payload logs.
-
-## Use DLP datasets
-
-### Prepare a dataset
-
-#### Formatting
-
-To prepare a dataset for DLP, add your desired data to a single-column spreadsheet. Each line must be at least six characters long. Entries do not require trailing or final commas.
-
-For compatibility, save your file in either `.csv` or `.txt` format with LF (`\n`) newline characters. DLP does not support CRLF (`\r\n`) newline characters. For information on dataset limits, refer to [Account limits](/cloudflare-one/account-limits/#data-loss-prevention-dlp).
-
-#### Column title cells
-
-Column title cells may result in false positives in Custom Wordlist datasets and should be removed.
-
-DLP will detect and use title cells as column names for Exact Data Match datasets. If multiple columns have the same name, DLP will append a number sign (`#`) and number to their names.
-
-:::tip[Update EDM datasets]
-
-To select which Exact Data Match columns to use, you will need to [reupload any EDM datasets](#manage-existing-datasets) added prior to column support.
-
-:::
-
-### Upload a new dataset
-
-
-
-1. In [Zero Trust](https://one.dash.cloudflare.com/), go to **DLP** > **DLP datasets**.
-2. Select **Create new dataset**.
-3. Choose **Exact Data Match**.
-4. Upload your dataset file. Select **Next**.
-5. Review and choose the detected columns you want to include. Select **Next**.
-6. Name your dataset. Optionally, add a description. Select **Next**.
-7. Review the details for your uploaded dataset. Select **Save dataset**.
-
-DLP will encrypt your dataset and save its hash.
-
-
-
-
-
-1. In [Zero Trust](https://one.dash.cloudflare.com/), go to **DLP** > **DLP datasets**.
-2. Select **Create new dataset**.
-3. Choose **Custom Wordlist**.
-4. Name your dataset. Optionally, add a description.
-5. (Optional) In **Settings**, turn on **Enforce case sensitivity** to require matched values to contain exact capitalization.
-6. In **Upload file**, choose your dataset file.
-7. Select **Save**.
-
-DLP will save your dataset in cleartext.
-
-
-
-To use your uploaded dataset, add it as an existing entry to a [custom DLP profile](/cloudflare-one/policies/data-loss-prevention/dlp-profiles/#build-a-custom-profile).
-
-### Manage existing datasets
-
-Uploaded DLP datasets are read-only. To update a dataset, you must upload a new file to replace the original.
-
-1. In [Zero Trust](https://one.dash.cloudflare.com/), go to **DLP** > **DLP datasets**.
-2. Select the dataset you want to update.
-3. Select **Upload dataset** and choose your updated dataset. Select **Next**.
-4. If your select dataset is an Exact Data Match dataset, review and choose the new columns. Select **Next**.
-5. Select **Save dataset**.
-
-Your new dataset will replace the original dataset.
-
-:::caution[Remove existing column entries]
-
-If you want to update an Exact Data Match dataset to remove a column in use as an [existing detection entry](/cloudflare-one/policies/data-loss-prevention/dlp-profiles/#build-a-custom-profile), you must remove the existing entry from any custom DLP profiles using it before updating the dataset.
-
-:::
diff --git a/src/content/docs/cloudflare-one/policies/data-loss-prevention/detection-entries.mdx b/src/content/docs/cloudflare-one/policies/data-loss-prevention/detection-entries.mdx
new file mode 100644
index 00000000000000..f2ce00cd9d149d
--- /dev/null
+++ b/src/content/docs/cloudflare-one/policies/data-loss-prevention/detection-entries.mdx
@@ -0,0 +1,134 @@
+---
+pcx_content_type: concept
+title: Detection entries
+sidebar:
+ order: 4
+---
+
+import { Details } from "~/components";
+
+Cloudflare DLP can scan your web traffic and SaaS applications for specific data defined in custom detection entries. Detection entries define custom data patterns for DLP to detect through [DLP profiles](/cloudflare-one/policies/data-loss-prevention/dlp-profiles/), and include custom [datasets](#datasets) of defined data and [document entries](#documents) fingerprinted from example documents.
+
+You can configure sensitive data to be hashed before reaching Cloudflare and redacted from matches in [payload logs](/cloudflare-one/policies/data-loss-prevention/dlp-policies/logging-options/#log-the-payload-of-matched-rules).
+
+## Datasets
+
+You can create and upload custom datasets to scan for specific matching data.
+
+### Dataset types
+
+#### Exact Data Match
+
+Exact Data Match (EDM) protects sensitive information, such as names, addresses, phone numbers, and credit card numbers.
+
+All data in uploaded EDM datasets is encrypted before reaching Cloudflare. To detect matches, Cloudflare hashes traffic and compares it to hashes from your dataset. Matched data will be redacted in payload logs.
+
+#### Custom Wordlist
+
+Custom Wordlist (CWL) protects non-sensitive data, such as intellectual property and SKU numbers. Optionally, CWL can detect case-sensitive data.
+
+Cloudflare stores data from CWL datasets within DLP. Plaintext matches appear in payload logs.
+
+### Prepare DLP datasets
+
+#### Formatting
+
+To prepare a dataset for DLP, add your desired data to a single-column spreadsheet. Each line must be at least six characters long. Entries do not require trailing or final commas.
+
+For compatibility, save your file in either `.csv` or `.txt` format with LF (`\n`) newline characters. DLP does not support CRLF (`\r\n`) newline characters. For information on dataset limits, refer to [Account limits](/cloudflare-one/account-limits/#data-loss-prevention-dlp).
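The formatting rules above can be applied with a small client-side helper. This is a minimal sketch (not part of Cloudflare's tooling, and the function name is illustrative): it drops entries shorter than six characters and writes LF-only newlines so the file is accepted by DLP.

```python
def prepare_dataset(entries, path):
    """Write entries as a DLP-compatible single-column file.

    Per the formatting rules: one value per line, each at least
    six characters, LF newlines only (CRLF is not supported).
    """
    valid = [e.strip() for e in entries if len(e.strip()) >= 6]
    # newline="\n" forces LF line endings regardless of platform.
    with open(path, "w", newline="\n") as f:
        f.write("\n".join(valid) + "\n")
    return valid
```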
+
+#### Column title cells
+
+Column title cells may result in false positives in Custom Wordlist datasets and should be removed.
+
+DLP will detect and use title cells as column names for Exact Data Match datasets. If multiple columns have the same name, DLP will append a number sign (`#`) and number to their names.
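The duplicate-name handling above can be sketched as follows. DLP's exact renaming scheme is not specified beyond appending `#` and a number, so this is one illustrative scheme, not Cloudflare's implementation:

```python
def dedupe_columns(names):
    """Disambiguate repeated column names by appending '#<n>'."""
    seen = {}
    result = []
    for name in names:
        count = seen.get(name, 0)
        # First occurrence keeps its name; repeats get a numbered suffix.
        result.append(name if count == 0 else f"{name}#{count + 1}")
        seen[name] = count + 1
    return result
```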
+
+:::caution[Update EDM datasets]
+To select which Exact Data Match columns to use, you will need to [reupload any EDM datasets](#manage-existing-datasets) added prior to column support.
+:::
+
+### Upload a new dataset
+
+
+
+1. In [Zero Trust](https://one.dash.cloudflare.com/), go to **DLP** > **Detection entries**.
+2. Go to **Datasets**.
+3. Select **Add a dataset**. In **Exact Data Match (EDM)**, choose **Select**.
+4. Upload your dataset file. Select **Next**.
+5. Review and choose the detected columns you want to include. Select **Next**.
+6. Name your dataset. Optionally, add a description. Select **Next**.
+7. Review the details for your uploaded dataset. Select **Save dataset**.
+
+DLP will encrypt your dataset and save its hash.
+
+
+
+
+
+1. In [Zero Trust](https://one.dash.cloudflare.com/), go to **DLP** > **Detection entries**.
+2. Go to **Datasets**.
+3. Select **Add a dataset**. In **Custom Wordlist (CWL)**, choose **Select**.
+4. Name your dataset. Optionally, add a description.
+5. (Optional) In **Settings**, turn on **Enforce case sensitivity** to require matched values to contain exact capitalization.
+6. In **Upload file**, choose your dataset file.
+7. Select **Save**.
+
+DLP will save your dataset in cleartext.
+
+
+
+The dataset will appear in the list with an **Uploading** status. Once the upload is complete, the status will change to **Complete**. To use your uploaded dataset, add it as an existing entry to a [custom DLP profile](/cloudflare-one/policies/data-loss-prevention/dlp-profiles/#build-a-custom-profile).
+
+### Manage existing datasets
+
+Uploaded DLP datasets are read-only. To update a dataset, you must upload a new file to replace the original.
+
+1. In [Zero Trust](https://one.dash.cloudflare.com/), go to **DLP** > **Detection entries**, then go to **Datasets**.
+2. Select the dataset you want to update.
+3. Select **Upload dataset** and choose your updated dataset. Select **Next**.
+4. If the selected dataset is an Exact Data Match dataset, review and choose the new columns. Select **Next**.
+5. Select **Save dataset**.
+
+Your new dataset will replace the original dataset.
+
+:::caution[Remove existing column entries]
+If you want to update an Exact Data Match dataset to remove a column in use as an [existing detection entry](/cloudflare-one/policies/data-loss-prevention/dlp-profiles/#build-a-custom-profile), you must remove the existing entry from any custom DLP profiles using it before updating the dataset.
+:::
+
+## Documents
+
+You can upload example documents to scan for unstructured data or specific document types common to your organization. DLP will create a unique fingerprint of each document and detect content in your organization's traffic based on how similar that content is to the fingerprinted document.
+
+DLP stores uploaded documents encrypted at rest in a [Cloudflare R2](/r2/) bucket. To upload sensitive data that is only stored in memory, use [Exact Data Match](#exact-data-match).
+
+### Prepare document entries
+
+DLP supports documents in `.docx` and `.txt` format. Documents must be under 10 MB.
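A pre-upload check for these constraints might look like the following sketch (a hypothetical client-side helper, not a Cloudflare API):

```python
import os

MAX_BYTES = 10 * 1024 * 1024  # documents must be under 10 MB

def check_document(path):
    """Validate a file against DLP document entry constraints."""
    ext = os.path.splitext(path)[1].lower()
    if ext not in (".docx", ".txt"):
        raise ValueError(f"unsupported format: {ext}")
    if os.path.getsize(path) >= MAX_BYTES:
        raise ValueError("document must be under 10 MB")
    return True
```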
+
+### Upload a new document entry
+
+To upload a new document entry to DLP:
+
+1. In [Zero Trust](https://one.dash.cloudflare.com/), go to **DLP** > **Detection entries**.
+2. Go to **Documents**.
+3. Select **Add a document entry**.
+4. Name your document. Optionally, add a description.
+5. In **Minimum similarity for matches**, enter a value between 0% and 100%.
+6. In **Upload document**, choose and upload your document file.
+7. Select **Save**.
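Conceptually, the minimum similarity set in step 5 is a threshold on how closely scanned content must resemble the fingerprinted document. Cloudflare's fingerprinting algorithm is not documented; the Jaccard-over-shingles sketch below only illustrates the thresholding idea:

```python
def shingles(text, k=5):
    """Break text into overlapping k-character fragments."""
    return {text[i:i + k] for i in range(len(text) - k + 1)}

def similarity(doc, sample):
    """Jaccard similarity between two texts' shingle sets (0.0 to 1.0)."""
    a, b = shingles(doc), shingles(sample)
    return len(a & b) / len(a | b) if a | b else 0.0

def matches(doc, sample, minimum=0.8):
    # A match is reported when similarity meets the configured minimum.
    return similarity(doc, sample) >= minimum
```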
+
+The document will appear in the list with a **Pending** status. Once the upload is complete, the status will change to **Complete**. If you created a document entry with Terraform, the status will be **No file** until you upload a file.
+
+To use your uploaded document fingerprint, add it as an existing entry to a [custom DLP profile](/cloudflare-one/policies/data-loss-prevention/dlp-profiles/#build-a-custom-profile).
+
+### Manage existing document entries
+
+Uploaded document entries are read-only. To update a document entry, you must upload a new file to replace the original.
+
+1. In [Zero Trust](https://one.dash.cloudflare.com/), go to **DLP** > **Detection entries**.
+2. Choose the document you want to update and select **Edit**.
+3. (Optional) Update your document entry's name and minimum similarity for matches. You can also open the existing uploaded document.
+4. In **Update document entry**, choose and upload your updated document file.
+5. Select **Save**.
+
+Your new document entry will replace the original document entry. If your file upload fails, DLP will still use the original document fingerprint to scan traffic until you delete the entry.
diff --git a/src/content/docs/cloudflare-one/policies/data-loss-prevention/dlp-policies/logging-options.mdx b/src/content/docs/cloudflare-one/policies/data-loss-prevention/dlp-policies/logging-options.mdx
index e4565f837f631b..7613220cb4c77b 100644
--- a/src/content/docs/cloudflare-one/policies/data-loss-prevention/dlp-policies/logging-options.mdx
+++ b/src/content/docs/cloudflare-one/policies/data-loss-prevention/dlp-policies/logging-options.mdx
@@ -67,7 +67,7 @@ Based on your report, DLP's machine learning will adjust its confidence in futur
- All Cloudflare logs are encrypted at rest. Encrypting the payload content adds a second layer of encryption for the matched values that triggered a DLP rule.
- Cloudflare cannot decrypt encrypted payloads, since this operation requires your private key. Cloudflare staff will never ask for the private key.
- DLP will redact all predefined alphanumeric characters in the log. For example, `123-45-6789` will become `XXX-XX-XXXX`.
- - You can define sensitive data with [Exact Data Match (EDM)](/cloudflare-one/policies/data-loss-prevention/datasets/#exact-data-match). EDM match logs will redact your defined strings.
+ - You can define sensitive data with [Exact Data Match (EDM)](/cloudflare-one/policies/data-loss-prevention/detection-entries/#exact-data-match). EDM match logs will redact your defined strings.
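Assuming redaction replaces each alphanumeric character while preserving separators, as in the `123-45-6789` example above, the behavior can be sketched as (an illustration, not Cloudflare's implementation):

```python
import re

def redact_alphanumeric(text):
    # Replace every letter or digit with X, keeping separators intact,
    # e.g. "123-45-6789" becomes "XXX-XX-XXXX".
    return re.sub(r"[A-Za-z0-9]", "X", text)
```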
## Send HTTP requests to Logpush destination
diff --git a/src/content/docs/cloudflare-one/policies/data-loss-prevention/dlp-profiles/index.mdx b/src/content/docs/cloudflare-one/policies/data-loss-prevention/dlp-profiles/index.mdx
index 178a917dcf7440..f0bbf0804eefd1 100644
--- a/src/content/docs/cloudflare-one/policies/data-loss-prevention/dlp-profiles/index.mdx
+++ b/src/content/docs/cloudflare-one/policies/data-loss-prevention/dlp-profiles/index.mdx
@@ -8,7 +8,7 @@ sidebar:
import { Render } from "~/components";
-A DLP profile is a collection of detection entries (regular expressions and [DLP datasets](/cloudflare-one/policies/data-loss-prevention/datasets/)) that define the data patterns you want to detect. Cloudflare DLP provides predefined profiles for common detections, or you can build custom DLP profiles specific to your data, organization, and risk tolerance.
+A DLP profile is a collection of regular expressions and [detection entries](/cloudflare-one/policies/data-loss-prevention/detection-entries/) that define the data patterns you want to detect. Cloudflare DLP provides predefined profiles for common detections, or you can build custom DLP profiles specific to your data, organization, and risk tolerance.
## Configure a predefined profile
diff --git a/src/content/docs/data-localization/compatibility.mdx b/src/content/docs/data-localization/compatibility.mdx
index 44cd7c2a2b1572..b18d704a94c3bc 100644
--- a/src/content/docs/data-localization/compatibility.mdx
+++ b/src/content/docs/data-localization/compatibility.mdx
@@ -6,7 +6,6 @@ sidebar:
head:
- tag: title
content: Cloudflare product compatibility
-
---
The table below provides a summary of the Data Localization Suite product's behavior with Cloudflare products. Refer to the table legend for guidance on interpreting the table.
@@ -18,140 +17,179 @@ The table below provides a summary of the Data Localization Suite product's beha
## Application Performance
-| Product | Geo Key Manager | Regional Services | Customer Metadata Boundary |
-| ---------------------- | --------------- | ----------------- | -------------------------- |
-| Caching/CDN | ✅ | ✅ | ✅ |
-| Cache Reserve | ⚫️ | 🚧 | ✅ [^29] |
-| DNS | ⚫️ | 🚧 [^33] | 🚧 [^32] |
-| HTTP/3 (with QUIC) | ⚫️ | ✘ | ⚫️ |
-| Image Resizing | ✅ | ✅ [^6] | 🚧 [^1] |
-| Load Balancing | ✅ | ✅ | 🚧 [^1] |
-| Onion Routing | ✘ | ✘ | ✘ |
-| Orange-to-Orange (O2O) | ✘ | ✘ | ✘ |
-| Stream Delivery | ✅ | ✅ | ✅ |
-| Tiered Caching | ✅ | 🚧 [^2] | 🚧 [^30] |
-| Trace | ✘ | ✘ | ✘ |
-| Waiting Room | ⚫️ | ✅ | ✅ |
-| Web Analytics / Real User Monitoring (RUM) | ⚫️ | ⚫️ | ✘ [^43] |
-| Zaraz | ✅ | ✅ | ✅ |
-
-***
+| Product | Geo Key Manager | Regional Services | Customer Metadata Boundary |
+| ------------------------------------------ | --------------- | ----------------- | -------------------------- |
+| Caching/CDN | ✅ | ✅ | ✅ |
+| Cache Reserve | ⚫️ | 🚧 | ✅ [^29] |
+| DNS | ⚫️ | 🚧 [^33] | 🚧 [^32] |
+| HTTP/3 (with QUIC) | ⚫️ | ✘ | ⚫️ |
+| Image Resizing | ✅ | ✅ [^6] | 🚧 [^1] |
+| Load Balancing | ✅ | ✅ | 🚧 [^1] |
+| Onion Routing | ✘ | ✘ | ✘ |
+| Orange-to-Orange (O2O) | ✘ | ✘ | ✘ |
+| Stream Delivery | ✅ | ✅ | ✅ |
+| Tiered Caching | ✅ | 🚧 [^2] | 🚧 [^30] |
+| Trace | ✘ | ✘ | ✘ |
+| Waiting Room | ⚫️ | ✅ | ✅ |
+| Web Analytics / Real User Monitoring (RUM) | ⚫️ | ⚫️ | ✘ [^43] |
+| Zaraz | ✅ | ✅ | ✅ |
+
+---
## Application Security
| Product | Geo Key Manager | Regional Services | Customer Metadata Boundary |
| ---------------------------- | --------------- | ----------------- | -------------------------- |
-| Advanced Certificate Manager | ⚫️ | ⚫️ | ⚫️ |
-| Advanced DDoS Protection | ✅ | ✅ | 🚧 [^3] |
+| Advanced Certificate Manager | ⚫️ | ⚫️ | ⚫️ |
+| Advanced DDoS Protection | ✅ | ✅ | 🚧 [^3] |
| API Shield | ✅ | ✅ | 🚧 [^4] |
-| Bot Management | ✅ | ✅ | 🚧 [^5] |
-| DNS Firewall | ⚫️ | ⚫️ | 🚧 [^22] |
-| Page Shield | ✅ | ✅ | ✅ |
+| Bot Management | ✅ | ✅ | 🚧 [^5] |
+| DNS Firewall | ⚫️ | ⚫️ | 🚧 [^22] |
+| Page Shield | ✅ | ✅ | ✅ |
| Rate Limiting | ✅ | ✅ | ✅ [^37] |
-| SSL | ✅ | ✅ | ✅ |
-| Cloudflare for SaaS | ✘ | ✅ | ✅ |
-| Turnstile | ⚫️ | ✘ | ✅ [^38] |
-| WAF/L7 Firewall | ✅ | ✅ | ✅ |
-| DMARC Management | ⚫️ | ⚫️ | ✅ |
+| SSL | ✅ | ✅ | ✅ |
+| Cloudflare for SaaS | ✘ | ✅ | ✅ |
+| Turnstile | ⚫️ | ✘ | ✅ [^38] |
+| WAF/L7 Firewall | ✅ | ✅ | ✅ |
+| DMARC Management | ⚫️ | ⚫️ | ✅ |
-***
+---
## Developer Platform
-| Product | Geo Key Manager | Regional Services | Customer Metadata Boundary |
-| ---------------------------- | --------------- | ----------------- | -------------------------- |
-| Cloudflare Images | ⚫️ | ✅ [^36] | 🚧 [^35] |
-| AI Gateway | ✘ | ✘ | 🚧 [^39] |
-| Cloudflare Pages | ✅ [^11] | ✅ [^11] | 🚧 [^1] |
-| Cloudflare D1 | ⚫️ | ⚫️ | 🚧 [^40] |
-| Durable Objects | ⚫️ | ✅ [^7] | 🚧 [^1] |
-| Email Routing | ⚫️ | ⚫️ | ✅ |
-| R2 | ✅ [^27] | ✅ [^8] | ✅ [^28] |
-| Smart Placement | ⚫️ | ✘ | ✘ |
-| Stream | ⚫️ | ✘ | 🚧 [^1] |
-| Workers (deployed on a Zone) | ✅ | ✅ | 🚧 [^41] |
-| Workers AI | ⚫️ | ✘ | ✅ |
-| Workers KV | ⚫️ | ✘ | ✅ [^34] |
-| Workers.dev | ✘ | ✘ | ✘ |
-| Workers Analytics Engine (WAE) | ⚫️ | ⚫️ | 🚧 [^1] |
-
-***
+| Product | Geo Key Manager | Regional Services | Customer Metadata Boundary |
+| ------------------------------ | --------------- | ----------------- | -------------------------- |
+| Cloudflare Images | ⚫️ | ✅ [^36] | 🚧 [^35] |
+| AI Gateway | ✘ | ✘ | 🚧 [^39] |
+| Cloudflare Pages | ✅ [^11] | ✅ [^11] | 🚧 [^1] |
+| Cloudflare D1 | ⚫️ | ⚫️ | 🚧 [^40] |
+| Durable Objects | ⚫️ | ✅ [^7] | 🚧 [^1] |
+| Email Routing | ⚫️ | ⚫️ | ✅ |
+| R2 | ✅ [^27] | ✅ [^8] | ✅ [^28] |
+| Smart Placement | ⚫️ | ✘ | ✘ |
+| Stream | ⚫️ | ✘ | 🚧 [^1] |
+| Workers (deployed on a Zone) | ✅ | ✅ | 🚧 [^41] |
+| Workers AI | ⚫️ | ✘ | ✅ |
+| Workers KV | ⚫️ | ✘ | ✅ [^34] |
+| Workers.dev | ✘ | ✘ | ✘ |
+| Workers Analytics Engine (WAE) | ⚫️ | ⚫️ | 🚧 [^1] |
+
+---
## Network Services
-| Product | Geo Key Manager | Regional Services | Customer Metadata Boundary |
-| ------------------ | --------------- | ----------------- | -------------------------- |
-| Argo Smart Routing | ✅ | ✘ [^9] | ✘ [^10] |
-| Static IP/BYOIP | ⚫️ | ✅ [^26] | ⚫️ |
-| Magic Firewall | ⚫️ | ⚫️ | ✅ |
-| Magic Network Monitoring | ⚫️ | ⚫️ | 🚧 [^1] |
-| Magic Transit | ⚫️ | ⚫️ | 🚧 [^1] |
-| Magic WAN | ⚫️ | ⚫️ | ✅ |
-| Spectrum | ✅ | ✅ [^42] | ✅ |
+| Product | Geo Key Manager | Regional Services | Customer Metadata Boundary |
+| ------------------------ | --------------- | ----------------- | -------------------------- |
+| Argo Smart Routing | ✅ | ✘ [^9] | ✘ [^10] |
+| Static IP/BYOIP | ⚫️ | ✅ [^26] | ⚫️ |
+| Magic Firewall | ⚫️ | ⚫️ | ✅ |
+| Magic Network Monitoring | ⚫️ | ⚫️ | 🚧 [^1] |
+| Magic Transit | ⚫️ | ⚫️ | 🚧 [^1] |
+| Magic WAN | ⚫️ | ⚫️ | ✅ |
+| Spectrum | ✅ | ✅ [^42] | ✅ |
-***
+---
## Platform
-| Product | Geo Key Manager | Regional Services | Customer Metadata Boundary |
-| ------- | --------------- | ----------------- | -------------------------- |
-| Logpull | ⚫️ | ⚫️ | 🚧 [^12] |
-| Logpush | ⚫️ | ✅ | 🚧 [^13] |
-| Log Explorer | ⚫️ | ⚫️ | ✘ [^23] |
+| Product | Geo Key Manager | Regional Services | Customer Metadata Boundary |
+| ------------ | --------------- | ----------------- | -------------------------- |
+| Logpull | ⚫️ | ⚫️ | 🚧 [^12] |
+| Logpush | ⚫️ | ✅ | 🚧 [^13] |
+| Log Explorer | ⚫️ | ⚫️ | ✘ [^23] |
-***
+---
## Zero Trust
| Product | Geo Key Manager | Regional Services | Customer Metadata Boundary |
| ----------------- | --------------- | ----------------- | -------------------------- |
-| Access | 🚧 [^14] | 🚧 [^15] | 🚧 [^16] |
-| Browser Isolation | ⚫️ | 🚧 [^17] | ✅ |
-| CASB | ⚫️ | ⚫️ | ✘ |
-| Cloudflare Tunnel | ⚫️ | 🚧 [^18] | ⚫️ |
-| DLP | ⚫️ [^19] | ⚫️ [^19] | 🚧 [^31] |
-| Gateway | 🚧 [^20] | 🚧 [^21] | 🚧 [^22] |
-| WARP | ⚫️ | ⚫️ | 🚧 [^1] |
-
+| Access | 🚧 [^14] | 🚧 [^15] | 🚧 [^16] |
+| Browser Isolation | ⚫️ | 🚧 [^17] | ✅ |
+| CASB | ⚫️ | ⚫️ | ✘ |
+| Cloudflare Tunnel | ⚫️ | 🚧 [^18] | ⚫️ |
+| DLP | ⚫️ [^19] | ⚫️ [^19] | 🚧 [^31] |
+| Gateway | 🚧 [^20] | 🚧 [^21] | 🚧 [^22] |
+| WARP | ⚫️ | ⚫️ | 🚧 [^1] |
[^1]: Logs / Analytics not available outside US region when using Customer Metadata Boundary.
+
[^2]: Regular and Custom Tiered Cache works; Smart Tiered Caching not available with Regional Services.
+
[^3]: Adaptive DDoS Protection is only supported for US CMB.
+
[^4]: Features such as API Discovery and Volumetric Abuse Detection will not work with CMB set to EU only.
+
[^5]: Some advanced Enterprise features, including the [Anomaly Detection engine](/bots/concepts/bot-score/#anomaly-detection), are not available.
+
[^6]: Only when using a Custom Domain set to a region, either through Workers or [Transform Rules](/images/transform-images/serve-images-custom-paths/) within the same zone.
+
[^7]: [Jurisdiction restrictions for Durable Objects](/durable-objects/reference/data-location/#restrict-durable-objects-to-a-jurisdiction).
+
[^8]: Only when using a [Custom Domain](/r2/buckets/public-buckets/#connect-a-bucket-to-a-custom-domain) set to a region and using [jurisdictions with the S3 API](/r2/reference/data-location/#using-jurisdictions-with-the-s3-api).
+
[^9]: Argo cannot be used with Regional Services.
+
[^10]: Argo cannot be used with Customer Metadata Boundary.
+
[^11]: Only when using [Custom Domain](/pages/configuration/custom-domains/) set to a region.
+
[^12]: Logpull not available when using Customer Metadata Boundary outside US region. Logs may be stored and retrieved with [Logs Engine](https://blog.cloudflare.com/announcing-logs-engine/) which is adding region support in 2025.
+
[^13]: Logpush available with Customer Metadata Boundary for [these datasets](/data-localization/metadata-boundary/logpush-datasets/). Contact your account team if you need another dataset.
+
[^14]: Access App SSL keys can use Geo Key Manager. [Access JWT](/cloudflare-one/identity/authorization-cookie/validating-json/) is not yet localized.
+
[^15]: Can be localized to US FedRAMP Moderate Domestic region only.
+
[^16]: Customer Metadata Boundary can be used to limit data transfer outside region, but Access User Logs will not be available outside US region.
+
[^17]: Currently may only be used with US FedRAMP region.
+
[^18]: When Cloudflare Tunnel connects to Cloudflare, the connectivity options available are the Global Region (default) and [US FedRAMP Moderate Domestic region](/cloudflare-one/connections/connect-networks/configure-tunnels/cloudflared-parameters/run-parameters/#region). For incoming requests to the Cloudflare Edge, Regional Services only applies when using [Public Hostnames](/cloudflare-one/connections/connect-networks/routing-to-tunnel/). In this case, the region associated with the DNS record will apply.
+
[^19]: Uses Gateway HTTP and CASB.
+
[^20]: You can [bring your own certificate](https://blog.cloudflare.com/bring-your-certificates-cloudflare-gateway/) to Gateway but these cannot yet be restricted to a specific region.
+
[^21]: Gateway HTTP supports Regional Services. Gateway DNS does not yet support regionalization.
ICMP proxy and WARP-to-WARP proxy are not available to Regional Services users.
+
[^22]: Dashboard Analytics and Logs are empty when using CMB outside the US region. Use Logpush instead.
+
[^23]: Currently, customers do not have the ability to choose the location of the Cloudflare-managed R2 bucket for Log Explorer.
+
[^26]: Static IP/BYOIP can be used with the legacy Spectrum setup.
+
[^27]: Only when using a Custom Domain and a [Custom Certificate](/r2/reference/data-security/#encryption-in-transit) or [Keyless SSL](/ssl/keyless-ssl/).
+
[^28]: R2 Dashboard [Metrics and Analytics](/r2/platform/metrics-analytics/) are populated. Additionally, [Jurisdictional Restrictions](/r2/reference/data-location/#jurisdictional-restrictions) guarantee objects in a bucket are stored within a specific jurisdiction.
+
[^29]: You cannot yet specify region location for object storage itself.
+
[^30]: Regular/Generic and Custom Tiered Cache works; Smart Tiered Caching does not work with Customer Metadata Boundary (CMB).
With CMB set to EU, the Zone Dashboard **Caching** > **Tiered Cache** > **Smart Tiered Caching** option will not populate the Dashboard Analytics.
-[^31]: DLP is part of Gateway HTTP, however, [DLP datasets](/cloudflare-one/policies/data-loss-prevention/datasets/#use-dlp-datasets) are not available outside US region when using Customer Metadata Boundary.
+
+[^31]: DLP is part of Gateway HTTP, however, [DLP detection entries](/cloudflare-one/policies/data-loss-prevention/detection-entries/) are not available outside US region when using Customer Metadata Boundary.
+
[^32]: Dashboard Analytics are empty when using CMB outside the US region. Use Logpush instead.
+
[^33]: [Outgoing zone transfers](/dns/zone-setups/zone-transfers/cloudflare-as-primary/) will carry Earth region proxy IPs, thus making regional service dysfunctional when non-Cloudflare nameservers respond to the DNS queries.
+
[^34]: Jurisdictional Restrictions (storage) for Workers KV pairs is not supported today.
+
[^35]: Logs / Analytics not available outside US region when using Customer Metadata Boundary. Jurisdictional Restrictions (storage) options are not supported today.
+
[^36]: Only when using a [Custom Domain](/images/manage-images/serve-images/serve-from-custom-domains/) set to a region.
+
[^37]: Legacy Zone Analytics & Logs section not available outside US region when using CMB. Use [Security Analytics](/waf/analytics/security-analytics/) instead.
+
[^38]: [Turnstile Analytics](/turnstile/turnstile-analytics/) are available. However, there are no regionalization guarantees for the Siteverify API yet.
+
[^39]: Jurisdictional Restrictions (storage) options for [Logs](/ai-gateway/observability/logging/) are not supported today.
+
[^40]: Jurisdictional Restrictions ([data location](/d1/configuration/data-location/) / storage) options are not supported today.
+
[^41]: Logs / Analytics not available outside US region when using Customer Metadata Boundary. Use Logpush instead.
+
[^42]: Only applies to HTTP/S Spectrum applications.
+
[^43]: Web Analytics collects the [minimum amount of information](/web-analytics/data-metrics/data-origin-and-collection/). Alternatively, you can [exclude EU Visitors from RUM](/speed/speed-test/rum-beacon/#rum-excluding-eeaeu).
diff --git a/src/content/docs/learning-paths/secure-internet-traffic/build-http-policies/data-loss-prevention.mdx b/src/content/docs/learning-paths/secure-internet-traffic/build-http-policies/data-loss-prevention.mdx
index 26a123ce42441e..ec787e847073c9 100644
--- a/src/content/docs/learning-paths/secure-internet-traffic/build-http-policies/data-loss-prevention.mdx
+++ b/src/content/docs/learning-paths/secure-internet-traffic/build-http-policies/data-loss-prevention.mdx
@@ -118,7 +118,7 @@ curl https://api.cloudflare.com/client/v4/accounts/$ACCOUNT_ID/gateway/rules \
#### DLP datasets
-If your data is a distinct [dataset](/cloudflare-one/policies/data-loss-prevention/datasets/) you have defined, you can build a profile by uploading a database to use in an Exact Data Match or Custom Wordlist function. Exact Data Match and Custom Wordlist feature some key differences:
+If your data is a distinct [dataset](/cloudflare-one/policies/data-loss-prevention/detection-entries/#datasets) you have defined, you can build a profile by uploading a database to use in an Exact Data Match or Custom Wordlist function. Exact Data Match and Custom Wordlist feature some key differences:
| | Exact Data Match | Custom Wordlist |
| ------------------- | ------------------------------------------------------- | ------------------------------------------------------------------ |
diff --git a/src/content/docs/reference-architecture/diagrams/security/securing-data-in-transit.mdx b/src/content/docs/reference-architecture/diagrams/security/securing-data-in-transit.mdx
index d9328788b76109..fee4581c001e7f 100644
--- a/src/content/docs/reference-architecture/diagrams/security/securing-data-in-transit.mdx
+++ b/src/content/docs/reference-architecture/diagrams/security/securing-data-in-transit.mdx
@@ -46,7 +46,7 @@ When traffic from the device, to the hosted application, all flows via Cloudflar
A common challenge is trying to determine what data is sensitive and requires policy intervention. Data Loss Prevention services are used to inspect the contents of a piece of traffic, and then provide metadata to the policy to impact enforcement.
-For example, when a user attempts to upload a file to a SaaS application and the traffic route has been configured to always go via the Cloudflare network, [Cloudflare DLP](/cloudflare-one/policies/data-loss-prevention/) inspects the file by using DLP profiles assigned to a Gateway policy. After a DLP profile matches, the Gateway policy will allow or block the traffic, and the activity will be written to the logs. A DLP profile is a collection of regular expressions (also known as detection entries) that define the data patterns you want to detect. Cloudflare DLP provides [predefined profiles](/cloudflare-one/policies/data-loss-prevention/dlp-profiles/#configure-a-predefined-profile) for common detections, or you can build [custom profiles](/cloudflare-one/policies/data-loss-prevention/dlp-profiles/#build-a-custom-profile) specific to your data, and even the ability to leverage [Exact Data Match](/cloudflare-one/policies/data-loss-prevention/datasets/#exact-data-match) (EDM).
+For example, when a user attempts to upload a file to a SaaS application and the traffic route has been configured to always go via the Cloudflare network, [Cloudflare DLP](/cloudflare-one/policies/data-loss-prevention/) inspects the file by using DLP profiles assigned to a Gateway policy. When a DLP profile matches, the Gateway policy will allow or block the traffic, and the activity will be written to the logs. A DLP profile is a collection of regular expressions (also known as detection entries) that define the data patterns you want to detect. Cloudflare DLP provides [predefined profiles](/cloudflare-one/policies/data-loss-prevention/dlp-profiles/#configure-a-predefined-profile) for common detections, or you can build [custom profiles](/cloudflare-one/policies/data-loss-prevention/dlp-profiles/#build-a-custom-profile) specific to your data, and even leverage [Exact Data Match](/cloudflare-one/policies/data-loss-prevention/detection-entries/#exact-data-match) (EDM).
DLP profiles are then used in combination with other policy attributes to specifically identify the traffic, such as only enforcing the policy when sensitive data is being uploaded to approved Cloud based storage services.
diff --git a/src/content/partials/cloudflare-one/data-loss-prevention/custom-profile.mdx b/src/content/partials/cloudflare-one/data-loss-prevention/custom-profile.mdx
index 8015bfd0783b8c..e672ea77b2f05c 100644
--- a/src/content/partials/cloudflare-one/data-loss-prevention/custom-profile.mdx
+++ b/src/content/partials/cloudflare-one/data-loss-prevention/custom-profile.mdx
@@ -28,7 +28,7 @@ import { Details } from "~/components";
- Existing entries include [predefined detection entries](/cloudflare-one/policies/data-loss-prevention/dlp-profiles/predefined-profiles/) and [DLP datasets](/cloudflare-one/policies/data-loss-prevention/datasets/).
+ Existing entries include [predefined](/cloudflare-one/policies/data-loss-prevention/dlp-profiles/predefined-profiles/) and [user-defined](/cloudflare-one/policies/data-loss-prevention/detection-entries/) detection entries.
1. Select **Add existing entries**.
2. Choose which entries you want to add, then select **Confirm**.