Merged
3 changes: 2 additions & 1 deletion public/__redirects
Original file line number Diff line number Diff line change
@@ -965,7 +965,7 @@
/logs/logpush/logpush-dashboard/ /logs/logpush/logpush-job/enable-destinations/ 301
/logs/logpush/s3-compatible-endpoints/ /logs/logpush/logpush-job/enable-destinations/s3-compatible-endpoints/ 301
/logs/reference/logpush-api-configuration/ /logs/get-started/api-configuration/ 301
/logs/reference/logpush-api-configuration/filters/ /logs/reference/filters/ 301
/logs/reference/logpush-api-configuration/filters/ /logs/logpush/logpush-job/filters/ 301
# Non-slashed version is being used in the Cloudflare dashboard
/logs/reference/logpush-api-configuration/examples/example-logpush-curl/ /logs/logpush/examples/example-logpush-curl/ 301
/logs/log-explorer/ /log-explorer/log-search/ 301
@@ -978,6 +978,7 @@
/logs/tutorials/examples/example-logpush-python/ /logs/logpush/examples/example-logpush-python/ 301
/logs/get-started/alerts-and-analytics/ /logs/logpush/alerts-and-analytics/ 301
/logs/edge-log-delivery/ /logs/logpush/logpush-job/edge-log-delivery/ 301
/logs/reference/filters/ /logs/logpush/logpush-job/filters/ 301

# magic-firewall
/magic-firewall/reference/examples/ /magic-firewall/how-to/add-rules/ 301
@@ -46,7 +46,7 @@ Build custom dashboards to share this information by specifying an individual cu

[Logpush](/logs/logpush/) sends metadata from Cloudflare products to your cloud storage destination or SIEM.

Using [filters](/logs/reference/filters/), you can send set sample rates (or not include logs altogether) based on filter criteria. This flexibility allows you to maintain selective logs for custom hostnames without massively increasing your log volume.
Using [filters](/logs/logpush/logpush-job/filters/), you can set sample rates (or exclude logs altogether) based on filter criteria. This flexibility allows you to maintain selective logs for custom hostnames without massively increasing your log volume.

Filtering is available for [all Cloudflare datasets](/logs/reference/log-fields/zone/).

@@ -25,7 +25,7 @@ To configure Logpush for Zero Trust logs:
4. Choose a [Logpush destination](/logs/logpush/logpush-job/enable-destinations/).
5. Follow the service-specific instructions to configure and validate your destination.
6. Choose the [Zero Trust datasets](#zero-trust-datasets) to export.
7. Enter a **Job name**, any [filters](/logs/reference/filters/) you would like to add, and the data fields you want to include in the logs.
7. Enter a **Job name**, any [filters](/logs/logpush/logpush-job/filters/) you would like to add, and the data fields you want to include in the logs.
8. (Optional) In **Advanced settings**, choose the timestamp format you prefer and whether you want to enable log sampling.
9. Select **Submit**.

@@ -92,4 +92,4 @@ To set up the DLP Forensic Copy Logpush job:

DLP will now send a copy of HTTP requests that match this policy to your Logpush destination.

Logpush supports up to four DLP Forensic Copy Logpush jobs per account. By default, Gateway will send all matched HTTP requests to your configured DLP Forensic Copy jobs. To send specific policy matches to specific jobs, configure [Log filters](/logs/reference/filters/). If the request contains an archive file, DLP will only send up to 100 MB of uncompressed content to your configured storage.
Logpush supports up to four DLP Forensic Copy Logpush jobs per account. By default, Gateway will send all matched HTTP requests to your configured DLP Forensic Copy jobs. To send specific policy matches to specific jobs, configure [Log filters](/logs/logpush/logpush-job/filters/). If the request contains an archive file, DLP will only send up to 100 MB of uncompressed content to your configured storage.
2 changes: 1 addition & 1 deletion src/content/docs/dns/internal-dns/analytics.mdx
@@ -22,4 +22,4 @@ The [fields](/analytics/graphql-api/getting-started/querying-basics/) added to c

Leverage Logpush jobs for [Gateway DNS](/logs/reference/log-fields/account/gateway_dns/#internaldnsfallbackstrategy). For help setting up Logpush, refer to [Get started with Logs](/logs/get-started/).

You can also set up [Logpush filters](/logs/reference/filters/) to only push logs related to a specific [internal zone](/dns/internal-dns/internal-zones/) or [view](/dns/internal-dns/dns-views/) ID.
You can also set up [Logpush filters](/logs/logpush/logpush-job/filters/) to only push logs related to a specific [internal zone](/dns/internal-dns/internal-zones/) or [view](/dns/internal-dns/dns-views/) ID.
2 changes: 1 addition & 1 deletion src/content/docs/logs/get-started/api-configuration.mdx
@@ -246,7 +246,7 @@ Response

## Filter

Use filters to select the events to include and/or remove from your logs. For more information, refer to [Filters](/logs/reference/filters/).
Use filters to select the events to include and/or remove from your logs. For more information, refer to [Filters](/logs/logpush/logpush-job/filters/).
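As a sketch of how a job's `filter` property is built (the field names `ClientRequestHost` and `EdgeResponseStatus` are illustrative examples; valid keys depend on the dataset you export):

```python
import json

# Hypothetical filter: keep only events for one host, and drop 404s.
filter_expr = {
    "where": {
        "and": [
            {"key": "ClientRequestHost", "operator": "eq", "value": "example.com"},
            {"key": "EdgeResponseStatus", "operator": "!eq", "value": 404},
        ]
    }
}

# The API expects the job's "filter" property as a JSON string, so the
# expression is serialized before it goes into the job payload.
job_payload = {"filter": json.dumps(filter_expr)}
print(job_payload["filter"])
```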

## Sampling rate

2 changes: 1 addition & 1 deletion src/content/docs/logs/instant-logs.mdx
@@ -45,7 +45,7 @@ Instant Logs has a maximum data rate supported. For high volume domains, we samp

- **Filters** - Use filters to drill down into specific events. Filters consist of three parts: key, operator and value.

All supported operators can be found in the [Filters](/logs/reference/filters/) page.
All supported operators can be found in the [Filters](/logs/logpush/logpush-job/filters/) page.

Below we have three examples of filters:

@@ -102,7 +102,7 @@ When using Sumo Logic, you may find it helpful to have [Live Tail](https://help.
* Automated timestamp parsing within Sumo Logic; refer to [timestamps from Sumo Logic](https://help.sumologic.com/03Send-Data/Sources/04Reference-Information-for-Sources/Timestamps%2C-Time-Zones%2C-Time-Ranges%2C-and-Date-Formats) for details.
* **ownership\_challenge** - Challenge token required to prove destination ownership.
* **kind** (optional) - Used to differentiate between Logpush and Edge Log Delivery jobs. Refer to [Kind](/logs/get-started/api-configuration/#kind) for details.
* **filter** (optional) - Refer to [Filters](/logs/reference/filters/) for details.
* **filter** (optional) - Refer to [Filters](/logs/logpush/logpush-job/filters/) for details.

### Response

@@ -36,7 +36,7 @@ When you are done entering the destination details, select **Continue**.

9. In the next step, you need to configure your logpush job:
* Enter the **Job name**.
* Under **If logs match**, you can select the events to include and/or remove from your logs. Refer to [Filters](/logs/reference/filters/) for more information. Not all datasets have this option available.
* Under **If logs match**, you can select the events to include and/or remove from your logs. Refer to [Filters](/logs/logpush/logpush-job/filters/) for more information. Not all datasets have this option available.
* In **Send the following fields**, you can choose to either push all logs to your storage destination or selectively choose which logs you want to push.

10. In **Advanced Options**, you can:
@@ -30,7 +30,7 @@ When you are done entering the destination details, select **Continue**.

8. In the next step, you need to configure your logpush job:
* Enter the **Job name**.
* Under **If logs match**, you can select the events to include and/or remove from your logs. Refer to [Filters](/logs/reference/filters/) for more information. Not all datasets have this option available.
* Under **If logs match**, you can select the events to include and/or remove from your logs. Refer to [Filters](/logs/logpush/logpush-job/filters/) for more information. Not all datasets have this option available.
* In **Send the following fields**, you can choose to either push all logs to your storage destination or selectively choose which logs you want to push.

9. In **Advanced Options**, you can:
@@ -43,7 +43,7 @@ When you are done entering the destination details, select **Continue**.

8. In the next step, you need to configure your logpush job:
* Enter the **Job name**.
* Under **If logs match**, you can select the events to include and/or remove from your logs. Refer to [Filters](/logs/reference/filters/) for more information. Not all datasets have this option available.
* Under **If logs match**, you can select the events to include and/or remove from your logs. Refer to [Filters](/logs/logpush/logpush-job/filters/) for more information. Not all datasets have this option available.
* In **Send the following fields**, you can choose to either push all logs to your storage destination or selectively choose which logs you want to push.

9. In **Advanced Options**, you can:
@@ -33,7 +33,7 @@ When you are done entering the destination details, select **Continue**.

9. In the next step, you need to configure your logpush job:
* Enter the **Job name**.
* Under **If logs match**, you can select the events to include and/or remove from your logs. Refer to [Filters](/logs/reference/filters/) for more information. Not all datasets have this option available.
* Under **If logs match**, you can select the events to include and/or remove from your logs. Refer to [Filters](/logs/logpush/logpush-job/filters/) for more information. Not all datasets have this option available.
* In **Send the following fields**, you can choose to either push all logs to your storage destination or selectively choose which logs you want to push.

10. In **Advanced Options**, you can:
@@ -29,7 +29,7 @@ Cloudflare expects that the endpoint is available over HTTPS, using a trusted ce

8. In the next step, you need to configure your logpush job:
* Enter the **Job name**.
* Under **If logs match**, you can select the events to include and/or remove from your logs. Refer to [Filters](/logs/reference/filters/) for more information. Not all datasets have this option available.
* Under **If logs match**, you can select the events to include and/or remove from your logs. Refer to [Filters](/logs/logpush/logpush-job/filters/) for more information. Not all datasets have this option available.
* In **Send the following fields**, you can choose to either push all logs to your storage destination or selectively choose which logs you want to push.

9. In **Advanced Options**, you can:
@@ -39,7 +39,7 @@ When you are done entering the destination details, select **Continue**.

8. In the next step, you need to configure your logpush job:
* Enter the **Job name**.
* Under **If logs match**, you can select the events to include and/or remove from your logs. Refer to [Filters](/logs/reference/filters/) for more information. Not all datasets have this option available.
* Under **If logs match**, you can select the events to include and/or remove from your logs. Refer to [Filters](/logs/logpush/logpush-job/filters/) for more information. Not all datasets have this option available.
* In **Send the following fields**, you can choose to either push all logs to your storage destination or selectively choose which logs you want to push.

9. In **Advanced Options**, you can:
@@ -72,7 +72,7 @@ When you are done entering the destination details, select **Continue**.

8. In the next step, you need to configure your logpush job:
* Enter the **Job name**.
* Under **If logs match**, you can select the events to include and/or remove from your logs. Refer to [Filters](/logs/reference/filters/) for more information. Not all datasets have this option available.
* Under **If logs match**, you can select the events to include and/or remove from your logs. Refer to [Filters](/logs/logpush/logpush-job/filters/) for more information. Not all datasets have this option available.
* In **Send the following fields**, you can choose to either push all logs to your storage destination or selectively choose which logs you want to push.

9. In **Advanced Options**, you can:
@@ -45,7 +45,7 @@ When you are done entering the destination details, select **Continue**.

8. In the next step, you need to configure your logpush job:
* Enter the **Job name**.
* Under **If logs match**, you can select the events to include and/or remove from your logs. Refer to [Filters](/logs/reference/filters/) for more information. Not all datasets have this option available.
* Under **If logs match**, you can select the events to include and/or remove from your logs. Refer to [Filters](/logs/logpush/logpush-job/filters/) for more information. Not all datasets have this option available.
* In **Send the following fields**, you can choose to either push all logs to your storage destination or selectively choose which logs you want to push.

9. In **Advanced Options**, you can:
@@ -32,7 +32,7 @@ When you are done entering the destination details, select **Continue**.
8. In the next step, you need to configure your logpush job:

- Enter the **Job name**.
- Under **If logs match**, you can select the events to include and/or remove from your logs. Refer to [Filters](/logs/reference/filters/) for more information. Not all datasets have this option available.
- Under **If logs match**, you can select the events to include and/or remove from your logs. Refer to [Filters](/logs/logpush/logpush-job/filters/) for more information. Not all datasets have this option available.
- In **Send the following fields**, you can choose to either push all logs to your storage destination or selectively choose which logs you want to push.

9. In **Advanced Options**, you can:
@@ -25,7 +25,7 @@ Cloudflare Logpush supports pushing logs directly to Sumo Logic via the Cloudfla

8. In the next step, you need to configure your logpush job:
* Enter the **Job name**.
* Under **If logs match**, you can select the events to include and/or remove from your logs. Refer to [Filters](/logs/reference/filters/) for more information. Not all datasets have this option available.
* Under **If logs match**, you can select the events to include and/or remove from your logs. Refer to [Filters](/logs/logpush/logpush-job/filters/) for more information. Not all datasets have this option available.
* In **Send the following fields**, you can choose to either push all logs to your storage destination or selectively choose which logs you want to push.

9. In **Advanced Options**, you can:
@@ -3,7 +3,7 @@ pcx_content_type: how-to
type: overview
title: Filters
sidebar:
order: 40
order: 5

---

@@ -95,6 +95,6 @@ To set filters through the dashboard:
3. Select **Add Logpush job**. A modal window will open.
4. Select the dataset you want to push to a storage service.
5. Below **Select data fields**, in the **Filter** section, you can set up your filters.
6. You need to select a [Field](/logs/reference/log-fields/), an [Operator](/logs/reference/filters/#logical-operators), and a **Value**.
6. You need to select a [Field](/logs/reference/log-fields/), an [Operator](/logs/logpush/logpush-job/filters/#logical-operators), and a **Value**.
7. You can connect more filters using `AND` and `OR` logical operators.
8. Select **Next** to continue the setting up of your Logpush job.
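The `AND`/`OR` step above corresponds to nested groups in the filter expression. A minimal sketch of how such a compound expression nests (the field names and values here are illustrative, not a fixed list):

```python
import json

# Individual clauses: each has a field, an operator, and a value,
# mirroring the Field / Operator / Value selections in the dashboard.
clause_host = {"key": "ClientRequestHost", "operator": "eq", "value": "example.com"}
clause_css = {"key": "ClientRequestPath", "operator": "contains", "value": ".css"}
clause_js = {"key": "ClientRequestPath", "operator": "contains", "value": ".js"}

# An "and" group whose second member is itself an "or" group:
# host matches AND (path contains .css OR path contains .js).
filter_expr = {
    "where": {
        "and": [
            clause_host,
            {"or": [clause_css, clause_js]},
        ]
    }
}

# Serialized form, as it would appear in a job's "filter" property.
serialized = json.dumps(filter_expr)
print(serialized)
```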
2 changes: 1 addition & 1 deletion src/content/docs/magic-firewall/how-to/filter-views.mdx
@@ -4,7 +4,7 @@

---

You can utilize different [Log filters](/logs/reference/filters/) to only view specific data from Magic Firewall.
You can utilize different [Log filters](/logs/logpush/logpush-job/filters/) to only view specific data from Magic Firewall.

## Filter by enabled or disabled rules

2 changes: 1 addition & 1 deletion src/content/docs/workers/observability/logs/logpush.mdx
@@ -69,7 +69,7 @@ curl "https://api.cloudflare.com/client/v4/accounts/<ACCOUNT_ID>/logpush/jobs" \
}' | jq .
```

In Logpush, you can configure [filters](/logs/reference/filters/) and a [sampling rate](/logs/get-started/api-configuration/#sampling-rate) to have more control of the volume of data that is sent to your configured destination. For example, if you only want to receive logs for requests that did not result in an exception, add the following `filter` JSON property below `output_options`:
In Logpush, you can configure [filters](/logs/logpush/logpush-job/filters/) and a [sampling rate](/logs/get-started/api-configuration/#sampling-rate) to have more control of the volume of data that is sent to your configured destination. For example, if you only want to receive logs for requests that did not result in an exception, add the following `filter` JSON property below `output_options`:

`"filter":"{\"where\": {\"key\":\"Outcome\",\"operator\":\"!eq\",\"value\":\"exception\"}}"`
