2 changes: 1 addition & 1 deletion public/__redirects
@@ -957,7 +957,6 @@
/logs/log-fields/ /logs/reference/log-fields/ 301
/logs/logpull-api/ /logs/logpull/ 301
/logs/logpull-api/requesting-logs/ /logs/logpull/requesting-logs/ 301
/logs/logpush/ /logs/about/ 301
/logs/logpush/aws-s3/ /logs/get-started/enable-destinations/aws-s3/ 301
/logs/logpush/azure/ /logs/get-started/enable-destinations/azure/ 301
/logs/logpush/google-cloud-storage/ /logs/get-started/enable-destinations/google-cloud-storage/ 301
@@ -971,6 +970,7 @@
/logs/reference/logpush-api-configuration/examples/example-logpush-curl/ /logs/tutorials/examples/example-logpush-curl/ 301
/logs/log-explorer/ /log-explorer/log-search/ 301
/logs/reference/glossary/ /logs/glossary/ 301
/logs/about/ /logs/logpush/ 301

# magic-firewall
/magic-firewall/reference/examples/ /magic-firewall/how-to/add-rules/ 301
@@ -8,6 +8,6 @@ date: 2025-01-15
Only available on Enterprise plans.
:::

Cloudflare now allows you to send SSH command logs to storage destinations configured in [Logpush](/logs/about/), including third-party destinations. Once exported, analyze and audit the data as best fits your organization! For a list of available data fields, refer to the [SSH logs dataset](/logs/reference/log-fields/account/ssh_logs/).
Cloudflare now allows you to send SSH command logs to storage destinations configured in [Logpush](/logs/logpush/), including third-party destinations. Once exported, analyze and audit the data as best fits your organization! For a list of available data fields, refer to the [SSH logs dataset](/logs/reference/log-fields/account/ssh_logs/).

To set up a Logpush job, refer to [Logpush integration](/cloudflare-one/insights/logs/logpush/).
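
As an illustrative aside on the hunk above: a minimal sketch of creating an account-scoped Logpush job for SSH command logs via the API. The `ssh_logs` dataset name is inferred from the log-fields link above; the account ID, API token, and S3 destination are placeholders, and the linked Logpush integration guide remains the supported path.

```bash
# Hypothetical sketch — account ID, token, and bucket are placeholders;
# the dataset name assumes it matches the ssh_logs log-fields page.
curl -X POST \
  "https://api.cloudflare.com/client/v4/accounts/<ACCOUNT_ID>/logpush/jobs" \
  -H "Authorization: Bearer <API_TOKEN>" \
  -H "Content-Type: application/json" \
  --data '{
    "name": "ssh-command-logs",
    "dataset": "ssh_logs",
    "destination_conf": "s3://example-log-bucket/ssh?region=us-east-1",
    "enabled": true
  }'
```
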
@@ -121,4 +121,4 @@ This panel features metrics for Cloudflare Workers. To learn more, read [Cloudfl

### Logs

The Logs tab is not a metrics feature. Instead, customers on the Enterprise plan can enable the [Cloudflare Logs Logpush](/logs/about/) service. You can use Logpush to download and analyze data using any analytics tool of your choice.
The Logs tab is not a metrics feature. Instead, customers on the Enterprise plan can enable the [Cloudflare Logs Logpush](/logs/logpush/) service. You can use Logpush to download and analyze data using any analytics tool of your choice.
@@ -20,7 +20,7 @@ Before sending your Cloudflare log data to Graylog, make sure that you:

* Have an existing Graylog installation. Both single-node and cluster configurations are supported
* Have a Cloudflare Enterprise account with Cloudflare Logs enabled
* Configure [Logpush](/logs/about/)
* Configure [Logpush](/logs/logpush/)

:::note[Note]

@@ -34,7 +34,7 @@ Cloudflare logs are HTTP/HTTPS request logs in JSON format and are gathered from

Before getting Cloudflare logs into Graylog:

1. Configure Cloudflare [Logpush](/logs/about/) to push logs with all desired fields to an AWS S3 bucket of your choice.
1. Configure Cloudflare [Logpush](/logs/logpush/) to push logs with all desired fields to an AWS S3 bucket of your choice.
2. Download the latest [Graylog Integration for Cloudflare](https://github.com/Graylog2/graylog-s3-lambda/blob/master/content-packs/cloudflare/cloudflare-logpush-content-pack.json).
3. Decompress the zip file.
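
A hedged sketch related to step 1 above: proving ownership of the S3 bucket the Graylog Lambda will read from, before the Logpush job itself is created. The zone ID, token, bucket, and region are placeholders.

```bash
# Hypothetical values throughout; Cloudflare writes a challenge file to the bucket,
# and its contents are then supplied as the ownership challenge when creating the job.
curl -X POST \
  "https://api.cloudflare.com/client/v4/zones/<ZONE_ID>/logpush/ownership" \
  -H "Authorization: Bearer <API_TOKEN>" \
  -H "Content-Type: application/json" \
  --data '{"destination_conf": "s3://graylog-ingest/cloudflare?region=us-east-1"}'
```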

@@ -19,7 +19,7 @@ Before sending your Cloudflare log data to Looker, make sure that you:

- Have an existing Looker account
- Have a Cloudflare Enterprise account with Cloudflare Logs enabled
- Configure [Logpush](/logs/about/) or [Logpull](/logs/logpull/)
- Configure [Logpush](/logs/logpush/) or [Logpull](/logs/logpull/)
- Load your data in a [database supported by Looker](https://looker.com/solutions/other-databases)

:::note[Note]
2 changes: 1 addition & 1 deletion src/content/docs/analytics/network-analytics/index.mdx
@@ -26,6 +26,6 @@ For a technical deep-dive into Network Analytics, refer to our [blog post](https
## Related resources

* [Cloudflare GraphQL API](/analytics/graphql-api/)
* [Cloudflare Logpush](/logs/about/)
* [Cloudflare Logpush](/logs/logpush/)
* [Migrating from Network Analytics v1 to Network Analytics v2](/analytics/graphql-api/migration-guides/network-analytics-v2/)
* [Cloudflare Network Analytics v1](/analytics/network-analytics/reference/network-analytics-v1/) <InlineBadge preset="deprecated" />
2 changes: 1 addition & 1 deletion src/content/docs/bots/concepts/bot-tags.mdx
@@ -26,7 +26,7 @@ Once you [enable bot tags](#enable-bot-tags), you can see more information about

## Enable bot tags

To enable bot tags, include the `BotTags` log field when using our [Logpush service](/logs/about/).
To enable bot tags, include the `BotTags` log field when using our [Logpush service](/logs/logpush/).
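
For illustration only, a sketch of adding `BotTags` to an existing HTTP requests Logpush job's field list; the zone ID, job ID, and surrounding field names are placeholders, and the `output_options` shape should be checked against the current Logpush job API.

```bash
# Hypothetical job update — IDs and the other field names are placeholders.
curl -X PUT \
  "https://api.cloudflare.com/client/v4/zones/<ZONE_ID>/logpush/jobs/<JOB_ID>" \
  -H "Authorization: Bearer <API_TOKEN>" \
  -H "Content-Type: application/json" \
  --data '{
    "output_options": {
      "field_names": ["ClientIP", "ClientRequestHost", "EdgeResponseStatus", "BotTags"]
    }
  }'
```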

## Limitations

2 changes: 1 addition & 1 deletion src/content/docs/bots/frequently-asked-questions.mdx
@@ -63,7 +63,7 @@ You may also see Managed Challenge due to a triggered [WAF custom rule](/cloudfl

This does not mean that your traffic was blocked. It is the challenge sent to your user to determine whether they are likely human or likely bot.

To understand if the result of the challenge was a success or a failure, you can verify using [Logpush](/logs/about/).
To understand if the result of the challenge was a success or a failure, you can verify using [Logpush](/logs/logpush/).

### Does the WAF run before Super Bot Fight Mode?

@@ -44,7 +44,7 @@ Build custom dashboards to share this information by specifying an individual cu

## Logpush

[Logpush](/logs/about/) sends metadata from Cloudflare products to your cloud storage destination or SIEM.
[Logpush](/logs/logpush/) sends metadata from Cloudflare products to your cloud storage destination or SIEM.

Using [filters](/logs/reference/filters/), you can set sample rates (or not include logs altogether) based on filter criteria. This flexibility allows you to maintain selective logs for custom hostnames without massively increasing your log volume.
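
A hedged sketch of the kind of filter described above, scoping a job to a single custom hostname; the key, operator, and IDs are placeholders to verify against the filters reference linked in that paragraph.

```bash
# Hypothetical: keep only logs for one custom hostname. Sampling is configured on
# the same job (see the filters/job reference for the exact option name).
curl -X PUT \
  "https://api.cloudflare.com/client/v4/zones/<ZONE_ID>/logpush/jobs/<JOB_ID>" \
  -H "Authorization: Bearer <API_TOKEN>" \
  -H "Content-Type: application/json" \
  --data '{
    "filter": "{\"where\":{\"and\":[{\"key\":\"ClientRequestHost\",\"operator\":\"eq\",\"value\":\"shop.example.com\"}]}}"
  }'
```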

@@ -149,7 +149,7 @@ To manually retrieve logs:
Only available on Enterprise plans.
:::

Cloudflare allows you to send SSH command logs to storage destinations configured in [Logpush](/logs/about/), including third-party destinations. For a list of available data fields, refer to the [SSH logs dataset](/logs/reference/log-fields/account/ssh_logs/).
Cloudflare allows you to send SSH command logs to storage destinations configured in [Logpush](/logs/logpush/), including third-party destinations. For a list of available data fields, refer to the [SSH logs dataset](/logs/reference/log-fields/account/ssh_logs/).

To set up the Logpush job, refer to [Logpush integration](/cloudflare-one/insights/logs/logpush/).

@@ -9,7 +9,7 @@ sidebar:
Only available on Enterprise plans.
:::

With Cloudflare's [Logpush](/logs/about/) service, you can configure the automatic export of Zero Trust logs to third-party storage destinations or to security information and event management (SIEM) tools. Once exported, your team can analyze and audit the data as needed.
With Cloudflare's [Logpush](/logs/logpush/) service, you can configure the automatic export of Zero Trust logs to third-party storage destinations or to security information and event management (SIEM) tools. Once exported, your team can analyze and audit the data as needed.

## Export Zero Trust logs with Logpush

@@ -75,7 +75,7 @@ Based on your report, DLP's machine learning will adjust its confidence in futur
Only available on Enterprise plans.
:::

Gateway allows you to send copies of entire HTTP requests matched in HTTP Allow and Block policies to storage destinations configured in [Logpush](/logs/about/), including third-party destinations.
Gateway allows you to send copies of entire HTTP requests matched in HTTP Allow and Block policies to storage destinations configured in [Logpush](/logs/logpush/), including third-party destinations.

To set up the DLP Forensic Copy Logpush job:

2 changes: 1 addition & 1 deletion src/content/docs/containers/faq.mdx
@@ -33,7 +33,7 @@ retained for 3 days on Free plans and 7 days on Paid plans.

See [Workers Logs Pricing](/workers/observability/logs/workers-logs/#pricing) for details on cost.

If you are an Enterprise user, you are able to export container logs via [Logpush](/logs/about)
If you are an Enterprise user, you are able to export container logs via [Logpush](/logs/logpush/)
to your preferred destination.

## How are container instance locations selected?
2 changes: 1 addition & 1 deletion src/content/docs/data-localization/limitations.mdx
@@ -44,7 +44,7 @@ Regional Services does not apply to [Subrequests](/workers/platform/limits/#subr

There are certain limitations and caveats when using Customer Metadata Boundary.

Specifically, most of the Zone Analytics & Logs UI tabs will show up as empty when Customer Metadata Boundary is configured to EU only. It is recommended to use [Security Analytics](/waf/analytics/security-analytics/) in the UI instead, or the [HTTP request](/logs/reference/log-fields/zone/http_requests/) logs via [Logpush](/logs/about/).
Specifically, most of the Zone Analytics & Logs UI tabs will show up as empty when Customer Metadata Boundary is configured to EU only. It is recommended to use [Security Analytics](/waf/analytics/security-analytics/) in the UI instead, or the [HTTP request](/logs/reference/log-fields/zone/http_requests/) logs via [Logpush](/logs/logpush/).

To configure Customer Metadata Boundary to EU only, you must disable Log Retention for all zones within your account. Log Retention is a legacy feature of [Logpull](/logs/logpull/).

@@ -35,7 +35,7 @@ sequenceDiagram

## Log management

Additionally, customers have the option to configure [Logpush](/logs/about/) to push their Customer Logs to various storage services, SIEMs, and log management providers.
Additionally, customers have the option to configure [Logpush](/logs/logpush/) to push their Customer Logs to various storage services, SIEMs, and log management providers.

## Product specific-behavior

@@ -96,7 +96,7 @@ To view traffic flagged by L3/4 Adaptive DDoS Protection rules:
</TabItem>
</Tabs>

You may also obtain information about flagged traffic through [Logpush](/logs/about/) or the [GraphQL API](/analytics/graphql-api/).
You may also obtain information about flagged traffic through [Logpush](/logs/logpush/) or the [GraphQL API](/analytics/graphql-api/).

To determine if an adaptive rule fits your traffic in a way that will only mitigate attack traffic and will not cause false positives, review the traffic that is _Logged_ by the adaptive rules.

2 changes: 1 addition & 1 deletion src/content/docs/dns/dns-firewall/analytics.mdx
@@ -28,7 +28,7 @@ You can also use the DNS Firewall API [reports endpoint](/api/resources/dns_fire

## Logs

You can [set up Logpush](/logs/about/) to deliver [DNS Firewall logs](/logs/reference/log-fields/account/dns_firewall_logs/) to a storage service, SIEM, or log management provider.
You can [set up Logpush](/logs/logpush/) to deliver [DNS Firewall logs](/logs/reference/log-fields/account/dns_firewall_logs/) to a storage service, SIEM, or log management provider.

### Response reasons

@@ -20,4 +20,4 @@ import { Render } from "~/components"

## Retention

Audit Logs are retained for 18 months before being deleted. Enterprise customers can use [Logpush](/logs/about/) to store Audit Logs for longer periods of time.
Audit Logs are retained for 18 months before being deleted. Enterprise customers can use [Logpush](/logs/logpush/) to store Audit Logs for longer periods of time.
@@ -39,6 +39,6 @@ Refer to Magic Firewall's [best practices](/magic-firewall/best-practices/) for

## Optional

- Enable [Logpush](/logs/about/) to your Security Information and Event Management (SIEM).
- Enable [Logpush](/logs/logpush/) to your Security Information and Event Management (SIEM).
- Enable Magic Firewall's [Intrusion Detection System (IDS)](/magic-firewall/about/ids/). Requires Logpush and is only available for accounts with [Advanced Magic Firewall](/magic-firewall/plans/#advanced-features).
- Use [Magic Network Monitoring](/magic-network-monitoring/) for visibility into traffic on your non-Magic Transit prefixes, using NetFlow or sFlow from your CPEs.
@@ -20,7 +20,7 @@ If needed, you can [restore the original visitor's IP address](/support/troubles

## Cloudflare Logs

Enterprise customers can set up [Logpush](/logs/about/) jobs to regularly send Cloudflare logs to the <GlossaryTooltip term="SIEM">SIEM system</GlossaryTooltip> of their choice.
Enterprise customers can set up [Logpush](/logs/logpush/) jobs to regularly send Cloudflare logs to the <GlossaryTooltip term="SIEM">SIEM system</GlossaryTooltip> of their choice.

This data can help when looking at long-term DDoS attack trends or when you need custom visualizations.

2 changes: 1 addition & 1 deletion src/content/docs/logs/logpull/index.mdx
@@ -13,7 +13,7 @@ Cloudflare Logpull is a REST API for consuming request logs over HTTP. These log
:::caution


Logpull is considered a legacy feature and we recommend using [Logpush](/logs/about/) or [Logs Engine](/logs/r2-log-retrieval/) instead for better performance and functionality.
Logpull is considered a legacy feature and we recommend using [Logpush](/logs/logpush/) or [Logs Engine](/logs/r2-log-retrieval/) instead for better performance and functionality.


:::
@@ -54,4 +54,4 @@ The following is a sample log with default fields:

## Data retention period

You can query for logs starting from 1 minute in the past (relative to the actual time that you make the query) and go back at least 3 days and up to 7 days. For longer durations, we recommend using [Logpush](/logs/about/).
You can query for logs starting from 1 minute in the past (relative to the actual time that you make the query) and go back at least 3 days and up to 7 days. For longer durations, we recommend using [Logpush](/logs/logpush/).
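
For reference, a minimal Logpull request within the retention window described above; the zone ID, token, time range, and field list are placeholders.

```bash
# Pulls five minutes of HTTP request logs; adjust the window to stay within
# the 1-minute-to-7-day bounds described above.
curl -s \
  "https://api.cloudflare.com/client/v4/zones/<ZONE_ID>/logs/received?start=2025-01-15T10:00:00Z&end=2025-01-15T10:05:00Z&fields=ClientIP,ClientRequestHost,EdgeResponseStatus" \
  -H "Authorization: Bearer <API_TOKEN>"
```
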
4 changes: 2 additions & 2 deletions src/content/docs/page-shield/policies/violations.mdx
@@ -15,7 +15,7 @@ Only available to Enterprise customers with a paid add-on.

Shortly after you configure policies (or content security rules), the Cloudflare dashboard will start displaying any violations of those policies. This information will be available for policies with any [action](/page-shield/policies/#policy-actions) (_Allow_ and _Log_).

Information about policy violations is also available via [GraphQL API](/analytics/graphql-api/) and [Logpush](/logs/about/).
Information about policy violations is also available via [GraphQL API](/analytics/graphql-api/) and [Logpush](/logs/logpush/).

## Review policy violations in the dashboard

@@ -128,7 +128,7 @@ https://api.cloudflare.com/client/v4/graphql \

## Get policy violations via Logpush

[Cloudflare Logpush](/logs/about/) supports pushing logs to storage services, <GlossaryTooltip term="SIEM">SIEM systems</GlossaryTooltip>, and log management providers.
[Cloudflare Logpush](/logs/logpush/) supports pushing logs to storage services, <GlossaryTooltip term="SIEM">SIEM systems</GlossaryTooltip>, and log management providers.

Information about policy violations is available in the [`page_shield_events` dataset](/logs/reference/log-fields/zone/page_shield_events/).
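
As a sketch only, a zone-scoped job for that dataset; the IDs and destination are placeholders, and the dataset name is taken from the log-fields link above.

```bash
# Hypothetical zone-scoped Logpush job for Page Shield violations.
curl -X POST \
  "https://api.cloudflare.com/client/v4/zones/<ZONE_ID>/logpush/jobs" \
  -H "Authorization: Bearer <API_TOKEN>" \
  -H "Content-Type: application/json" \
  --data '{
    "name": "page-shield-violations",
    "dataset": "page_shield_events",
    "destination_conf": "s3://example-log-bucket/page-shield?region=us-east-1",
    "enabled": true
  }'
```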

@@ -87,7 +87,7 @@ With [Workers AI](/workers-ai/), developers can run popular open-source models f

### 9. Cloudflare Observability

Send logs from all services with [Logpush](/logs/about/), gather insights with [Workers Logs](/workers/observability/logs/) directly in the Cloudflare dashboard, collect custom metrics from Workers using [Workers Analytics Engine](/analytics/analytics-engine/), or observe and control AI applications with [AI Gateway](/ai-gateway/).
Send logs from all services with [Logpush](/logs/logpush/), gather insights with [Workers Logs](/workers/observability/logs/) directly in the Cloudflare dashboard, collect custom metrics from Workers using [Workers Analytics Engine](/analytics/analytics-engine/), or observe and control AI applications with [AI Gateway](/ai-gateway/).

### 10. External Logs & Analytics

2 changes: 1 addition & 1 deletion src/content/docs/rules/transform/troubleshooting.mdx
@@ -16,6 +16,6 @@ For more information on runtime errors related to Transform Rules configuration,

Transform Rules performing request header modifications affect the HTTP headers sent by Cloudflare's network to your origin server. You will not find these headers in your browser request or response data, which can make it difficult to tell if the rule is working as intended.

To check if a request header transform rule is taking effect, you can check the logs on your origin server or use [Cloudflare Trace](/rules/trace-request/) to check that the rule is matching traffic correctly. Since [Cloudflare Logpush](/logs/about/) only logs original HTTP request/response headers, Logpush logs will not include any header transformations done via Transform Rules.
To check if a request header transform rule is taking effect, you can check the logs on your origin server or use [Cloudflare Trace](/rules/trace-request/) to check that the rule is matching traffic correctly. Since [Cloudflare Logpush](/logs/logpush/) only logs original HTTP request/response headers, Logpush logs will not include any header transformations done via Transform Rules.

To add HTTP headers that website visitors will receive in their browsers, you must [modify the response headers](/rules/transform/response-header-modification/) instead.
2 changes: 1 addition & 1 deletion src/content/docs/style-guide/how-we-docs/redirects.mdx
@@ -52,7 +52,7 @@ We have some automation to help [flag needed redirects](#potential-redirects).

Another time to add redirects is when you see a lot of `404` response codes on certain paths of your docs site. These `404` responses might be due to a missing redirect or mistyped link.

We identify these status codes either through our [Cloudflare analytics](/analytics/account-and-zone-analytics/zone-analytics/) (ad hoc) or [Logpush job](/logs/about/) (more thorough, quarterly).
We identify these status codes either through our [Cloudflare analytics](/analytics/account-and-zone-analytics/zone-analytics/) (ad hoc) or [Logpush job](/logs/logpush/) (more thorough, quarterly).

---

@@ -7,7 +7,7 @@ sidebar:

import { GlossaryTooltip, RuleID } from "~/components";

You can include the encrypted matched payload in your [Logpush](/logs/about/) jobs by adding the **General** > [**Metadata**](/logs/reference/log-fields/zone/firewall_events/#metadata) field from the Firewall Events dataset to your job.
You can include the encrypted matched payload in your [Logpush](/logs/logpush/) jobs by adding the **General** > [**Metadata**](/logs/reference/log-fields/zone/firewall_events/#metadata) field from the Firewall Events dataset to your job.

The payload, in its encrypted form, is available in the [`encrypted_matched_data` property](#structure-of-encrypted_matched_data-property-in-logpush) of the `Metadata` field.

@@ -38,7 +38,7 @@ To view the content of the payload in clear text, do one of the following:

- Decrypt the payload in the command line using the `matched-data-cli` tool. Refer to [Decrypt the payload content in the command line](/waf/managed-rules/payload-logging/command-line/decrypt-payload/) for details.

- Decrypt the matched payload in your [Logpush](/logs/about/) job using a Worker before storing the logs in your <GlossaryTooltip term="SIEM">SIEM system</GlossaryTooltip>. Refer to [Store decrypted matched payloads in logs](/waf/managed-rules/payload-logging/decrypt-in-logs/) for details.
- Decrypt the matched payload in your [Logpush](/logs/logpush/) job using a Worker before storing the logs in your <GlossaryTooltip term="SIEM">SIEM system</GlossaryTooltip>. Refer to [Store decrypted matched payloads in logs](/waf/managed-rules/payload-logging/decrypt-in-logs/) for details.

:::caution[Important]

2 changes: 1 addition & 1 deletion src/content/docs/workers/observability/logs/logpush.mdx
@@ -11,7 +11,7 @@ sidebar:

import { WranglerConfig } from "~/components";

[Cloudflare Logpush](/logs/about/) supports the ability to send [Workers Trace Event Logs](/logs/reference/log-fields/account/workers_trace_events/) to a [supported destination](/logs/get-started/enable-destinations/). Worker’s Trace Events Logpush includes metadata about requests and responses, unstructured `console.log()` messages and any uncaught exceptions. This product is available on the Workers Paid plan. For pricing information, refer to [Pricing](/workers/platform/pricing/#workers-trace-events-logpush).
[Cloudflare Logpush](/logs/logpush/) supports the ability to send [Workers Trace Event Logs](/logs/reference/log-fields/account/workers_trace_events/) to a [supported destination](/logs/get-started/enable-destinations/). Worker’s Trace Events Logpush includes metadata about requests and responses, unstructured `console.log()` messages and any uncaught exceptions. This product is available on the Workers Paid plan. For pricing information, refer to [Pricing](/workers/platform/pricing/#workers-trace-events-logpush).
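
A hedged sketch of enabling this per Worker: the `logpush = true` setting goes in the Worker's Wrangler configuration and the Worker is redeployed (the file layout and commands below are assumptions).

```bash
# Append the flag to an existing wrangler.toml and redeploy — illustrative only.
cat >> wrangler.toml <<'EOF'
logpush = true
EOF
npx wrangler deploy
```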

:::caution
