diff --git a/public/__redirects b/public/__redirects
index 214e006aeb991f..052b8d75117e80 100644
--- a/public/__redirects
+++ b/public/__redirects
@@ -569,7 +569,7 @@
/fundamentals/speed/aim/ /speed/aim/ 301
/fundamentals/speed/optimization/ /speed/optimization/ 301
/fundamentals/speed/prefetch-urls/ /speed/optimization/content/prefetch-urls/ 301
-/fundamentals/data-products/analytics-integrations/sumo-logic/ /logs/get-started/enable-destinations/sumo-logic/ 301
+/fundamentals/data-products/analytics-integrations/sumo-logic/ /logs/logpush/logpush-job/enable-destinations/sumo-logic/ 301
/support/account-management-billing/account-management/adding-multiple-sites-to-cloudflare-via-automation/ /fundamentals/manage-domains/add-multiple-sites-automation/ 301
/support/account-management-billing/account-privacy-and-security/securing-user-access-with-two-factor-authentication-2fa/ /fundamentals/user-profiles/2fa/ 301
/support/account-management-billing/account-privacy-and-security/multi-factor-email-authentication/ /fundamentals/user-profiles/multi-factor-email-authentication/ 301
@@ -957,13 +957,13 @@
/logs/log-fields/ /logs/reference/log-fields/ 301
/logs/logpull-api/ /logs/logpull/ 301
/logs/logpull-api/requesting-logs/ /logs/logpull/requesting-logs/ 301
-/logs/logpush/aws-s3/ /logs/get-started/enable-destinations/aws-s3/ 301
-/logs/logpush/azure/ /logs/get-started/enable-destinations/azure/ 301
-/logs/logpush/google-cloud-storage/ /logs/get-started/enable-destinations/google-cloud-storage/ 301
-/logs/logpush/logpush-configuration-api/ /logs/get-started/enable-destinations/ 301
+/logs/logpush/aws-s3/ /logs/logpush/logpush-job/enable-destinations/aws-s3/ 301
+/logs/logpush/azure/ /logs/logpush/logpush-job/enable-destinations/azure/ 301
+/logs/logpush/google-cloud-storage/ /logs/logpush/logpush-job/enable-destinations/google-cloud-storage/ 301
+/logs/logpush/logpush-configuration-api/ /logs/logpush/logpush-job/enable-destinations/ 301
/logs/logpush/logpush-configuration-api/understanding-logpush-api/ /logs/get-started/api-configuration/ 301
-/logs/logpush/logpush-dashboard/ /logs/get-started/enable-destinations/ 301
-/logs/logpush/s3-compatible-endpoints/ /logs/get-started/enable-destinations/s3-compatible-endpoints/ 301
+/logs/logpush/logpush-dashboard/ /logs/logpush/logpush-job/enable-destinations/ 301
+/logs/logpush/s3-compatible-endpoints/ /logs/logpush/logpush-job/enable-destinations/s3-compatible-endpoints/ 301
/logs/reference/logpush-api-configuration/ /logs/get-started/api-configuration/ 301
/logs/reference/logpush-api-configuration/filters/ /logs/reference/filters/ 301
# Non-slashed version is being used in the Cloudflare dashboard
@@ -2177,6 +2177,7 @@
/fundamentals/setup/account/* /fundamentals/account/:splat 301
/fundamentals/setup/manage-domains/* /fundamentals/manage-domains/:splat 301
/fundamentals/setup/manage-members/* /fundamentals/manage-members/:splat 301
+/logs/get-started/enable-destinations/* /logs/logpush/logpush-job/enable-destinations/:splat 301
# Cloudflare One / Zero Trust
/cloudflare-one/connections/connect-networks/install-and-setup/tunnel-guide/local/as-a-service/* /cloudflare-one/connections/connect-networks/configure-tunnels/local-management/as-a-service/:splat 301
diff --git a/src/content/changelog/logs/2025-03-06-oneclick-logpush.mdx b/src/content/changelog/logs/2025-03-06-oneclick-logpush.mdx
index 2a8d203bb7d9c3..cda9cdfafce7a0 100644
--- a/src/content/changelog/logs/2025-03-06-oneclick-logpush.mdx
+++ b/src/content/changelog/logs/2025-03-06-oneclick-logpush.mdx
@@ -10,6 +10,6 @@ Now, you no longer need to navigate multiple pages to manually create an R2 buck
This enhancement makes it easier for customers to adopt Logpush and R2.
-For more details refer to our [Logs](/logs/get-started/enable-destinations/r2/) documentation.
+For more details, refer to our [Logs](/logs/logpush/logpush-job/enable-destinations/r2/) documentation.

diff --git a/src/content/docs/analytics/analytics-integrations/new-relic.mdx b/src/content/docs/analytics/analytics-integrations/new-relic.mdx
index ddc75f6d7b889d..5f27dc48e6b1d5 100644
--- a/src/content/docs/analytics/analytics-integrations/new-relic.mdx
+++ b/src/content/docs/analytics/analytics-integrations/new-relic.mdx
@@ -13,7 +13,7 @@ Before sending your Cloudflare log data to New Relic, make sure that you:
- Have a Cloudflare Enterprise account with Cloudflare Logs enabled.
- Have a New Relic account.
-- Configure [Logpush to New Relic](/logs/get-started/enable-destinations/new-relic/).
+- Configure [Logpush to New Relic](/logs/logpush/logpush-job/enable-destinations/new-relic/).
## Task 1 - Install the Cloudflare Network Logs quickstart
diff --git a/src/content/docs/analytics/analytics-integrations/sentinel.mdx b/src/content/docs/analytics/analytics-integrations/sentinel.mdx
index 7f851a0992c485..01558bac718ba6 100644
--- a/src/content/docs/analytics/analytics-integrations/sentinel.mdx
+++ b/src/content/docs/analytics/analytics-integrations/sentinel.mdx
@@ -10,7 +10,7 @@ Microsoft has developed a Cloudflare connector that allows their customers to in
## How it works
-[Logpush](/logs/get-started/enable-destinations/azure/) sends logs from Cloudflare to Azure Blob Storage. From there, the Cloudflare connector, a Microsoft function, ingests these logs into Azure Log Analytics Workspace, making them available for monitoring and analysis in Microsoft Sentinel.
+[Logpush](/logs/logpush/logpush-job/enable-destinations/azure/) sends logs from Cloudflare to Azure Blob Storage. From there, the Cloudflare connector, a Microsoft function, ingests these logs into Azure Log Analytics Workspace, making them available for monitoring and analysis in Microsoft Sentinel.

diff --git a/src/content/docs/analytics/analytics-integrations/splunk.mdx b/src/content/docs/analytics/analytics-integrations/splunk.mdx
index 6c82f03ca676fa..25d02d389ceb38 100644
--- a/src/content/docs/analytics/analytics-integrations/splunk.mdx
+++ b/src/content/docs/analytics/analytics-integrations/splunk.mdx
@@ -86,9 +86,9 @@ You can also manually configure Data Models by going to **Settings** > **Data mo
## Task 2 - Make the API call to create the Logpush job
-Create the Logpush job by following the instructions on [Enable Logpush to Splunk](/logs/get-started/enable-destinations/splunk/). The API call creates a Logpush job but does not enable it.
+Create the Logpush job by following the instructions on [Enable Logpush to Splunk](/logs/logpush/logpush-job/enable-destinations/splunk/). The API call creates a Logpush job but does not enable it.
-Enable the Logpush job through the Cloudflare dashboard or through the API by following the instructions on [Enable Logpush to Splunk](/logs/get-started/enable-destinations/splunk/). To enable through the dashboard:
+Enable the Logpush job through the Cloudflare dashboard or through the API by following the instructions on [Enable Logpush to Splunk](/logs/logpush/logpush-job/enable-destinations/splunk/). To enable through the dashboard:
1. Navigate to the Cloudflare dashboard and select **Analytics & Logs** > **Logs**.
2. Select **Edit** and select the fields referenced in the Dashboard section below to fully populate all tables and graphs.
diff --git a/src/content/docs/analytics/network-analytics/get-started.mdx b/src/content/docs/analytics/network-analytics/get-started.mdx
index 29525fefd23cd0..80167ff9631d6f 100644
--- a/src/content/docs/analytics/network-analytics/get-started.mdx
+++ b/src/content/docs/analytics/network-analytics/get-started.mdx
@@ -28,7 +28,7 @@ Use the [GraphQL Analytics API](/analytics/graphql-api/) to query data using the
## Send Network Analytics logs to a third-party service
-[Create a Logpush job](/logs/get-started/enable-destinations/) that sends Network analytics logs to your storage service, SIEM solution, or log management provider.
+[Create a Logpush job](/logs/logpush/logpush-job/enable-destinations/) that sends Network analytics logs to your storage service, SIEM solution, or log management provider.
## Limitations
diff --git a/src/content/docs/cloudflare-one/insights/logs/enable-logs.mdx b/src/content/docs/cloudflare-one/insights/logs/enable-logs.mdx
index e0b04948b80e3a..766384c98cdd36 100644
--- a/src/content/docs/cloudflare-one/insights/logs/enable-logs.mdx
+++ b/src/content/docs/cloudflare-one/insights/logs/enable-logs.mdx
@@ -11,7 +11,7 @@ Email Security allows you to configure Logpush to send detection data to an endp
Detection logs generate logs made by Email Security and some of the metadata associated with the detection.
-To enable detection logs, refer to [Enable destinations](/logs/get-started/enable-destinations/).
+To enable detection logs, refer to [Enable destinations](/logs/logpush/logpush-job/enable-destinations/).
If you enable detection logs using [R2](/r2/), choose **Email security alerts** when configuring the **Dataset**.
@@ -19,7 +19,7 @@ If you enable detection logs using [R2](/r2/), choose **Email security alerts**
User action logs allow you to view logs regarding all actions taken via the [API](/api/resources/email_security/) or the dashboard.
-Before you can enable audit logs for Email Security, you will have to enable logpush jobs to your storage destination. Refer to [Enable destinations](/logs/get-started/enable-destinations/) to enable logs on destinations such as Cloudflare R2, HTTP, Amazon S3, and more.
+Before you can enable audit logs for Email Security, you will have to enable logpush jobs to your storage destination. Refer to [Enable destinations](/logs/logpush/logpush-job/enable-destinations/) to enable logs on destinations such as Cloudflare R2, HTTP, Amazon S3, and more.
Once you have configured your destination, you can set up audit logs for user action:
diff --git a/src/content/docs/cloudflare-one/insights/logs/logpush.mdx b/src/content/docs/cloudflare-one/insights/logs/logpush.mdx
index 8d364502034815..f68fd85035c7a0 100644
--- a/src/content/docs/cloudflare-one/insights/logs/logpush.mdx
+++ b/src/content/docs/cloudflare-one/insights/logs/logpush.mdx
@@ -14,7 +14,7 @@ With Cloudflare's [Logpush](/logs/logpush/) service, you can configure the autom
## Export Zero Trust logs with Logpush
:::caution[Dashboard limitation]
-Zero Trust does not support configuring [Cloudflare R2](/logs/get-started/enable-destinations/r2/) as a Logpush destination in the dashboard. To use R2 as a destination for Zero Trust logs, configure your Logpush jobs [with the API](/logs/get-started/enable-destinations/r2/#manage-via-api).
+Zero Trust does not support configuring [Cloudflare R2](/logs/logpush/logpush-job/enable-destinations/r2/) as a Logpush destination in the dashboard. To use R2 as a destination for Zero Trust logs, configure your Logpush jobs [with the API](/logs/logpush/logpush-job/enable-destinations/r2/#manage-via-api).
:::
To configure Logpush for Zero Trust logs:
@@ -22,7 +22,7 @@ To configure Logpush for Zero Trust logs:
1. In [Zero Trust](https://one.dash.cloudflare.com/), go to **Logs** > **Logpush**.
2. If this is your first Logpush job, select **Add a Logpush job**. Otherwise, select **Go to logpush configurations**.
3. In Logpush, select **Create a Logpush job**.
-4. Choose a [Logpush destination](/logs/get-started/enable-destinations/).
+4. Choose a [Logpush destination](/logs/logpush/logpush-job/enable-destinations/).
5. Follow the service-specific instructions to configure and validate your destination.
6. Choose the [Zero Trust datasets](#zero-trust-datasets) to export.
7. Enter a **Job name**, any [filters](/logs/reference/filters/) you would like to add, and the data fields you want to include in the logs.
diff --git a/src/content/docs/cloudflare-one/policies/data-loss-prevention/dlp-policies/logging-options.mdx b/src/content/docs/cloudflare-one/policies/data-loss-prevention/dlp-policies/logging-options.mdx
index 2bddf4c48d2a53..8845381178da60 100644
--- a/src/content/docs/cloudflare-one/policies/data-loss-prevention/dlp-policies/logging-options.mdx
+++ b/src/content/docs/cloudflare-one/policies/data-loss-prevention/dlp-policies/logging-options.mdx
@@ -82,7 +82,7 @@ To set up the DLP Forensic Copy Logpush job:
1. In [Zero Trust](https://one.dash.cloudflare.com/), go to **Logs** > **Logpush**.
2. If this is your first Logpush job, select **Add a Logpush job**. Otherwise, select **Go to logpush configurations**.
3. In Logpush, select **Create a Logpush job**.
-4. Choose a [Logpush destination](/logs/get-started/enable-destinations/).
+4. Choose a [Logpush destination](/logs/logpush/logpush-job/enable-destinations/).
5. In **Configure logpush job**, choose the _DLP forensic copies_ dataset. Select **Create Logpush job**.
6. Return to Zero Trust and go to **Gateway** > **Firewall policies** > **HTTP**.
7. Edit an existing Allow or Block policy, or [create a new policy](/cloudflare-one/policies/data-loss-prevention/dlp-policies/#2-create-a-dlp-policy). Your policy does not need to include a DLP profile.
diff --git a/src/content/docs/data-localization/how-to/r2.mdx b/src/content/docs/data-localization/how-to/r2.mdx
index 2eb543f1f94bb3..c02906c31cd263 100644
--- a/src/content/docs/data-localization/how-to/r2.mdx
+++ b/src/content/docs/data-localization/how-to/r2.mdx
@@ -79,7 +79,7 @@ This command will output a hash similar to `dxxxx391b`.
-3. Set up a Logpush destination using [S3-compatible endpoint](/logs/get-started/enable-destinations/s3-compatible-endpoints/) and fill in the following fields:
+3. Set up a Logpush destination using [S3-compatible endpoint](/logs/logpush/logpush-job/enable-destinations/s3-compatible-endpoints/) and fill in the following fields:
- **Bucket**: Enter the name of the R2 bucket you created with the jurisdiction you would like to use.
- **Path** (optional): If you want, you can specify a folder path to organize your logs.
diff --git a/src/content/docs/logs/R2-log-retrieval.mdx b/src/content/docs/logs/R2-log-retrieval.mdx
index 1b3a6e65673333..e7b8d612a259a0 100644
--- a/src/content/docs/logs/R2-log-retrieval.mdx
+++ b/src/content/docs/logs/R2-log-retrieval.mdx
@@ -16,7 +16,7 @@ Logs Engine is going to be replaced by Log Explorer. For further details, consul
## Store logs in R2
-- Set up a [Logpush to R2](/logs/get-started/enable-destinations/r2/) job.
+- Set up a [Logpush to R2](/logs/logpush/logpush-job/enable-destinations/r2/) job.
- Create an [R2 access key](/r2/api/tokens/) with at least R2 read permissions.
- Ensure that you have Logshare read permissions.
- Alternatively, create a Cloudflare API token with the following permissions:
diff --git a/src/content/docs/logs/get-started/index.mdx b/src/content/docs/logs/get-started/index.mdx
index 221566aca666cb..13f67ab993be7e 100644
--- a/src/content/docs/logs/get-started/index.mdx
+++ b/src/content/docs/logs/get-started/index.mdx
@@ -12,4 +12,4 @@ Cloudflare Logpush supports pushing logs to storage services, SIEMs, and log man
Cloudflare aims to support additional services in the future. Interested in a particular service? Take this [survey](https://goo.gl/forms/0KpMfae63WMPjBmD2).
- Enable destinations
+ Enable destinations
diff --git a/src/content/docs/logs/get-started/enable-destinations/aws-s3.mdx b/src/content/docs/logs/logpush/logpush-job/enable-destinations/aws-s3.mdx
similarity index 100%
rename from src/content/docs/logs/get-started/enable-destinations/aws-s3.mdx
rename to src/content/docs/logs/logpush/logpush-job/enable-destinations/aws-s3.mdx
diff --git a/src/content/docs/logs/get-started/enable-destinations/azure.mdx b/src/content/docs/logs/logpush/logpush-job/enable-destinations/azure.mdx
similarity index 100%
rename from src/content/docs/logs/get-started/enable-destinations/azure.mdx
rename to src/content/docs/logs/logpush/logpush-job/enable-destinations/azure.mdx
diff --git a/src/content/docs/logs/get-started/enable-destinations/bigquery.mdx b/src/content/docs/logs/logpush/logpush-job/enable-destinations/bigquery.mdx
similarity index 100%
rename from src/content/docs/logs/get-started/enable-destinations/bigquery.mdx
rename to src/content/docs/logs/logpush/logpush-job/enable-destinations/bigquery.mdx
diff --git a/src/content/docs/logs/get-started/enable-destinations/datadog.mdx b/src/content/docs/logs/logpush/logpush-job/enable-destinations/datadog.mdx
similarity index 100%
rename from src/content/docs/logs/get-started/enable-destinations/datadog.mdx
rename to src/content/docs/logs/logpush/logpush-job/enable-destinations/datadog.mdx
diff --git a/src/content/docs/logs/get-started/enable-destinations/elastic.mdx b/src/content/docs/logs/logpush/logpush-job/enable-destinations/elastic.mdx
similarity index 85%
rename from src/content/docs/logs/get-started/enable-destinations/elastic.mdx
rename to src/content/docs/logs/logpush/logpush-job/enable-destinations/elastic.mdx
index a96f8f68a4c230..81e8ce127523bd 100644
--- a/src/content/docs/logs/get-started/enable-destinations/elastic.mdx
+++ b/src/content/docs/logs/logpush/logpush-job/enable-destinations/elastic.mdx
@@ -33,9 +33,9 @@ Determine which method you want to use, and configure the appropriate Logpush jo
Elastic supports the default JSON format.
-To push logs to an object storage for short term storage and buffering before ingesting into Elastic (recommended), follow the instructions to configure a Logpush job to push logs to [AWS S3](/logs/get-started/enable-destinations/aws-s3/), [Google Cloud Storage](/logs/get-started/enable-destinations/google-cloud-storage/), or [Azure Blob Storage](/logs/get-started/enable-destinations/azure/).
+To push logs to an object storage for short term storage and buffering before ingesting into Elastic (recommended), follow the instructions to configure a Logpush job to push logs to [AWS S3](/logs/logpush/logpush-job/enable-destinations/aws-s3/), [Google Cloud Storage](/logs/logpush/logpush-job/enable-destinations/google-cloud-storage/), or [Azure Blob Storage](/logs/logpush/logpush-job/enable-destinations/azure/).
-To use the [HTTP Endpoint mode](/logs/get-started/enable-destinations/http/), use the API to push logs to an HTTP endpoint backed by your Elastic Agent.
+To use the [HTTP Endpoint mode](/logs/logpush/logpush-job/enable-destinations/http/), use the API to push logs to an HTTP endpoint backed by your Elastic Agent.
Add the same custom header along with its value on both sides for additional security.
diff --git a/src/content/docs/logs/get-started/enable-destinations/google-cloud-storage.mdx b/src/content/docs/logs/logpush/logpush-job/enable-destinations/google-cloud-storage.mdx
similarity index 100%
rename from src/content/docs/logs/get-started/enable-destinations/google-cloud-storage.mdx
rename to src/content/docs/logs/logpush/logpush-job/enable-destinations/google-cloud-storage.mdx
diff --git a/src/content/docs/logs/get-started/enable-destinations/http.mdx b/src/content/docs/logs/logpush/logpush-job/enable-destinations/http.mdx
similarity index 100%
rename from src/content/docs/logs/get-started/enable-destinations/http.mdx
rename to src/content/docs/logs/logpush/logpush-job/enable-destinations/http.mdx
diff --git a/src/content/docs/logs/get-started/enable-destinations/ibm-cloud-logs.mdx b/src/content/docs/logs/logpush/logpush-job/enable-destinations/ibm-cloud-logs.mdx
similarity index 100%
rename from src/content/docs/logs/get-started/enable-destinations/ibm-cloud-logs.mdx
rename to src/content/docs/logs/logpush/logpush-job/enable-destinations/ibm-cloud-logs.mdx
diff --git a/src/content/docs/logs/get-started/enable-destinations/ibm-qradar.mdx b/src/content/docs/logs/logpush/logpush-job/enable-destinations/ibm-qradar.mdx
similarity index 88%
rename from src/content/docs/logs/get-started/enable-destinations/ibm-qradar.mdx
rename to src/content/docs/logs/logpush/logpush-job/enable-destinations/ibm-qradar.mdx
index d3367c43159e71..34234f4c734e98 100644
--- a/src/content/docs/logs/get-started/enable-destinations/ibm-qradar.mdx
+++ b/src/content/docs/logs/logpush/logpush-job/enable-destinations/ibm-qradar.mdx
@@ -9,12 +9,12 @@ import { APIRequest } from "~/components"
To configure a QRadar/Cloudflare integration you have the option to use one of the following methods:
-- [HTTP Receiver protocol](/logs/get-started/enable-destinations/ibm-qradar/#http-receiver-protocol)
-- [Amazon AWS S3 Rest API](/logs/get-started/enable-destinations/ibm-qradar/#amazon-aws-s3-rest-api)
+- [HTTP Receiver protocol](/logs/logpush/logpush-job/enable-destinations/ibm-qradar/#http-receiver-protocol)
+- [Amazon AWS S3 Rest API](/logs/logpush/logpush-job/enable-destinations/ibm-qradar/#amazon-aws-s3-rest-api)
## HTTP Receiver Protocol
-To send Cloudflare logs to QRadar you need to create a [Logpush job to HTTP endpoints](/logs/get-started/enable-destinations/http/) via API. Below you can find two curl examples of how to send Cloudflare Firewall events and Cloudflare HTTP events to QRadar.
+To send Cloudflare logs to QRadar you need to create a [Logpush job to HTTP endpoints](/logs/logpush/logpush-job/enable-destinations/http/) via API. Below you can find two curl examples of how to send Cloudflare Firewall events and Cloudflare HTTP events to QRadar.
### Cloudflare Firewall events
@@ -79,7 +79,7 @@ Cloudflare checks the accessibility of the IP address, port, and validates the c
When you use the Amazon S3 REST API protocol, IBM QRadar collects Cloudflare Log events from an Amazon S3 bucket. To use this option, you need to:
1. Create an [Amazon S3 bucket](https://docs.aws.amazon.com/AmazonS3/latest/userguide/creating-bucket.html) to store your Cloudflare Logs. Make a note of the bucket name and the AWS access key ID and secret access key with sufficient permissions to write to the bucket.
-2. [Enable a Logpush to Amazon S3](/logs/get-started/enable-destinations/aws-s3/).
+2. [Enable a Logpush to Amazon S3](/logs/logpush/logpush-job/enable-destinations/aws-s3/).
3. In the AWS Management Console, go to the Amazon S3 service. Create a bucket endpoint to allow Cloudflare to send logs directly to the S3 bucket.
4. Follow the steps in [Integrate Cloudflare Logs with QRadar by using the Amazon AWS S3 REST API protocol](https://www.ibm.com/docs/en/dsm?topic=configuration-cloudflare-logs).
5. Test the configuration by generating some logs in Cloudflare and ensuring that they are delivered to the S3 bucket and subsequently forwarded to QRadar.
diff --git a/src/content/docs/logs/get-started/enable-destinations/index.mdx b/src/content/docs/logs/logpush/logpush-job/enable-destinations/index.mdx
similarity index 100%
rename from src/content/docs/logs/get-started/enable-destinations/index.mdx
rename to src/content/docs/logs/logpush/logpush-job/enable-destinations/index.mdx
diff --git a/src/content/docs/logs/get-started/enable-destinations/new-relic.mdx b/src/content/docs/logs/logpush/logpush-job/enable-destinations/new-relic.mdx
similarity index 100%
rename from src/content/docs/logs/get-started/enable-destinations/new-relic.mdx
rename to src/content/docs/logs/logpush/logpush-job/enable-destinations/new-relic.mdx
diff --git a/src/content/docs/logs/get-started/enable-destinations/other-providers.mdx b/src/content/docs/logs/logpush/logpush-job/enable-destinations/other-providers.mdx
similarity index 79%
rename from src/content/docs/logs/get-started/enable-destinations/other-providers.mdx
rename to src/content/docs/logs/logpush/logpush-job/enable-destinations/other-providers.mdx
index ae18386db4de1f..f78a5f64f45d92 100644
--- a/src/content/docs/logs/get-started/enable-destinations/other-providers.mdx
+++ b/src/content/docs/logs/logpush/logpush-job/enable-destinations/other-providers.mdx
@@ -14,7 +14,7 @@ Cloudflare Logpush supports pushing logs to a limited set of services providers.
## Manage via the Cloudflare dashboard
-Refer to [Enable destinations](/logs/get-started/enable-destinations/) for the list of services you can configure to use with Logpush through the Cloudflare dashboard. Interested in a different service? Take this [survey](https://docs.google.com/forms/d/e/1FAIpQLScwOSabROywVajpMX2ZYCVl3saYs11cP4NIC8QR-wmOAnxOtA/viewform).
+Refer to [Enable destinations](/logs/logpush/logpush-job/enable-destinations/) for the list of services you can configure to use with Logpush through the Cloudflare dashboard. Interested in a different service? Take this [survey](https://docs.google.com/forms/d/e/1FAIpQLScwOSabROywVajpMX2ZYCVl3saYs11cP4NIC8QR-wmOAnxOtA/viewform).
## Manage via API
diff --git a/src/content/docs/logs/get-started/enable-destinations/r2.mdx b/src/content/docs/logs/logpush/logpush-job/enable-destinations/r2.mdx
similarity index 100%
rename from src/content/docs/logs/get-started/enable-destinations/r2.mdx
rename to src/content/docs/logs/logpush/logpush-job/enable-destinations/r2.mdx
diff --git a/src/content/docs/logs/get-started/enable-destinations/s3-compatible-endpoints.mdx b/src/content/docs/logs/logpush/logpush-job/enable-destinations/s3-compatible-endpoints.mdx
similarity index 100%
rename from src/content/docs/logs/get-started/enable-destinations/s3-compatible-endpoints.mdx
rename to src/content/docs/logs/logpush/logpush-job/enable-destinations/s3-compatible-endpoints.mdx
diff --git a/src/content/docs/logs/get-started/enable-destinations/splunk.mdx b/src/content/docs/logs/logpush/logpush-job/enable-destinations/splunk.mdx
similarity index 100%
rename from src/content/docs/logs/get-started/enable-destinations/splunk.mdx
rename to src/content/docs/logs/logpush/logpush-job/enable-destinations/splunk.mdx
diff --git a/src/content/docs/logs/get-started/enable-destinations/sumo-logic.mdx b/src/content/docs/logs/logpush/logpush-job/enable-destinations/sumo-logic.mdx
similarity index 100%
rename from src/content/docs/logs/get-started/enable-destinations/sumo-logic.mdx
rename to src/content/docs/logs/logpush/logpush-job/enable-destinations/sumo-logic.mdx
diff --git a/src/content/docs/logs/get-started/enable-destinations/third-party/axiom.mdx b/src/content/docs/logs/logpush/logpush-job/enable-destinations/third-party/axiom.mdx
similarity index 100%
rename from src/content/docs/logs/get-started/enable-destinations/third-party/axiom.mdx
rename to src/content/docs/logs/logpush/logpush-job/enable-destinations/third-party/axiom.mdx
diff --git a/src/content/docs/logs/get-started/enable-destinations/third-party/index.mdx b/src/content/docs/logs/logpush/logpush-job/enable-destinations/third-party/index.mdx
similarity index 100%
rename from src/content/docs/logs/get-started/enable-destinations/third-party/index.mdx
rename to src/content/docs/logs/logpush/logpush-job/enable-destinations/third-party/index.mdx
diff --git a/src/content/docs/logs/get-started/enable-destinations/third-party/taegis.mdx b/src/content/docs/logs/logpush/logpush-job/enable-destinations/third-party/taegis.mdx
similarity index 100%
rename from src/content/docs/logs/get-started/enable-destinations/third-party/taegis.mdx
rename to src/content/docs/logs/logpush/logpush-job/enable-destinations/third-party/taegis.mdx
diff --git a/src/content/docs/logs/logpush/logpush-job/index.mdx b/src/content/docs/logs/logpush/logpush-job/index.mdx
new file mode 100644
index 00000000000000..53cc050871ceb1
--- /dev/null
+++ b/src/content/docs/logs/logpush/logpush-job/index.mdx
@@ -0,0 +1,7 @@
+---
+pcx_content_type: how-to
+title: Logpush job setup
+sidebar:
+ order: 1
+
+---
diff --git a/src/content/docs/magic-firewall/packet-captures/pcaps-bucket-setup.mdx b/src/content/docs/magic-firewall/packet-captures/pcaps-bucket-setup.mdx
index 85e21e2b2ce956..d55e6df827607d 100644
--- a/src/content/docs/magic-firewall/packet-captures/pcaps-bucket-setup.mdx
+++ b/src/content/docs/magic-firewall/packet-captures/pcaps-bucket-setup.mdx
@@ -31,7 +31,7 @@ The **Prove ownership** step of the **Bucket configuration** displays.
Before you can begin using a bucket, you must first enable destinations.
-Refer to the [Amazon S3](/logs/get-started/enable-destinations/aws-s3/#create-and-get-access-to-an-s3-bucket) or [Google Cloud Storage](/logs/get-started/enable-destinations/google-cloud-storage/#create-and-get-access-to-a-gcs-bucket) documentation and follow the steps for those specific services.
+Refer to the [Amazon S3](/logs/logpush/logpush-job/enable-destinations/aws-s3/#create-and-get-access-to-an-s3-bucket) or [Google Cloud Storage](/logs/logpush/logpush-job/enable-destinations/google-cloud-storage/#create-and-get-access-to-a-gcs-bucket) documentation and follow the steps for those specific services.
diff --git a/src/content/docs/r2/reference/data-location.mdx b/src/content/docs/r2/reference/data-location.mdx
index 78b196c4906a8f..b0c84b3dab294d 100644
--- a/src/content/docs/r2/reference/data-location.mdx
+++ b/src/content/docs/r2/reference/data-location.mdx
@@ -142,7 +142,7 @@ Cloudflare Enterprise customers may contact their account team or [Cloudflare Su
The following services do not interact with R2 resources with assigned jurisdictions:
- [Super Slurper](/r2/data-migration/) (_coming soon_)
-- [Logpush](/logs/get-started/enable-destinations/r2/). As a workaround to this limitation, you can set up a [Logpush job using an S3-compatible endpoint](/data-localization/how-to/r2/#send-logs-to-r2-via-s3-compatible-endpoint) to store logs in an R2 bucket in the jurisdiction of your choice.
+- [Logpush](/logs/logpush/logpush-job/enable-destinations/r2/). As a workaround to this limitation, you can set up a [Logpush job using an S3-compatible endpoint](/data-localization/how-to/r2/#send-logs-to-r2-via-s3-compatible-endpoint) to store logs in an R2 bucket in the jurisdiction of your choice.
### Additional considerations
diff --git a/src/content/docs/reference-architecture/diagrams/serverless/fullstack-application.mdx b/src/content/docs/reference-architecture/diagrams/serverless/fullstack-application.mdx
index 6006bd1e520454..129c9ac266a071 100644
--- a/src/content/docs/reference-architecture/diagrams/serverless/fullstack-application.mdx
+++ b/src/content/docs/reference-architecture/diagrams/serverless/fullstack-application.mdx
@@ -91,7 +91,7 @@ Send logs from all services with [Logpush](/logs/logpush/), gather insights with
### 10. External Logs & Analytics
-Integrate Cloudflare's observability solutions with your existing third-party solutions. Logpush supports many [destinations](/logs/get-started/enable-destinations/) to push logs to for storage and further analysis. Also, Cloudflare analytics can be [integrated with analytics solutions](/analytics/analytics-integrations/). The [GraphQL Analytics API](/analytics/graphql-api/) allows for flexible queries and integrations.
+Integrate Cloudflare's observability solutions with your existing third-party solutions. Logpush supports many [destinations](/logs/logpush/logpush-job/enable-destinations/) to push logs to for storage and further analysis. Also, Cloudflare analytics can be [integrated with analytics solutions](/analytics/analytics-integrations/). The [GraphQL Analytics API](/analytics/graphql-api/) allows for flexible queries and integrations.
### 11. Tooling & Provisioning
diff --git a/src/content/docs/spectrum/reference/logs.mdx b/src/content/docs/spectrum/reference/logs.mdx
index c420a65131c4dc..36bf07d2cc7d7b 100644
--- a/src/content/docs/spectrum/reference/logs.mdx
+++ b/src/content/docs/spectrum/reference/logs.mdx
@@ -11,7 +11,7 @@ For each connection, Spectrum logs a connect event and either a disconnect or er
## Configure Logpush
-Spectrum [log events](/logs/reference/log-fields/) can be configured through the dashboard or API, depending on your preferred [destination](/logs/get-started/enable-destinations/).
+Spectrum [log events](/logs/reference/log-fields/) can be configured through the dashboard or API, depending on your preferred [destination](/logs/logpush/logpush-job/enable-destinations/).
## Status Codes
diff --git a/src/content/docs/style-guide/documentation-content-strategy/content-types/3rd-party-integration-guide.mdx b/src/content/docs/style-guide/documentation-content-strategy/content-types/3rd-party-integration-guide.mdx
index afa03502446769..7aae6775e58d80 100644
--- a/src/content/docs/style-guide/documentation-content-strategy/content-types/3rd-party-integration-guide.mdx
+++ b/src/content/docs/style-guide/documentation-content-strategy/content-types/3rd-party-integration-guide.mdx
@@ -141,7 +141,7 @@ Prerequisites
**3rd-party integration in the Cloudflare dashboard**:
-- [Enable Logpush to Sumo Logic](/logs/get-started/enable-destinations/sumo-logic/)
+- [Enable Logpush to Sumo Logic](/logs/logpush/logpush-job/enable-destinations/sumo-logic/)
- [Device Posture - Carbon Black](/cloudflare-one/identity/devices/warp-client-checks/carbon-black/)
**Linking to external documentation**:
diff --git a/src/content/docs/style-guide/documentation-content-strategy/content-types/navigation.mdx b/src/content/docs/style-guide/documentation-content-strategy/content-types/navigation.mdx
index 4f8318eeff9511..e1c836bc6e6c72 100644
--- a/src/content/docs/style-guide/documentation-content-strategy/content-types/navigation.mdx
+++ b/src/content/docs/style-guide/documentation-content-strategy/content-types/navigation.mdx
@@ -39,6 +39,6 @@ import { DirectoryListing } from "~/components"
## Examples
-[Logs: Enable destinations](/logs/get-started/enable-destinations/)
+[Logs: Enable destinations](/logs/logpush/logpush-job/enable-destinations/)
[Cloudflare Tunnel: Get Started](/cloudflare-one/connections/connect-networks/get-started/)
diff --git a/src/content/docs/workers/observability/logs/logpush.mdx b/src/content/docs/workers/observability/logs/logpush.mdx
index 8c9534aede9979..07465abeb1bea8 100644
--- a/src/content/docs/workers/observability/logs/logpush.mdx
+++ b/src/content/docs/workers/observability/logs/logpush.mdx
@@ -11,7 +11,7 @@ sidebar:
import { WranglerConfig } from "~/components";
-[Cloudflare Logpush](/logs/logpush/) supports the ability to send [Workers Trace Event Logs](/logs/reference/log-fields/account/workers_trace_events/) to a [supported destination](/logs/get-started/enable-destinations/). Worker’s Trace Events Logpush includes metadata about requests and responses, unstructured `console.log()` messages and any uncaught exceptions. This product is available on the Workers Paid plan. For pricing information, refer to [Pricing](/workers/platform/pricing/#workers-trace-events-logpush).
+[Cloudflare Logpush](/logs/logpush/) supports sending [Workers Trace Event Logs](/logs/reference/log-fields/account/workers_trace_events/) to a [supported destination](/logs/logpush/logpush-job/enable-destinations/). Workers Trace Events Logpush includes metadata about requests and responses, unstructured `console.log()` messages, and any uncaught exceptions. This product is available on the Workers Paid plan. For pricing information, refer to [Pricing](/workers/platform/pricing/#workers-trace-events-logpush).
:::caution
@@ -51,7 +51,7 @@ To create a Logpush job in the Cloudflare dashboard:
### Via cURL
-The following example sends Workers logs to R2. For more configuration options, refer to [Enable destinations](/logs/get-started/enable-destinations/) and [API configuration](/logs/get-started/api-configuration/) in the Logs documentation.
+The following example sends Workers logs to R2. For more configuration options, refer to [Enable destinations](/logs/logpush/logpush-job/enable-destinations/) and [API configuration](/logs/get-started/api-configuration/) in the Logs documentation.
```bash
curl "https://api.cloudflare.com/client/v4/accounts//logpush/jobs" \
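The new `__redirects` entry (`/logs/get-started/enable-destinations/* /logs/logpush/logpush-job/enable-destinations/:splat 301`) uses Pages-style splat syntax: `*` captures the remainder of the request path and `:splat` re-inserts it into the target. As a rough illustrative sketch of that rewrite semantics (not Cloudflare's actual matcher), the rule can be modeled as:

```python
from typing import Optional

def apply_splat_rule(path: str, source: str, target: str) -> Optional[str]:
    """Return the redirect target for `path`, or None if the rule does not match.

    Models a `_redirects`-style rule where a trailing `/*` on the source
    captures the rest of the path and `:splat` in the target re-inserts it.
    """
    if not source.endswith("/*"):
        # Exact-match rule, like the 301 lines without a wildcard above.
        return target if path == source else None
    prefix = source[:-1]  # keep the trailing slash, drop only the "*"
    if not path.startswith(prefix):
        return None
    splat = path[len(prefix):]  # the portion captured by "*"
    return target.replace(":splat", splat)

rule_src = "/logs/get-started/enable-destinations/*"
rule_dst = "/logs/logpush/logpush-job/enable-destinations/:splat"

# An old deep link keeps its tail under the new prefix:
print(apply_splat_rule("/logs/get-started/enable-destinations/r2/", rule_src, rule_dst))
# -> /logs/logpush/logpush-job/enable-destinations/r2/
```

This is why the wildcard line in the `@@ -2177,6 +2177,7 @@` hunk covers every renamed page in one rule, while the earlier per-page 301 lines handle legacy paths that never lived under the old prefix.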