diff --git a/src/content/docs/analytics/dashboards.mdx b/src/content/docs/analytics/dashboards.mdx
deleted file mode 100644
index b24f61722a2b0dc..000000000000000
--- a/src/content/docs/analytics/dashboards.mdx
+++ /dev/null
@@ -1,96 +0,0 @@
----
-pcx_content_type: reference
-title: Custom dashboards
-sidebar:
- order: 9
-
----
-
-Custom dashboards allow you to create tailored views to monitor application security, performance, and usage. You can use them to keep watch on the aftermath of a previous incident, identify indicators of suspicious activity, and start from templates to get up and running quickly.
-
-Dashboards provide a visual interface that displays key metrics and analytics, helping you monitor and analyze data efficiently. Different dashboards serve different purposes. For example, a security dashboard tracks attack attempts and threats, a performance dashboard monitors API latency and uptime, and a usage dashboard analyzes traffic patterns and user behavior.
-
-Different metrics serve distinct roles in providing insights into your application's performance. For example, total HTTP requests offers an overview of traffic volume, while average response time helps assess application speed. Additionally, usage metrics such as traffic patterns and user behavior provide insight into how users interact with your application. These metrics together enable you to spot trends, identify problems, and make informed, data-driven decisions.
-
-:::note
-Custom dashboards are currently available to customers participating in the Log Explorer beta. To begin using custom dashboards, you will first need to request access to [Log Explorer](/logs/log-explorer/).
-:::
-
-## Create a new dashboard
-
-To create a new dashboard, log in to the [Cloudflare dashboard](https://dash.cloudflare.com/login) and select your account. Then, navigate to **Analytics & Logs** and open the **Dashboards** section.
-
-When creating a dashboard, you have two options: building one from scratch or using a pre-designed template.
-
-- A **from-scratch dashboard** gives you full control over its structure, allowing you to choose the exact datasets, metrics, and visualizations that fit your needs. This approach is ideal if you have specific monitoring goals or need a highly customized view of your data.
-- On the other hand, **templates** provide a faster way to set up a dashboard with commonly used metrics and charts. They are useful for standard use cases, such as monitoring security threats, API performance, or bot traffic. Templates help you get started quickly while still allowing modifications to fit your requirements.
-
-Choosing between these options depends on whether you need a quick setup with predefined insights or a fully customized dashboard tailored to your unique analysis needs.
-
-### Create a dashboard from scratch
-
-When creating a dashboard from scratch, select the option **Create a new**. Then, follow the instructions in the sections below to start adding charts to your dashboard.
-
-#### Create a new chart
-
-To create a new chart, select **Add chart**. There are two ways to create a chart:
-
-- **Use a prompt**: Enter a query like `Compare status code ranges over time.` The AI model decides the most appropriate visualization and constructs your chart configuration.
-- **Customize your chart**: Select the chart elements manually, including the chart type, title, dataset to query, metrics, and filters. This option gives you full control over your chart’s structure.
-
-Refer to the following sections for more information about the charts, datasets, fields, metrics, and filters available.
-
-##### Chart types
-
-The available chart types include:
-
-- **Timeseries**: Displays trends over time, enabling comparisons across multiple series.
-- **Categorical**: Compares proportions across different series.
-- **Stat**: Highlights a single value, showing its delta and sparkline for quick insights.
-- **Percentage**: Represents one value as a percentage of another.
-- **Top N**: Identifies the highest-ranking values for a given attribute.
-
-##### Datasets and metrics
-
-The available metrics and filters vary based on the dataset you want to use. For example, when using the HTTP Requests dataset, you can select **origin response duration** as a metric. You can then choose your preferred aggregation method for that metric, such as total, median, or quantiles. The following table outlines the datasets, fields, and available metrics:
-
-
-| Dataset | Field | Definition | Metrics |
-|-----------------|-----------------|------------|---------|
-| HTTP Requests | Requests | The number of requests sent by a client to a server over the HTTP protocol. | Total |
-| | DNS Response Time | The time taken for a DNS query to be resolved, measured from when a request is made to when a response is received. | Total, Average, Median, 95th percentile, 99th percentile |
-| | Time to First Byte | The duration from when a request is made to when the first byte of the response is received from the server. | Total, Average, Median, 95th percentile, 99th percentile |
-| | Bytes returned to the Client | The amount of data (in bytes) sent from the server to the client in response to requests. | Total, Average, Median, 95th percentile, 99th percentile |
-| | Number of visits | Unique visits or sessions to a website or application. | Total |
-| | Origin response duration | The time taken by the origin server to process and respond to a request. | Total, Average, Median, 95th percentile, 99th percentile |
-| Security Events | Security events | Actions taken by Application Security products such as WAF and Bot Management. | Total |
-
-##### Filters
-
-You can also adjust the scope of your analytics by entering filter conditions. This allows you to focus on the most relevant data.
-
-1. Select **Add filter**.
-2. Select a **field**, an **operator**, and a **value**. For example, to filter events by source IP address, select the _Source IP_ field, select the _equals_ operator, and enter the _IP address_.
-3. Select **Apply**.
-
-### Create a dashboard from a template
-
-Alternatively, you can choose to create your dashboard using a pre-designed dashboard template. The templates available are:
-
-- **Bot monitoring**: Allows you to identify automated traffic accessing your website.
-- **API Security**: Allows you to monitor data transfers and exceptions for API endpoints in your application.
-- **API Performance**: Allows you to view timing data for API endpoints in your application, along with error rates.
-- **Account takeover**: Allows you to monitor login attempts, usage of leaked credentials, and account takeover attacks.
-
-## Edit a dashboard or chart
-
-To view your saved dashboards, select **Back to all dashboards** to access the full list. Regardless of how you created your dashboard, you can always edit existing charts and add new ones as needed.
-
-## Further analysis
-
-For each chart, you can:
-
-- Review related traffic in [Security Analytics](/waf/analytics/security-analytics/).
-- Explore detailed logs in [Log Explorer](/logs/log-explorer/).
-
-This ensures deeper insights into your application's security, performance, and usage patterns.
diff --git a/src/content/docs/cloudflare-one/insights/logs/index.mdx b/src/content/docs/cloudflare-one/insights/logs/index.mdx
index e5e9b5b450537eb..1386a68cee79570 100644
--- a/src/content/docs/cloudflare-one/insights/logs/index.mdx
+++ b/src/content/docs/cloudflare-one/insights/logs/index.mdx
@@ -36,7 +36,7 @@ Log Explorer users can store Zero Trust logs directly within Cloudflare in an [R
-For more information, refer to [Log Explorer](/logs/log-explorer/).
+For more information, refer to [Log Explorer](/log-explorer/).
## Customer Metadata Boundary
diff --git a/src/content/docs/logs/R2-log-retrieval.mdx b/src/content/docs/logs/R2-log-retrieval.mdx
index 9172d6af4c3c864..1b3a6e656733336 100644
--- a/src/content/docs/logs/R2-log-retrieval.mdx
+++ b/src/content/docs/logs/R2-log-retrieval.mdx
@@ -11,7 +11,7 @@ Logs Engine gives you the ability to store your logs in R2 and query them direct
:::note
-Logs Engine is going to be replaced by Log Explorer. For further details, consult the [Log Explorer](/logs/log-explorer/) documentation and to request access, complete the [sign-up form](https://cloudflare.com/lp/log-explorer/).
+Logs Engine is going to be replaced by Log Explorer. For further details, consult the [Log Explorer](/log-explorer/) documentation and to request access, complete the [sign-up form](https://cloudflare.com/lp/log-explorer/).
:::
## Store logs in R2
diff --git a/src/content/docs/logs/log-explorer.mdx b/src/content/docs/logs/log-explorer.mdx
deleted file mode 100644
index 381134f43d498f6..000000000000000
--- a/src/content/docs/logs/log-explorer.mdx
+++ /dev/null
@@ -1,290 +0,0 @@
----
-pcx_content_type: concept
-title: Log Explorer
-sidebar:
- order: 118
- badge:
- text: Beta
----
-
-import { TabItem, Tabs, Render } from "~/components";
-
-Log Explorer enables you to store and explore your Cloudflare logs directly within the Cloudflare Dashboard or API, giving you visibility into your logs without the need to forward them to third parties. Logs are stored on Cloudflare's global network using the R2 object storage platform and can be queried via the Dashboard or SQL API.
-
-:::note
-
-Log Explorer is currently in beta. To request access, complete the [sign-up form](https://cloudflare.com/lp/log-explorer/).
-
-:::
-
-## Supported datasets
-
-Log Explorer is available at the account and zone level. At the zone level, the datasets currently available are:
-
-- [HTTP requests](/logs/reference/log-fields/zone/http_requests/) (`FROM http_requests`)
-- [Firewall events](/logs/reference/log-fields/zone/firewall_events/) (`FROM firewall_events`)
-
-At the account level, the datasets available are:
-
-
-
-## Authentication
-
-Log Explorer is available to users with the following permissions:
-
-- **Logs Edit**: users with Logs Edit permissions can enable datasets.
-- **Logs Read**: users with Logs Read permissions can run queries via the UI or API.
-
-Note that these permissions exist at the account and zone level, and you need the appropriate permission level for the datasets you wish to query.
-
-Authentication with the API can be done via an authentication header or an API token. Include either of the following with your API call:
-
-- **Authentication header**
-
- - `X-Auth-Email` - the Cloudflare account email address associated with the domain
- - `X-Auth-Key` - the Cloudflare API key
-
-- **API token**
-
  - `Authorization: Bearer ` To create an appropriately scoped API token, refer to the [Create API token](/fundamentals/api/get-started/create-token/) documentation. Copy and paste the token into the authorization parameter for your API call.
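-
-As a sketch, here is the documented query endpoint called with each authentication style (the email, key, and token values shown are placeholders you must replace with your own credentials):
-
-```bash
-# Option 1: authentication header pair
-curl https://api.cloudflare.com/client/v4/zones/{zone_id}/logs/explorer/query/sql \
---header "X-Auth-Email: user@example.com" \
---header "X-Auth-Key: <API_KEY>" \
---url-query query="SELECT COUNT(*) FROM http_requests WHERE date = '2025-06-03'"
-
-# Option 2: scoped API token
-curl https://api.cloudflare.com/client/v4/zones/{zone_id}/logs/explorer/query/sql \
---header "Authorization: Bearer <API_TOKEN>" \
---url-query query="SELECT COUNT(*) FROM http_requests WHERE date = '2025-06-03'"
-```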
-
-## Enable Log Explorer
-
-In order for Log Explorer to begin storing logs, you need to enable the desired datasets. You can do this via the dashboard or the API.
-
-
-
-1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com/login) and select your account or domain (also known as zone).
-2. Go to **Analytics & Logs** > **Log Explorer**.
-3. Select **Enable a dataset** to select the datasets you want to query. You can enable more datasets later.
-
-:::note
-
-It may take a few minutes for the logs to become available for querying.
-:::
-
-
-
-Use the Log Explorer API to enable Log Explorer for each dataset you wish to store. It may take a few minutes after a log stream is enabled before you can view the logs.
-
-The following curl command is an example for enabling the zone-level dataset `http_requests`, as well as the expected response when the command succeeds.
-
-```bash
-curl https://api.cloudflare.com/client/v4/zones/{zone_id}/logs/explorer/datasets \
---header "Authorization: Bearer " \
---header "Content-Type: application/json" \
---data '{
- "dataset": "http_requests"
-}'
-```
-
-```json
-{
- "result": {
- "dataset": "http_requests",
- "object_type": "zone",
- "object_id": "",
- "created_at": "2025-06-03T14:33:16Z",
- "updated_at": "2025-06-03T14:33:16Z",
- "dataset_id": "01973635f7e273a1964a02f4d4502499",
- "enabled": true
- },
- "success": true,
- "errors": [],
- "messages": []
-}
-```
-
-If you would like to enable an account-level dataset, replace `zones/{zone_id}` with `accounts/{account_id}` in the curl command. For example:
-
-```bash
-curl https://api.cloudflare.com/client/v4/accounts/{account_id}/logs/explorer/datasets \
---header "Authorization: Bearer " \
---header "Content-Type: application/json" \
---data '{
- "dataset": "access_requests"
-}'
-```
-
-
-
-## Use Log Explorer
-
-Filtering and viewing your logs is available via the Cloudflare Dashboard or via query API.
-
-
-
-1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com/login) and select your account or domain (also known as zone).
-2. Go to **Analytics & Logs** > **Log Explorer**.
-3. From the dropdown, select the **Dataset** you want to use.
-4. Select a **Limit**, that is, the maximum number of results to return (for example, 50).
-5. Select the **Time period** from which you want to query, for example, the previous 12 hours.
-6. Select **Add filter** to create your query. Select a **Field**, an **Operator**, and a **Value**.
-7. A query preview is displayed. Select **Use custom SQL** if you would like to change it.
-8. Select **Run query** when you are done. The results are displayed below within the **Query results** section.
-
-:::note
-
-You can also access the Log Explorer dashboard directly from the [Security Analytics dashboard](/waf/analytics/security-analytics/#logs). When doing so, the filters you applied in Security Analytics will automatically carry over to your query in Log Explorer.
-
-:::
-
-
-
-Log Explorer exposes a query endpoint that uses a familiar SQL syntax for querying the logs generated by Cloudflare's network.
-
-For example, to find an HTTP request with a specific [Ray ID](/fundamentals/reference/cloudflare-ray-id/), you can perform the following SQL query.
-
-```bash
-curl https://api.cloudflare.com/client/v4/zones/{zone_id}/logs/explorer/query/sql \
---header "Authorization: Bearer " \
---url-query query="SELECT clientRequestScheme, clientRequestHost, clientRequestMethod, edgeResponseStatus, clientRequestUserAgent FROM http_requests WHERE RayID = '806c30a3cec56817' LIMIT 1"
-```
-
-This query returns the following HTTP request details:
-
-```json
-{
- "result": [
- {
- "clientrequestscheme": "https",
- "clientrequesthost": "example.com",
- "clientrequestmethod": "GET",
- "clientrequestuseragent": "curl/7.88.1",
- "edgeresponsestatus": 200
- }
- ],
- "success": true,
- "errors": [],
- "messages": []
-}
-```
-
-For another example using an account-level dataset, to find Cloudflare Access requests with selected columns from a specific timeframe, you can perform the following SQL query.
-
-```bash
-curl https://api.cloudflare.com/client/v4/accounts/{account_id}/logs/explorer/query/sql \
---header "Authorization: Bearer " \
---url-query query="SELECT CreatedAt, AppDomain, AppUUID, Action, Allowed, Country, RayID, Email, IPAddress, UserUID FROM access_requests WHERE Date >= '2025-02-06' AND Date <= '2025-02-06' AND CreatedAt >= '2025-02-06T12:28:39Z' AND CreatedAt <= '2025-02-06T12:58:39Z'"
-```
-
-This query returns the following request details:
-
-```json
-{
- "result": [
- {
- "createdat": "2025-01-14T18:17:55Z",
- "appdomain": "example.com",
- "appuuid": "a66b4ab0-ccdf-4d60-a6d0-54a59a827d92",
- "action": "login",
- "allowed": true,
- "country": "us",
- "rayid": "90fbb07c0b316957",
- "email": "user@example.com",
- "ipaddress": "1.2.3.4",
- "useruid": "52859e81-711e-4de0-8b31-283336060e79"
- }
- ],
- "success": true,
- "errors": [],
- "messages": []
-}
-```
-
-
-
-## Output formats
-
-Log Explorer output can be presented in formats other than JSON: JSON Lines (also known as NDJSON), CSV, and plain text. The plain text format uses ASCII tables similar to psql's `aligned` output mode. Besides the convenience of not having to translate the format on the client side, JSON Lines, CSV, and plain text have the advantage of being streamed from the API, so for large result sets you will get a response earlier.
-
-You can choose the output format with an HTTP `Accept` header, as shown in the table below:
-
-| Output format | Content type | Streaming? |
-| ------------- | ---------------------- | ---------- |
-| JSON | `application/json` | No |
-| JSON Lines | `application/x-ndjson` | Yes |
-| CSV | `text/csv` | Yes |
-| Plain text | `text/plain` | Yes |
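-
-For example, to stream query results as CSV, add an `Accept` header to a query call (a sketch; `<API_TOKEN>` is a placeholder for your own token):
-
-```bash
-curl https://api.cloudflare.com/client/v4/zones/{zone_id}/logs/explorer/query/sql \
---header "Authorization: Bearer <API_TOKEN>" \
---header "Accept: text/csv" \
---url-query query="SELECT clientRequestHost, edgeResponseStatus FROM http_requests WHERE date = '2025-06-03' LIMIT 100"
-```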
-
-## Optimize your queries
-
-All the tables supported by Log Explorer contain a special column called `date`, which helps to narrow down the amount of data that is scanned to respond to your query, resulting in faster query response times. The value of `date` must be in the form of `YYYY-MM-DD`. For example, to query logs that occurred on October 12, 2023, add the following to your `WHERE` clause: `date = '2023-10-12'`. The column supports the standard operators of `<`, `>`, and `=`.
-
-```bash
-curl https://api.cloudflare.com/client/v4/zones/{zone_id}/logs/explorer/query/sql \
---header "Authorization: Bearer " \
---url-query query="SELECT clientRequestMethod, clientRequestPath, clientRequestProtocol FROM http_requests WHERE date = '2023-10-12' LIMIT 500"
-```
-
-### Additional query optimization tips
-
-- Narrow your query time frame. Focus on a smaller time window to reduce the volume of data processed. This helps avoid querying excessive amounts of data and speeds up response times.
-- Omit `ORDER BY` and `LIMIT` clauses. These clauses can slow down queries, especially when dealing with large datasets. For queries that return a large number of records, reduce the time frame instead of limiting to the newest `N` records from a broader time frame.
-- Select only necessary columns. For example, replace `SELECT *` with the list of specific columns you need. You can also use `SELECT RayId` as a first iteration and follow up with a query that filters by the Ray IDs to retrieve additional columns. Additionally, you can use `SELECT COUNT(*)` to probe for time frames with matching records without retrieving the full dataset.
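-
-For instance, a `SELECT COUNT(*)` probe is a cheap way to check whether a time frame contains matching records before retrieving full rows (a sketch; replace the token, zone ID, and date with your own values):
-
-```bash
-curl https://api.cloudflare.com/client/v4/zones/{zone_id}/logs/explorer/query/sql \
---header "Authorization: Bearer <API_TOKEN>" \
---url-query query="SELECT COUNT(*) FROM http_requests WHERE date = '2025-06-03' AND edgeResponseStatus = 500"
-```
-
-If the count is zero, there is no need to run the broader query for that day.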
-
-## SQL queries supported
-
-These are the SQL query clauses supported by Log Explorer.
-
-### SELECT
-
-The `SELECT` clause specifies the columns that you want to retrieve from the database tables. It can include individual column names, expressions, or even wildcard characters to select all columns.
-
-### FROM
-
-The `FROM` clause specifies the tables from which to retrieve data. It indicates the source of the data for the `SELECT` statement.
-
-### WHERE
-
-The `WHERE` clause filters the rows returned by a query based on specified conditions. It allows you to specify conditions that must be met for a row to be included in the result set.
-
-### GROUP BY
-
-The `GROUP BY` clause is used to group rows that have the same values into summary rows.
-
-### HAVING
-
-The `HAVING` clause is similar to the `WHERE` clause but is used specifically with the `GROUP BY` clause. It filters groups of rows based on specified conditions after the `GROUP BY` operation has been performed.
-
-### ORDER BY
-
-The `ORDER BY` clause is used to sort the result set by one or more columns in ascending or descending order.
-
-### LIMIT
-
-The `LIMIT` clause is used to constrain the number of rows returned by a query. It is often used in conjunction with the `ORDER BY` clause to retrieve the top N rows or to implement pagination.
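-
-These clauses can be combined in a single query. As a sketch (assuming aggregate aliases are supported, which you should verify against your account; replace the token and date with your own values), the following groups requests by method, keeps only methods seen more than 100 times, and returns the five most frequent:
-
-```bash
-curl https://api.cloudflare.com/client/v4/zones/{zone_id}/logs/explorer/query/sql \
---header "Authorization: Bearer <API_TOKEN>" \
---url-query query="SELECT clientRequestMethod, COUNT(*) AS total FROM http_requests WHERE date = '2025-06-03' GROUP BY clientRequestMethod HAVING COUNT(*) > 100 ORDER BY total DESC LIMIT 5"
-```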
-
-:::note
-
-Log Explorer does not support `JOINs`, `DDL`, `DML`, or `EXPLAIN` queries.
-
-:::
-
-## FAQs
-
-### Which fields (or columns) are available for querying?
-
-All fields listed in the [Log Fields](/logs/reference/log-fields/) reference for the supported datasets are viewable in Log Explorer. For filtering, only fields with simple values, such as those of type `bool`, `int`, `float`, or `string`, are supported. Fields with key-value pairs are currently not supported. For example, you cannot use the fields `RequestHeaders` and `Cookies` from the HTTP requests dataset in a filter.
-
-### Why does my query not complete or time out?
-
-Log Explorer performs best when query parameters focus on narrower ranges of time. You may experience query timeouts when your query would return a large quantity of data. Consider refining your query to improve performance.
-
-If your query times out with an HTTP status of 524 (Gateway Timeout), consider using one of the [streaming output formats](/logs/log-explorer/#output-formats), such as `application/x-ndjson`.
-
-### Why don't I see any logs in my queries after enabling the dataset?
-
-Log Explorer starts ingesting logs from the moment you enable the dataset. It will not display logs for events that occurred before the dataset was enabled. Make sure that new events have been generated since enabling the dataset, and check again.
-
-### My query returned an error. How do I figure out what went wrong?
-
-We are actively working on improving error codes. If you receive a generic error, check your SQL syntax (if you are using the custom SQL feature), make sure you have included a date and a limit, and confirm that the field you are filtering is not a key-value pair. If the query still fails, it is likely timing out. Try refining your filters.
-
-### Where is the data stored?
-
-The data is stored in Cloudflare R2. Each Log Explorer dataset is stored on a per-customer level, similar to Cloudflare D1, ensuring that your data is kept separate from that of other customers. In the future, this single-tenant storage model will provide you with the flexibility to create your own retention policies and decide in which regions you want to store your data.
-
-### Does Log Explorer support Customer Metadata Boundary?
-
-Customer Metadata Boundary is currently not supported for Log Explorer.
diff --git a/src/content/docs/waf/analytics/security-analytics.mdx b/src/content/docs/waf/analytics/security-analytics.mdx
index 3b98e0340fd7371..d0fdfee1ee0c62a 100644
--- a/src/content/docs/waf/analytics/security-analytics.mdx
+++ b/src/content/docs/waf/analytics/security-analytics.mdx
@@ -120,7 +120,7 @@ The main chart displays the following data for the selected time frame, accordin
Security Analytics shows request logs for the selected time frame and applied filters, along with detailed information and security analyses of those requests.
-By default, Security Analytics uses sampled logs for the logs table. If you are subscribed to [Log Explorer](/logs/log-explorer/), you may also have access to [raw logs](#raw-logs).
+By default, Security Analytics uses sampled logs for the logs table. If you are subscribed to [Log Explorer](/log-explorer/), you may also have access to [raw logs](#raw-logs).
#### Sampled logs
@@ -155,7 +155,7 @@ To switch from raw logs back to sampled logs, select **Switch back to sampled lo
##### Query raw logs using Log Explorer
-You can switch to [Log Explorer](/logs/log-explorer/) to dive deeper on your analysis while applying the same filters you used in Security Analytics. Raw logs in Security Analytics are based on the same data source used in Log Explorer.
+You can switch to [Log Explorer](/log-explorer/) to dive deeper on your analysis while applying the same filters you used in Security Analytics. Raw logs in Security Analytics are based on the same data source used in Log Explorer.
:::note[Note]
Currently, changing the time frame or the applied filters while showing raw logs may cause the Cloudflare dashboard to switch automatically to sampled logs. This happens if the total number of request logs for the selected time frame is high.