
Commit 69d55b6

Apply suggestions from code review
Co-authored-by: Pedro Sousa <[email protected]>
1 parent 9f023c6 commit 69d55b6

5 files changed: +73 lines, -84 lines


src/content/docs/log-explorer/api.mdx

Lines changed: 10 additions & 12 deletions
@@ -27,22 +27,21 @@ Authentication with the API can be done via an authentication header or API toke

 - `Authorization: Bearer <API_TOKEN>` To create an appropriately scoped API token, refer to [Create API token](/fundamentals/api/get-started/create-token/) documentation. Copy and paste the token into the authorization parameter for your API call.

-## Manage Datasets
+## Enable datasets

 Use the Log Explorer API to enable Log Explorer for each dataset you wish to store. It may take a few minutes after a log stream is enabled before you can view the logs.

-The following curl command is an example for enabling the zone-level dataset `http_requests`, as well as the expected response when the command succeeds.
+The following `curl` command is an example for enabling the zone-level dataset `http_requests`, as well as the expected response when the command succeeds.

 ```bash
 curl https://api.cloudflare.com/client/v4/zones/{zone_id}/logs/explorer/datasets \
 --header "Authorization: Bearer <API_TOKEN>" \
---header "Content-Type: application/json" \
---data '{
+--json '{
 "dataset": "http_requests"
 }'
 ```

-```json
+```json output
 {
 "result": {
 "dataset": "http_requests",
@@ -59,13 +58,12 @@ curl https://api.cloudflare.com/client/v4/zones/{zone_id}/logs/explorer/datasets
 }
 ```

-If you would like to enable an account-level dataset, replace `zones/{zone_id}` with `accounts/{account_id}` in the curl command. For example:
+If you would like to enable an account-level dataset, replace `zones/{zone_id}` with `accounts/{account_id}` in the `curl` command. For example:

 ```bash
 curl https://api.cloudflare.com/client/v4/accounts/{account_id}/logs/explorer/datasets \
 --header "Authorization: Bearer <API_TOKEN>" \
---header "Content-Type: application/json" \
---data '{
+--json '{
 "dataset": "access_requests"
 }'
 ```
@@ -74,15 +72,15 @@ curl https://api.cloudflare.com/client/v4/accounts/{account_id}/logs/explorer/da

 Log Explorer includes a SQL API for submitting queries.

-For example, to find an HTTP request with a specific [Ray ID](/fundamentals/reference/cloudflare-ray-id/), and use the following SQL query:
+For example, to find an HTTP request with a specific [Ray ID](/fundamentals/reference/cloudflare-ray-id/), use the following SQL query:

 ```bash
 curl https://api.cloudflare.com/client/v4/zones/{zone_id}/logs/explorer/query/sql \
 --header "Authorization: Bearer <API_TOKEN>" \
 --url-query query="SELECT clientRequestScheme, clientRequestHost, clientRequestMethod, edgeResponseStatus, clientRequestUserAgent FROM http_requests WHERE RayID = '806c30a3cec56817' LIMIT 1"
 ```

-Which returns the following HTTP request details:
+This command returns the following HTTP request details:

 ```json
 {
@@ -101,15 +99,15 @@ Which returns the following HTTP request details:
 }
 ```

-Another example to find Cloudflare Access requests with selected columns from a specific timeframe, you can perform the following SQL query:
+As another example, you could find Cloudflare Access requests with selected columns from a specific timeframe by performing the following SQL query:

 ```bash
 curl https://api.cloudflare.com/client/v4/account/{account_id}/logs/explorer/query/sql \
 --header "Authorization: Bearer <API_TOKEN>" \
 --url-query query="SELECT CreatedAt, AppDomain, AppUUID, Action, Allowed, Country, RayID, Email, IPAddress, UserUID FROM access_requests WHERE Date >= '2025-02-06' AND Date <= '2025-02-06' AND CreatedAt >= '2025-02-06T12:28:39Z' AND CreatedAt <= '2025-02-06T12:58:39Z'"
 ```

-Which returns the following request details:
+This command returns the following request details:

 ```json
 {
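A note on the `--data`/`--header` to `--json` change in the hunks above: curl's `--json` option (available in curl 7.82 and later) sends the body with `--data` semantics and sets the `Content-Type: application/json` and `Accept: application/json` request headers automatically, so the explicit header flag becomes redundant. A minimal sketch of the two equivalent invocations, assuming placeholder `$ZONE_ID` and `$API_TOKEN` shell variables:

```bash
# Placeholders -- substitute your own zone ID and API token.
ZONE_ID="<ZONE_ID>"
API_TOKEN="<API_TOKEN>"

# Old form: JSON content type set explicitly, body passed with --data.
curl "https://api.cloudflare.com/client/v4/zones/${ZONE_ID}/logs/explorer/datasets" \
  --header "Authorization: Bearer ${API_TOKEN}" \
  --header "Content-Type: application/json" \
  --data '{"dataset": "http_requests"}'

# New form (curl 7.82+): --json sets the JSON headers and sends the same body.
curl "https://api.cloudflare.com/client/v4/zones/${ZONE_ID}/logs/explorer/datasets" \
  --header "Authorization: Bearer ${API_TOKEN}" \
  --json '{"dataset": "http_requests"}'
```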

src/content/docs/log-explorer/custom-dashboards.mdx

Lines changed: 1 addition & 1 deletion
@@ -73,7 +73,7 @@ The available metrics and filters vary based on the dataset you want to use. For
 You can also adjust the scope of your analytics by entering filter conditions. This allows you to focus on the most relevant data.

 1. Select **Add filter**.
-2. Select a **field**, an **operator**, and a **value**. For example, to filter events by source IP address, select the _Source IP_ field, select the _equals_ operator, and enter the _IP address_.
+2. Select a **field**, an **operator**, and a **value**. For example, to filter events by source IP address, select the _Source IP_ field, select the _equals_ operator, and enter the IP address.
 3. Select **Apply**.

 ### Create a dashboard from a template

src/content/docs/log-explorer/index.mdx

Lines changed: 5 additions & 6 deletions
@@ -11,8 +11,7 @@ import { Description, Feature, RelatedProduct } from "~/components"
 Store and explore your Cloudflare logs directly within the Cloudflare dashboard or API.
 </Description>

-
-Log Explorer is Cloudflare's native observability and forensics product that enables security teams and developers to analyze, investigate, and monitor issues directly from the Cloudflare dashboard, without the expense and complexity of forwarding logs to third party tools.
+Log Explorer is Cloudflare's native observability and forensics product that enables security teams and developers to analyze, investigate, and monitor issues directly from the Cloudflare dashboard, without the expense and complexity of forwarding logs to third-party tools.

 Log Explorer provides access to Cloudflare logs with all the context available within the Cloudflare platform. You can monitor security and performance issues with custom dashboards or investigate and troubleshoot issues with log search. Benefits include:

@@ -24,11 +23,11 @@ Log Explorer provides access to Cloudflare logs with all the context available w
 ## Features

 <Feature header="Log Search" href="/log-explorer/log-search/">
-Search logs enable you to store and explore your Cloudflare logs directly within the Cloudflare dashboard or [API](/log-explorer/api/).
+Explore your Cloudflare logs directly within the Cloudflare dashboard or [API](/log-explorer/api/).
 </Feature>

 <Feature header="Custom dashboards" href="/log-explorer/custom-dashboards/">
-Custom dashboards enable you to design customized views for tracking application security, performance, and usage metrics.
+Design customized views for tracking application security, performance, and usage metrics.
 </Feature>

 <Feature header="Manage datasets" href="/log-explorer/manage-datasets/">
@@ -42,9 +41,9 @@ Manage configuration and perform queries via the API.
 ## Related products

 <RelatedProduct header="Logpush" href="/logs/" product="logs">
-Forward Cloudflare logs to third party tools for debugging, identifying configuration adjustments, and creating analytics.
+Forward Cloudflare logs to third-party tools for debugging, identifying configuration adjustments, and creating analytics dashboards.
 </RelatedProduct>

 <RelatedProduct header="Analytics" href="/analytics/" product="analytics">
-Cloudflare visualizes the metadata collected by our products in the Cloudflare dashboard.
+Visualize the metadata collected by our products in the Cloudflare dashboard.
 </RelatedProduct>

src/content/docs/log-explorer/log-search.mdx

Lines changed: 50 additions & 55 deletions
@@ -7,15 +7,15 @@ sidebar:

 import { TabItem, Tabs, Render } from "~/components";

-Log Explorer enables you to store and explore your Cloudflare logs directly within the Cloudflare dashboard or API. Giving you visibility into your logs without the need to forward them to third parties. Logs are stored on Cloudflare's global network using the R2 object storage platform and can be queried via the Dashboard or SQL API.
+Log Explorer enables you to store and explore your Cloudflare logs directly within the Cloudflare dashboard or API, giving you visibility into your logs without the need to forward them to third-party services. Logs are stored on Cloudflare's global network using the R2 object storage platform and can be queried via the dashboard or SQL API.

 ## SQL queries supported

-The diagram below displays the example sql grammar for SELECT statements as a railroad syntax diagram:
+The diagram below displays the example sql grammar for `SELECT` statements as a railroad syntax diagram:

-![Supported sql grammar](~/assets/images/log-explorer/supported-sql-grammar-graph.png)
+![Supported SQL grammar](~/assets/images/log-explorer/supported-sql-grammar-graph.png)

-Any path from left to right forms a valid query. There is a limit of 25 predicates in the `WHERE` clause. Predicates can be grouped using parenthesis. If the `LIMIT` clause is not specified, then the default limit of 10000 is applied. The maximum number for the `LIMIT` clause is 10000. Results are returned in descending order by time.
+Any path from left to right forms a valid query. There is a limit of 25 predicates in the `WHERE` clause. Predicates can be grouped using parenthesis. If the `LIMIT` clause is not specified, then the default limit of 10,000 is applied. The maximum number for the `LIMIT` clause is 10,000. Results are returned in descending order by time.

 Examples of queries include:

@@ -44,11 +44,11 @@ The `ORDER BY` clause is used to sort the result set by one or more columns in a

 ### LIMIT

-The `LIMIT` clause is used to constrain the number of rows returned by a query. It is often used in conjunction with the `ORDER BY` clause to retrieve the top N rows or to implement pagination.
+The `LIMIT` clause is used to constrain the number of rows returned by a query. It is often used in conjunction with the `ORDER BY` clause to retrieve the top `N` rows or to implement pagination.

 :::note

-Log Explorer does not support `JOINs`, `DDL`, `DML`, or `EXPLAIN` queries.
+Log Explorer does not support `JOIN`, `DDL`, `DML`, or `EXPLAIN` queries.

 :::

@@ -57,7 +57,7 @@ Log Explorer does not support `JOINs`, `DDL`, `DML`, or `EXPLAIN` queries.
 You can filter and view your logs via the Cloudflare dashboard or the API.

 1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com/login) and select your account.
-2. Go to **Log Explorer** > **Log Search**.
+2. Go to **Log Explorer** > **Log Search**.
 3. Select the **Dataset** you want to use and in **Columns** select the dataset fields. If you selected a zone scoped dataset, select the zone you would like to use.
 4. Enter a **Limit**. A limit is the maximum number of results to return, for example, 50.
 5. Select the **Time period** from which you want to query, for example, the previous 12 hours.
@@ -73,47 +73,41 @@ You can filter and view your logs via the Cloudflare dashboard or the API.
 For example, to find an HTTP request with a specific [Ray ID](/fundamentals/reference/cloudflare-ray-id/), go to **Custom SQL**, and enter the following SQL query:

 ```sql
-SELECT
-clientRequestScheme,
-clientRequestHost,
-clientRequestMethod,
-edgeResponseStatus,
-clientRequestUserAgent
-FROM http_requests
-WHERE RayID = '806c30a3cec56817'
-LIMIT 1
+SELECT
+clientRequestScheme,
+clientRequestHost,
+clientRequestMethod,
+edgeResponseStatus,
+clientRequestUserAgent
+FROM http_requests
+WHERE RayID = '806c30a3cec56817'
+LIMIT 1
 ```


-Another example to find Cloudflare Access requests with selected columns from a specific timeframe, you can perform the following SQL query:
+As another example, to find Cloudflare Access requests with selected columns from a specific timeframe you could perform the following SQL query:

 ```sql
-SELECT
-CreatedAt,
-AppDomain,
-AppUUID,
-Action,
-Allowed,
-Country,
-RayID,
-Email,
-IPAddress,
-UserUID
-FROM access_requests
-WHERE Date >= '2025-02-06' AND Date <= '2025-02-06' AND CreatedAt >= '2025-02-06T12:28:39Z' AND CreatedAt <= '2025-02-06T12:58:39Z'
+SELECT
+CreatedAt,
+AppDomain,
+AppUUID,
+Action,
+Allowed,
+Country,
+RayID,
+Email,
+IPAddress,
+UserUID
+FROM access_requests
+WHERE Date >= '2025-02-06' AND Date <= '2025-02-06' AND CreatedAt >= '2025-02-06T12:28:39Z' AND CreatedAt <= '2025-02-06T12:58:39Z'
 ```

-
-
-
-
-
-
 ### Save queries

 After selecting all the fields for your query, you can save it by selecting **Save query**. Provide a name and description to help identify it later. To view your saved and recent queries, select **Queries** — they will appear in a side panel where you can insert a new query, or delete any query.

-## Integrated with Security Analytics
+## Integration with Security Analytics

 You can also access the Log Explorer dashboard directly from the [Security Analytics dashboard](/waf/analytics/security-analytics/#logs). When doing so, the filters you applied in Security Analytics will automatically carry over to your query in Log Explorer.

@@ -124,26 +118,27 @@ You can also access the Log Explorer dashboard directly from the [Security Analy
 All the tables supported by Log Explorer contain a special column called `date`, which helps to narrow down the amount of data that is scanned to respond to your query, resulting in faster query response times. The value of `date` must be in the form of `YYYY-MM-DD`. For example, to query logs that occurred on October 12, 2023, add the following to your `WHERE` clause: `date = '2023-10-12'`. The column supports the standard operators of `<`, `>`, and `=`.

 1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com/login) and select your account.
-2. Go to **Log Explorer** > **Log Search** > **Custom SQL**.
+2. Go to **Log Explorer** > **Log Search** > **Custom SQL**.
 3. Enter the following SQL query:

 ```sql
-SELECT
-clientip,
-clientrequesthost,
-clientrequestmethod,
-clientrequesturi,
-edgeendtimestamp,
-edgeresponsestatus,
-originresponsestatus,
-edgestarttimestamp,
-rayid,
-clientcountry,
-clientrequestpath, date
-FROM
-http_requests
-WHERE
-date = '2023-10-12' LIMIT 500
+SELECT
+clientip,
+clientrequesthost,
+clientrequestmethod,
+clientrequesturi,
+edgeendtimestamp,
+edgeresponsestatus,
+originresponsestatus,
+edgestarttimestamp,
+rayid,
+clientcountry,
+clientrequestpath,
+date
+FROM
+http_requests
+WHERE
+date = '2023-10-12' LIMIT 500
 ```

 ### Additional query optimization tips
@@ -156,7 +151,7 @@ All the tables supported by Log Explorer contain a special column called `date`,

 ### Which fields (or columns) are available for querying?

-All fields listed in the datasets [Log Fields](/logs/reference/log-fields/) are viewable in Log Explorer. For filtering, only fields with simple values, such as those of type `bool`, `int`, `float`, or `string` are supported. Fields with key-value pairs are currently not supported. For example, you cannot use the fields `RequestHeaders` and `Cookies` from the HTTP requests dataset in a filter.
+All fields listed in [Log Fields](/logs/reference/log-fields/) for the [supported datasets](/log-explorer/manage-datasets/#supported-datasets) are viewable in Log Explorer. For filtering, only fields with simple values, such as those of type `bool`, `int`, `float`, or `string` are supported. Fields with key-value pairs are currently not supported. For example, you cannot use the fields `RequestHeaders` and `Cookies` from the HTTP requests dataset in a filter.

 ### Why does my query not complete or time out?

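As an illustration of how the query-optimization guidance above combines with the SQL API endpoint documented in `api.mdx` earlier in this commit, here is a minimal sketch that sends the same date-filtered, `LIMIT`-bounded query over HTTP. The `$ZONE_ID` and `$API_TOKEN` variables are placeholders, not values from the commit:

```bash
# Placeholders -- substitute real values before running.
ZONE_ID="<ZONE_ID>"
API_TOKEN="<API_TOKEN>"

# Filtering on the special `date` column and keeping LIMIT small narrows
# the data scanned, which is the optimization described above.
curl "https://api.cloudflare.com/client/v4/zones/${ZONE_ID}/logs/explorer/query/sql" \
  --header "Authorization: Bearer ${API_TOKEN}" \
  --url-query query="SELECT rayid, clientip, clientrequesthost, edgeresponsestatus, date FROM http_requests WHERE date = '2023-10-12' LIMIT 500"
```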
src/content/docs/log-explorer/manage-datasets.mdx

Lines changed: 7 additions & 10 deletions
@@ -11,12 +11,11 @@ Log Explorer allows you to enable or disable which datasets are available to que

 ## Supported datasets

-Log Explorer currently supports:
+Log Explorer currently supports the following datasets:

 - [HTTP requests](/logs/reference/log-fields/zone/http_requests/) (`FROM http_requests`)
 - [Firewall events](/logs/reference/log-fields/zone/firewall_events/) (`FROM firewall_events`)

-
 ## Enable Log Explorer

 In order for Log Explorer to begin storing logs, you need to enable the desired datasets. You can do this via the dashboard or the API.
@@ -30,20 +29,19 @@ In order for Log Explorer to begin storing logs, you need to enable the desired
 It may take a few minutes for the logs to become available for querying.
 :::

-Use the Log Explorer API to enable Log Explorer for each dataset you wish to store. It may take a few minutes after a log stream is enabled before you can view the logs.
+If you are using the API, Use the Log Explorer API to enable Log Explorer for each dataset you wish to store. It may take a few minutes after a log stream is enabled before you can view the logs.

-The following curl command is an example for enabling the zone-level dataset `http_requests`, as well as the expected response when the command succeeds.
+The following `curl` command is an example for enabling the zone-level dataset `http_requests`, as well as the expected response when the command succeeds.

 ```bash
 curl https://api.cloudflare.com/client/v4/zones/{zone_id}/logs/explorer/datasets \
 --header "Authorization: Bearer <API_TOKEN>" \
---header "Content-Type: application/json" \
---data '{
+--json '{
 "dataset": "http_requests"
 }'
 ```

-```json
+```json output
 {
 "result": {
 "dataset": "http_requests",
@@ -60,13 +58,12 @@ curl https://api.cloudflare.com/client/v4/zones/{zone_id}/logs/explorer/datasets
 }
 ```

-If you would like to enable an account-level dataset, replace `zones/{zone_id}` with `accounts/{account_id}` in the curl command. For example:
+To enable an account-level dataset, replace `zones/{zone_id}` with `accounts/{account_id}` in the `curl` command. For example:

 ```bash
 curl https://api.cloudflare.com/client/v4/accounts/{account_id}/logs/explorer/datasets \
 --header "Authorization: Bearer <API_TOKEN>" \
---header "Content-Type: application/json" \
---data '{
+--json '{
 "dataset": "access_requests"
 }'
 ```
