Authentication with the API can be done via an authentication header or API token.
`Authorization: Bearer <API_TOKEN>`

To create an appropriately scoped API token, refer to the [Create API token](/fundamentals/api/get-started/create-token/) documentation. Copy and paste the token into the authorization parameter for your API call.
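As a sketch, the header can be assembled in a shell variable before making the call. The commented endpoint path below is a placeholder, not a documented route; substitute the actual Log Explorer endpoint from the API reference.

```shell
# Placeholder token for illustration; substitute your real API token.
API_TOKEN="example-token"
AUTH_HEADER="Authorization: Bearer ${API_TOKEN}"

# Example call (hypothetical endpoint path -- check the Cloudflare API
# reference for the actual Log Explorer route):
# curl "https://api.cloudflare.com/client/v4/zones/${ZONE_ID}/..." \
#   --header "${AUTH_HEADER}"

echo "${AUTH_HEADER}"
```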
## Enable datasets
Use the Log Explorer API to enable Log Explorer for each dataset you wish to store. It may take a few minutes after a log stream is enabled before you can view the logs.
The following `curl` command shows how to enable the zone-level dataset `http_requests`, along with the expected response when the command succeeds.
```txt
--url-query query="SELECT CreatedAt, AppDomain, AppUUID, Action, Allowed, Country, RayID, Email, IPAddress, UserUID FROM access_requests WHERE Date >= '2025-02-06' AND Date <= '2025-02-06' AND CreatedAt >= '2025-02-06T12:28:39Z' AND CreatedAt <= '2025-02-06T12:58:39Z'"
```
This command returns the following request details:
The available metrics and filters vary based on the dataset you want to use.
You can also adjust the scope of your analytics by entering filter conditions. This allows you to focus on the most relevant data.
1. Select **Add filter**.
2. Select a **field**, an **operator**, and a **value**. For example, to filter events by source IP address, select the _Source IP_ field, select the _equals_ operator, and enter the IP address.
Store and explore your Cloudflare logs directly within the Cloudflare dashboard or API.
</Description>
Log Explorer is Cloudflare's native observability and forensics product that enables security teams and developers to analyze, investigate, and monitor issues directly from the Cloudflare dashboard, without the expense and complexity of forwarding logs to third-party tools.
Log Explorer provides access to Cloudflare logs with all the context available within the Cloudflare platform. You can monitor security and performance issues with custom dashboards or investigate and troubleshoot issues with log search. Benefits include:
Log Explorer enables you to store and explore your Cloudflare logs directly within the Cloudflare dashboard or API, giving you visibility into your logs without the need to forward them to third-party services. Logs are stored on Cloudflare's global network using the R2 object storage platform and can be queried via the dashboard or SQL API.
## SQL queries supported
The diagram below displays the example SQL grammar for `SELECT` statements as a railroad syntax diagram:
Any path from left to right forms a valid query. There is a limit of 25 predicates in the `WHERE` clause. Predicates can be grouped using parentheses. If the `LIMIT` clause is not specified, then the default limit of 10,000 is applied. The maximum number for the `LIMIT` clause is 10,000. Results are returned in descending order by time.
Examples of queries include:
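One sketch of a query within this grammar, using grouped predicates and an explicit limit (the field names are taken from the `http_requests` examples elsewhere in this guide):

```sql
SELECT clientRequestHost, edgeResponseStatus
FROM http_requests
WHERE (edgeResponseStatus = 500 OR edgeResponseStatus = 503)
  AND date = '2023-10-12'
LIMIT 100
```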
### ORDER BY

The `ORDER BY` clause is used to sort the result set by one or more columns in ascending or descending order.
### LIMIT
The `LIMIT` clause is used to constrain the number of rows returned by a query. It is often used in conjunction with the `ORDER BY` clause to retrieve the top `N` rows or to implement pagination.
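As a sketch, the two clauses can be combined to retrieve the most recent results (this assumes the `edgestarttimestamp` field used in the date-filter example later in this document):

```sql
SELECT clientrequesthost, edgeresponsestatus, edgestarttimestamp
FROM http_requests
WHERE date = '2023-10-12'
ORDER BY edgestarttimestamp DESC
LIMIT 10
```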
:::note
Log Explorer does not support `JOIN`, `DDL`, `DML`, or `EXPLAIN` queries.
:::
You can filter and view your logs via the Cloudflare dashboard or the API.
1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com/login) and select your account.
2. Go to **Log Explorer** > **Log Search**.
3. Select the **Dataset** you want to use and in **Columns** select the dataset fields. If you selected a zone-scoped dataset, select the zone you would like to use.
4. Enter a **Limit**. A limit is the maximum number of results to return, for example, 50.
5. Select the **Time period** from which you want to query, for example, the previous 12 hours.
For example, to find an HTTP request with a specific [Ray ID](/fundamentals/reference/cloudflare-ray-id/), go to **Custom SQL**, and enter the following SQL query:
```sql
SELECT
  clientRequestScheme,
  clientRequestHost,
  clientRequestMethod,
  edgeResponseStatus,
  clientRequestUserAgent
FROM http_requests
WHERE RayID = '806c30a3cec56817'
LIMIT 1
```
As another example, to find Cloudflare Access requests with selected columns from a specific timeframe, you could perform the following SQL query:
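In SQL form, such a query (the same one passed via `--url-query` in the `curl` example earlier in this document) looks like:

```sql
SELECT CreatedAt, AppDomain, AppUUID, Action, Allowed, Country, RayID, Email, IPAddress, UserUID
FROM access_requests
WHERE Date >= '2025-02-06' AND Date <= '2025-02-06'
  AND CreatedAt >= '2025-02-06T12:28:39Z'
  AND CreatedAt <= '2025-02-06T12:58:39Z'
```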
After selecting all the fields for your query, you can save it by selecting **Save query**. Provide a name and description to help identify it later. To view your saved and recent queries, select **Queries**. They appear in a side panel, where you can insert a query or delete any query.
## Integration with Security Analytics
You can also access the Log Explorer dashboard directly from the [Security Analytics dashboard](/waf/analytics/security-analytics/#logs). When doing so, the filters you applied in Security Analytics will automatically carry over to your query in Log Explorer.
All the tables supported by Log Explorer contain a special column called `date`, which helps to narrow down the amount of data that is scanned to respond to your query, resulting in faster query response times. The value of `date` must be in the form of `YYYY-MM-DD`. For example, to query logs that occurred on October 12, 2023, add the following to your `WHERE` clause: `date = '2023-10-12'`. The column supports the standard operators of `<`, `>`, and `=`.
1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com/login) and select your account.
2. Go to **Log Explorer** > **Log Search** > **Custom SQL**.
3. Enter the following SQL query:
```sql
SELECT
  clientip,
  clientrequesthost,
  clientrequestmethod,
  clientrequesturi,
  edgeendtimestamp,
  edgeresponsestatus,
  originresponsestatus,
  edgestarttimestamp,
  rayid,
  clientcountry,
  clientrequestpath,
  date
FROM
  http_requests
WHERE
  date = '2023-10-12'
LIMIT 500
```
### Additional query optimization tips
### Which fields (or columns) are available for querying?
All fields listed in [Log Fields](/logs/reference/log-fields/) for the [supported datasets](/log-explorer/manage-datasets/#supported-datasets) are viewable in Log Explorer. For filtering, only fields with simple values, such as those of type `bool`, `int`, `float`, or `string`, are supported. Fields with key-value pairs are currently not supported. For example, you cannot use the fields `RequestHeaders` and `Cookies` from the HTTP requests dataset in a filter.
In order for Log Explorer to begin storing logs, you need to enable the desired datasets. You can do this via the dashboard or the API.
It may take a few minutes for the logs to become available for querying.
:::
If you are using the API, enable Log Explorer for each dataset you wish to store. It may take a few minutes after a log stream is enabled before you can view the logs.
The following `curl` command shows how to enable the zone-level dataset `http_requests`, along with the expected response when the command succeeds.