
Commit 6be3a70

vil02, hyperlint-ai[bot], and pedrosousa authored
[Logs] Remove trailing spaces (#20304)
* Also, use 'UNIX' instead of 'Unix'

Co-authored-by: hyperlint-ai[bot] <154288675+hyperlint-ai[bot]@users.noreply.github.com>
Co-authored-by: Pedro Sousa <[email protected]>
1 parent 78c7db3 commit 6be3a70

12 files changed: +21 -21 lines changed

src/content/docs/logs/R2-log-retrieval.mdx

Lines changed: 2 additions & 2 deletions
@@ -12,7 +12,7 @@ Logs Engine gives you the ability to store your logs in R2 and query them direct

 :::note

-Logs Engine is going to be replaced by Log Explorer. For further details, consult the [Log Explorer](/logs/log-explorer/) documentation and to request access, complete the [sign-up form](https://cloudflare.com/lp/log-explorer/).
+Logs Engine is going to be replaced by Log Explorer. For further details, consult the [Log Explorer](/logs/log-explorer/) documentation and to request access, complete the [sign-up form](https://cloudflare.com/lp/log-explorer/).
 :::

 ## Store logs in R2
@@ -81,7 +81,7 @@ Stream logs stored in R2 that match the provided query parameters, using the end
 ```bash
 curl --globoff "https://api.cloudflare.com/client/v4/accounts/{account_id}/logs/retrieve?start=2022-06-01T16:00:00Z&end=2022-06-01T16:05:00Z&bucket=cloudflare-logs&prefix=http_requests/example.com/{DATE}" \
 --header "X-Auth-Email: <EMAIL>" \
---header "X-Auth-Key: <API_KEY>" \
+--header "X-Auth-Key: <API_KEY>" \
 --header "R2-Access-Key-Id: R2_ACCESS_KEY_ID" \
 --header "R2-Secret-Access-Key: R2_SECRET_ACCESS_KEY"
 ```

src/content/docs/logs/get-started/alerts-and-analytics.mdx

Lines changed: 1 addition & 1 deletion
@@ -46,7 +46,7 @@ query
 datetime_gt:"2022-08-15T00:00:00Z",
 destinationType:"s3",
 status_neq:200
-},
+},
 limit:10)
 {
 count,

src/content/docs/logs/get-started/enable-destinations/datadog.mdx

Lines changed: 2 additions & 2 deletions
@@ -183,10 +183,10 @@ Response:

 :::note[Note]

-The Datadog destination is exclusive to new jobs and might not be backward compatible with older jobs. Create new jobs if you expect to send your logs directly to Datadog instead of modifying already existing ones. If you try to modify an existing job for another destination to push logs to Datadog, you may observe errors.
+The Datadog destination is exclusive to new jobs and might not be backward compatible with older jobs. Create new jobs if you expect to send your logs directly to Datadog instead of modifying already existing ones. If you try to modify an existing job for another destination to push logs to Datadog, you may observe errors.
 :::

 :::note[Note]

-To analyze and visualize Cloudflare metrics using the Cloudflare Integration tile for Datadog, follow the steps in the [Datadog Analytics integration page](/analytics/analytics-integrations/datadog/).
+To analyze and visualize Cloudflare metrics using the Cloudflare Integration tile for Datadog, follow the steps in the [Datadog Analytics integration page](/analytics/analytics-integrations/datadog/).
 :::

src/content/docs/logs/get-started/enable-destinations/google-cloud-storage.mdx

Lines changed: 1 addition & 1 deletion
@@ -57,5 +57,5 @@ To enable Logpush to GCS:

 :::note[Note]

-To analyze your Cloudflare Logs data using the Google Cloud Platform (GCP), follow the steps in the [Google Cloud Analytics integration page](/analytics/analytics-integrations/google-cloud/).
+To analyze your Cloudflare Logs data using the Google Cloud Platform (GCP), follow the steps in the [Google Cloud Analytics integration page](/analytics/analytics-integrations/google-cloud/).
 :::

src/content/docs/logs/get-started/enable-destinations/http.mdx

Lines changed: 2 additions & 2 deletions
@@ -13,7 +13,7 @@ Cloudflare Logpush now supports the ability to send logs to configurable HTTP en
 Note that when using Logpush to HTTP endpoints, Cloudflare customers are expected to perform their own authentication of the pushed logs. For example, customers may specify a secret token in the URL or an HTTP header of the Logpush destination.

 :::note[Endpoint requirements]
-Cloudflare expects that the endpoint is available over HTTPS, using a trusted certificate. The endpoint must accept `POST` requests.
+Cloudflare expects that the endpoint is available over HTTPS, using a trusted certificate. The endpoint must accept `POST` requests.
 :::

 ## Manage via the Cloudflare dashboard
@@ -62,7 +62,7 @@ The supported parameters are as follows:

 :::note[Note]

-The `ownership_challenge` parameter is not required to create a Logpush job to an HTTP endpoint. You need to make sure that the file upload to validate the destination accepts a gzipped `test.txt.gz` with content as `{"content":"tests"}` compressed, otherwise it will return an error, like `error validating destination: error writing object: error uploading`.
+The `ownership_challenge` parameter is not required to create a Logpush job to an HTTP endpoint. You need to make sure that the file upload to validate the destination accepts a gzipped `test.txt.gz` with content as `{"content":"tests"}` compressed, otherwise it will return an error, like `error validating destination: error writing object: error uploading`.

 :::

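For reference, a hedged sketch of how you might reproduce the validation upload described in the note above; the output filename matches the note, but the endpoint URL and Authorization header are illustrative placeholders, not values from the Logpush API.

```bash
# Build the kind of gzipped validation object the note describes: a file named
# test.txt.gz whose uncompressed content is {"content":"tests"}.
printf '{"content":"tests"}' | gzip > test.txt.gz

# Sanity-check that your HTTPS endpoint accepts a POST of that payload.
# The URL and secret-token header below are placeholders for your own setup.
curl --fail --request POST "https://logs.example.com/ingest" \
  --header "Authorization: Bearer <SECRET_TOKEN>" \
  --header "Content-Encoding: gzip" \
  --data-binary @test.txt.gz
```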
src/content/docs/logs/get-started/enable-destinations/ibm-cloud-logs.mdx

Lines changed: 2 additions & 2 deletions
@@ -9,7 +9,7 @@ head:

 ---

-Cloudflare Logpush supports pushing logs directly to IBM Cloud Logs via API. The dashboard functionality will later be added.
+Cloudflare Logpush supports pushing logs directly to IBM Cloud Logs via API. The dashboard functionality will later be added.

 ## Manage via API

@@ -25,7 +25,7 @@ Ensure Log Share permissions are enabled, before attempting to read or configure
 ### 1. Create a job

 To create a job, make a `POST` request to the Logpush jobs endpoint with the following fields:
-
+
 - **name** (optional) - Use your domain name as the job name.
 - **output_options** (optional) - This parameter is used to define the desired output format and structure. Below are the configurable fields:
   - output_type

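The hunk above ends mid-list, but the shape of the request it describes can be pieced together from the other files touched in this commit (the zone-scoped Logpush jobs endpoint from the S3-compatible example and the auth headers from the R2 retrieval example). A hedged sketch; the `destination_conf` value is a placeholder, not the documented IBM Cloud Logs format.

```bash
# Sketch of the job-creation call described above. Field values are illustrative;
# <IBM_CLOUD_LOGS_DESTINATION> stands in for whatever destination_conf the
# IBM Cloud Logs documentation actually specifies.
curl "https://api.cloudflare.com/client/v4/zones/{zone_id}/logpush/jobs" \
  --header "X-Auth-Email: <EMAIL>" \
  --header "X-Auth-Key: <API_KEY>" \
  --header "Content-Type: application/json" \
  --data '{
    "name": "example.com",
    "destination_conf": "<IBM_CLOUD_LOGS_DESTINATION>",
    "output_options": {
      "field_names": ["ClientIP", "ClientRequestHost", "EdgeResponseStatus", "RayID"],
      "timestamp_format": "rfc3339"
    },
    "dataset": "http_requests"
  }'
```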
src/content/docs/logs/get-started/enable-destinations/new-relic.mdx

Lines changed: 2 additions & 2 deletions
@@ -63,7 +63,7 @@ To create a job, make a `POST` request to the Logpush jobs endpoint with the fol

 :::note[Note]

-To query Cloudflare logs, New Relic requires fields to be sent as a Unix Timestamp.
+To query Cloudflare logs, New Relic requires fields to be sent as a UNIX timestamp.

 :::

@@ -176,6 +176,6 @@ Response:

 :::note[Note]

-To analyze and visualize Cloudflare metrics using the Cloudflare Network Logs quickstart, follow the steps in the [New Relic Analytics integration page](/analytics/analytics-integrations/new-relic/).
+To analyze and visualize Cloudflare metrics using the Cloudflare Network Logs quickstart, follow the steps in the [New Relic Analytics integration page](/analytics/analytics-integrations/new-relic/).

 :::

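Where the UNIX-timestamp note above applies, the epoch-style format would be set through the job's `output_options` rather than RFC 3339. A hedged fragment, assuming `unixnano` is the epoch-based value accepted by `timestamp_format`; the field list is illustrative.

```bash
# Hedged fragment only: writes the output_options payload you would merge into the
# job-creation request sketched earlier. "unixnano" is assumed to be the epoch-based
# timestamp_format; verify against the Logpush job options reference.
cat > output_options.json <<'EOF'
{
  "output_options": {
    "field_names": ["ClientIP", "EdgeStartTimestamp", "EdgeResponseStatus", "RayID"],
    "timestamp_format": "unixnano"
  }
}
EOF
```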
src/content/docs/logs/get-started/enable-destinations/r2.mdx

Lines changed: 1 addition & 1 deletion
@@ -67,7 +67,7 @@ To create a job, make a `POST` request to the Logpush jobs endpoint with the fol

 :::note[Note]

-We recommend adding the `{DATE}` parameter in the `destination_conf` to separate your logs into daily subfolders.
+We recommend adding the `{DATE}` parameter in the `destination_conf` to separate your logs into daily subfolders.
 :::

 ```bash

src/content/docs/logs/get-started/enable-destinations/s3-compatible-endpoints.mdx

Lines changed: 1 addition & 1 deletion
@@ -107,7 +107,7 @@ curl https://api.cloudflare.com/client/v4/zones/{zone_id}/logpush/jobs \
 "output_options": {
 "field_names": ["ClientIP", "ClientIP", "ClientRequestHost", "ClientRequestMethod", "ClientRequestURI","EdgeEndTimestamp", "EdgeResponseBytes", "EdgeResponseStatus", "EdgeStartTimestamp", "RayID"],
 "timestamp_format": "rfc3339"
-},
+},
 "dataset": "http_requests"
 }'
 ```

src/content/docs/logs/get-started/enable-destinations/sumo-logic.mdx

Lines changed: 1 addition & 1 deletion
@@ -56,5 +56,5 @@ To enable Logpush to Sumo Logic:

 * Sumo Logic may impose throttling and caps on your log ingestion to prevent your account from using **On-Demand Capacity**. Refer to [manage ingestion](https://help.sumologic.com/docs/manage/ingestion-volume/log-ingestion/).

-* To analyze and visualize Cloudflare Logs using the Cloudflare App for Sumo Logic, follow the steps in the Sumo Logic integration documentation to [install the Cloudflare App](https://help.sumologic.com/docs/integrations/saas-cloud/cloudflare/#installing-the-cloudflare-app) and [view the Cloudflare dashboards](https://help.sumologic.com/docs/integrations/saas-cloud/cloudflare/#viewing-the-cloudflare-dashboards).
+* To analyze and visualize Cloudflare Logs using the Cloudflare App for Sumo Logic, follow the steps in the Sumo Logic integration documentation to [install the Cloudflare App](https://help.sumologic.com/docs/integrations/saas-cloud/cloudflare/#installing-the-cloudflare-app) and [view the Cloudflare dashboards](https://help.sumologic.com/docs/integrations/saas-cloud/cloudflare/#viewing-the-cloudflare-dashboards).
 :::
