diff --git a/src/content/docs/logs/R2-log-retrieval.mdx b/src/content/docs/logs/R2-log-retrieval.mdx
index 0fec688e096c17..3e2406916fa655 100644
--- a/src/content/docs/logs/R2-log-retrieval.mdx
+++ b/src/content/docs/logs/R2-log-retrieval.mdx
@@ -12,7 +12,7 @@ Logs Engine gives you the ability to store your logs in R2 and query them direct

 :::note

-Logs Engine is going to be replaced by Log Explorer. For further details, consult the [Log Explorer](/logs/log-explorer/) documentation and to request access, complete the [sign-up form](https://cloudflare.com/lp/log-explorer/). 
+Logs Engine is going to be replaced by Log Explorer. For further details, consult the [Log Explorer](/logs/log-explorer/) documentation and to request access, complete the [sign-up form](https://cloudflare.com/lp/log-explorer/).

 :::

 ## Store logs in R2
@@ -81,7 +81,7 @@ Stream logs stored in R2 that match the provided query parameters, using the end
 ```bash
 curl --globoff "https://api.cloudflare.com/client/v4/accounts/{account_id}/logs/retrieve?start=2022-06-01T16:00:00Z&end=2022-06-01T16:05:00Z&bucket=cloudflare-logs&prefix=http_requests/example.com/{DATE}" \
 --header "X-Auth-Email: " \
---header "X-Auth-Key: " \ 
+--header "X-Auth-Key: " \
 --header "R2-Access-Key-Id: R2_ACCESS_KEY_ID" \
 --header "R2-Secret-Access-Key: R2_SECRET_ACCESS_KEY"
 ```
diff --git a/src/content/docs/logs/get-started/alerts-and-analytics.mdx b/src/content/docs/logs/get-started/alerts-and-analytics.mdx
index 921cb4e69306bc..4a632a685af535 100644
--- a/src/content/docs/logs/get-started/alerts-and-analytics.mdx
+++ b/src/content/docs/logs/get-started/alerts-and-analytics.mdx
@@ -46,7 +46,7 @@ query
       datetime_gt:"2022-08-15T00:00:00Z",
       destinationType:"s3",
       status_neq:200
-    }, 
+    },
     limit:10) {
     count,
diff --git a/src/content/docs/logs/get-started/enable-destinations/datadog.mdx b/src/content/docs/logs/get-started/enable-destinations/datadog.mdx
index 11e3e84baa6fff..6921d27c6e294f 100644
--- a/src/content/docs/logs/get-started/enable-destinations/datadog.mdx
+++ b/src/content/docs/logs/get-started/enable-destinations/datadog.mdx
@@ -183,10 +183,10 @@ Response:

 :::note[Note]

-The Datadog destination is exclusive to new jobs and might not be backward compatible with older jobs. Create new jobs if you expect to send your logs directly to Datadog instead of modifying already existing ones. If you try to modify an existing job for another destination to push logs to Datadog, you may observe errors. 
+The Datadog destination is exclusive to new jobs and might not be backward compatible with older jobs. Create new jobs if you expect to send your logs directly to Datadog instead of modifying already existing ones. If you try to modify an existing job for another destination to push logs to Datadog, you may observe errors.

 :::

 :::note[Note]

-To analyze and visualize Cloudflare metrics using the Cloudflare Integration tile for Datadog, follow the steps in the [Datadog Analytics integration page](/analytics/analytics-integrations/datadog/). 
+To analyze and visualize Cloudflare metrics using the Cloudflare Integration tile for Datadog, follow the steps in the [Datadog Analytics integration page](/analytics/analytics-integrations/datadog/).

 :::
diff --git a/src/content/docs/logs/get-started/enable-destinations/google-cloud-storage.mdx b/src/content/docs/logs/get-started/enable-destinations/google-cloud-storage.mdx
index 2bac0eb864dbc3..2702162a00424d 100644
--- a/src/content/docs/logs/get-started/enable-destinations/google-cloud-storage.mdx
+++ b/src/content/docs/logs/get-started/enable-destinations/google-cloud-storage.mdx
@@ -57,5 +57,5 @@ To enable Logpush to GCS:

 :::note[Note]

-To analyze your Cloudflare Logs data using the Google Cloud Platform (GCP), follow the steps in the [Google Cloud Analytics integration page](/analytics/analytics-integrations/google-cloud/). 
+To analyze your Cloudflare Logs data using the Google Cloud Platform (GCP), follow the steps in the [Google Cloud Analytics integration page](/analytics/analytics-integrations/google-cloud/).

 :::
diff --git a/src/content/docs/logs/get-started/enable-destinations/http.mdx b/src/content/docs/logs/get-started/enable-destinations/http.mdx
index a6dac762a267d5..cc7adabedb2cf0 100644
--- a/src/content/docs/logs/get-started/enable-destinations/http.mdx
+++ b/src/content/docs/logs/get-started/enable-destinations/http.mdx
@@ -13,7 +13,7 @@ Cloudflare Logpush now supports the ability to send logs to configurable HTTP en
 Note that when using Logpush to HTTP endpoints, Cloudflare customers are expected to perform their own authentication of the pushed logs. For example, customers may specify a secret token in the URL or an HTTP header of the Logpush destination.

 :::note[Endpoint requirements]
-Cloudflare expects that the endpoint is available over HTTPS, using a trusted certificate. The endpoint must accept `POST` requests. 
+Cloudflare expects that the endpoint is available over HTTPS, using a trusted certificate. The endpoint must accept `POST` requests.
 :::

 ## Manage via the Cloudflare dashboard
@@ -62,7 +62,7 @@ The supported parameters are as follows:

 :::note[Note]

-The `ownership_challenge` parameter is not required to create a Logpush job to an HTTP endpoint. You need to make sure that the file upload to validate the destination accepts a gzipped `test.txt.gz` with content as `{"content":"tests"}` compressed, otherwise it will return an error, like `error validating destination: error writing object: error uploading`. 
+The `ownership_challenge` parameter is not required to create a Logpush job to an HTTP endpoint. You need to make sure that the file upload to validate the destination accepts a gzipped `test.txt.gz` with content as `{"content":"tests"}` compressed, otherwise it will return an error, like `error validating destination: error writing object: error uploading`.

 :::
diff --git a/src/content/docs/logs/get-started/enable-destinations/ibm-cloud-logs.mdx b/src/content/docs/logs/get-started/enable-destinations/ibm-cloud-logs.mdx
index c113591a9b0738..99a2b53bb1c66f 100644
--- a/src/content/docs/logs/get-started/enable-destinations/ibm-cloud-logs.mdx
+++ b/src/content/docs/logs/get-started/enable-destinations/ibm-cloud-logs.mdx
@@ -9,7 +9,7 @@ head:
 ---

-Cloudflare Logpush supports pushing logs directly to IBM Cloud Logs via API. The dashboard functionality will later be added. 
+Cloudflare Logpush supports pushing logs directly to IBM Cloud Logs via API. The dashboard functionality will later be added.

 ## Manage via API
@@ -25,7 +25,7 @@ Ensure Log Share permissions are enabled, before attempting to read or configure
 ### 1. Create a job

 To create a job, make a `POST` request to the Logpush jobs endpoint with the following fields:
- 
+
 - **name** (optional) - Use your domain name as the job name.
 - **output_options** (optional) - This parameter is used to define the desired output format and structure. Below are the configurable fields:
   - output_type
diff --git a/src/content/docs/logs/get-started/enable-destinations/new-relic.mdx b/src/content/docs/logs/get-started/enable-destinations/new-relic.mdx
index 55234d5fd068f4..caae29ebadcff1 100644
--- a/src/content/docs/logs/get-started/enable-destinations/new-relic.mdx
+++ b/src/content/docs/logs/get-started/enable-destinations/new-relic.mdx
@@ -63,7 +63,7 @@ To create a job, make a `POST` request to the Logpush jobs endpoint with the fol

 :::note[Note]

-  To query Cloudflare logs, New Relic requires fields to be sent as a Unix Timestamp.
+  To query Cloudflare logs, New Relic requires fields to be sent as a UNIX timestamp.

 :::
@@ -176,6 +176,6 @@ Response:

 :::note[Note]

-To analyze and visualize Cloudflare metrics using the Cloudflare Network Logs quickstart, follow the steps in the [New Relic Analytics integration page](/analytics/analytics-integrations/new-relic/). 
+To analyze and visualize Cloudflare metrics using the Cloudflare Network Logs quickstart, follow the steps in the [New Relic Analytics integration page](/analytics/analytics-integrations/new-relic/).

 :::
diff --git a/src/content/docs/logs/get-started/enable-destinations/r2.mdx b/src/content/docs/logs/get-started/enable-destinations/r2.mdx
index c7330065287cfd..567086da385de3 100644
--- a/src/content/docs/logs/get-started/enable-destinations/r2.mdx
+++ b/src/content/docs/logs/get-started/enable-destinations/r2.mdx
@@ -67,7 +67,7 @@ To create a job, make a `POST` request to the Logpush jobs endpoint with the fol

 :::note[Note]

-We recommend adding the `{DATE}` parameter in the `destination_conf` to separate your logs into daily subfolders. 
+We recommend adding the `{DATE}` parameter in the `destination_conf` to separate your logs into daily subfolders.

 :::

 ```bash
diff --git a/src/content/docs/logs/get-started/enable-destinations/s3-compatible-endpoints.mdx b/src/content/docs/logs/get-started/enable-destinations/s3-compatible-endpoints.mdx
index 194e64c95c8f8f..b95db315660192 100644
--- a/src/content/docs/logs/get-started/enable-destinations/s3-compatible-endpoints.mdx
+++ b/src/content/docs/logs/get-started/enable-destinations/s3-compatible-endpoints.mdx
@@ -107,7 +107,7 @@ curl https://api.cloudflare.com/client/v4/zones/{zone_id}/logpush/jobs \
   "output_options": {
     "field_names": ["ClientIP", "ClientIP", "ClientRequestHost", "ClientRequestMethod", "ClientRequestURI","EdgeEndTimestamp", "EdgeResponseBytes", "EdgeResponseStatus", "EdgeStartTimestamp", "RayID"],
     "timestamp_format": "rfc3339"
-  }, 
+  },
   "dataset": "http_requests"
 }'
 ```
diff --git a/src/content/docs/logs/get-started/enable-destinations/sumo-logic.mdx b/src/content/docs/logs/get-started/enable-destinations/sumo-logic.mdx
index dfdaebfa0f9d5e..fcd43306cd86e8 100644
--- a/src/content/docs/logs/get-started/enable-destinations/sumo-logic.mdx
+++ b/src/content/docs/logs/get-started/enable-destinations/sumo-logic.mdx
@@ -56,5 +56,5 @@ To enable Logpush to Sumo Logic:
 * Sumo Logic may impose throttling and caps on your log ingestion to prevent your account from using **On-Demand Capacity**. Refer to [manage ingestion](https://help.sumologic.com/docs/manage/ingestion-volume/log-ingestion/).

-* To analyze and visualize Cloudflare Logs using the Cloudflare App for Sumo Logic, follow the steps in the Sumo Logic integration documentation to [install the Cloudflare App](https://help.sumologic.com/docs/integrations/saas-cloud/cloudflare/#installing-the-cloudflare-app) and [view the Cloudflare dashboards](https://help.sumologic.com/docs/integrations/saas-cloud/cloudflare/#viewing-the-cloudflare-dashboards). 
+* To analyze and visualize Cloudflare Logs using the Cloudflare App for Sumo Logic, follow the steps in the Sumo Logic integration documentation to [install the Cloudflare App](https://help.sumologic.com/docs/integrations/saas-cloud/cloudflare/#installing-the-cloudflare-app) and [view the Cloudflare dashboards](https://help.sumologic.com/docs/integrations/saas-cloud/cloudflare/#viewing-the-cloudflare-dashboards).

 :::
diff --git a/src/content/docs/logs/index.mdx b/src/content/docs/logs/index.mdx
index effccb934276d3..9581832a93ae07 100644
--- a/src/content/docs/logs/index.mdx
+++ b/src/content/docs/logs/index.mdx
@@ -13,7 +13,7 @@ import { CardGrid, Description, Feature, LinkTitleCard, RelatedProduct } from "~

-Detailed logs that contain metadata generated by our products. 
+Detailed logs that contain metadata generated by our products.

 These logs are helpful for debugging, identifying configuration adjustments, and creating analytics, especially when combined with logs from other sources, such as your application server.

 For information about the types of data Cloudflare collects, refer to [Cloudflare's Types of analytics](/analytics/types-of-analytics/).
@@ -48,11 +48,11 @@ Use Logs Engine to store your logs in R2 and query them directly.
 ## Related products

-Summarize the history of changes made within your Cloudflare account. 
+Summarize the history of changes made within your Cloudflare account.

-Provides privacy-first analytics without changing your DNS or using Cloudflare's proxy. 
+Provides privacy-first analytics without changing your DNS or using Cloudflare's proxy.

 ***
diff --git a/src/content/docs/logs/log-explorer.mdx b/src/content/docs/logs/log-explorer.mdx
index 079132830e3cdd..b75cbc17615931 100644
--- a/src/content/docs/logs/log-explorer.mdx
+++ b/src/content/docs/logs/log-explorer.mdx
@@ -45,7 +45,7 @@ Note that these permissions exist at the account and zone level and you need the
 Authentication with the API can be done via an authentication header or API token. Append your API call with either of the following additional parameters.

-- **Authentication header** 
+- **Authentication header**
   - `X-Auth-Email` - the Cloudflare account email address associated with the domain
   - `X-Auth-Key` - the Cloudflare API key
@@ -54,11 +54,11 @@ Authentication with the API can be done via an authentication header or API toke
   - `Authorization: Bearer `

 To create an appropriately scoped API token, refer to [Create API token](/fundamentals/api/get-started/create-token/) documentation. Copy and paste the token into the authorization parameter for your API call.
- 
+

 ## Enable Log Explorer

-In order for Log Explorer to begin storing logs, you need to enable the desired datasets. You can do this via the dashboard or the API. 
+In order for Log Explorer to begin storing logs, you need to enable the desired datasets. You can do this via the dashboard or the API.