Merged (changes from 2 commits)
src/content/docs/logs/R2-log-retrieval.mdx (4 changes: 2 additions & 2 deletions)

@@ -12,7 +12,7 @@ Logs Engine gives you the ability to store your logs in R2 and query them direct

:::note

Logs Engine is going to be replaced by Log Explorer. For further details, consult the [Log Explorer](/logs/log-explorer/) documentation and to request access, complete the [sign-up form](https://cloudflare.com/lp/log-explorer/).
:::

## Store logs in R2
@@ -81,7 +81,7 @@ Stream logs stored in R2 that match the provided query parameters, using the end
```bash
curl --globoff "https://api.cloudflare.com/client/v4/accounts/{account_id}/logs/retrieve?start=2022-06-01T16:00:00Z&end=2022-06-01T16:05:00Z&bucket=cloudflare-logs&prefix=http_requests/example.com/{DATE}" \
--header "X-Auth-Email: <EMAIL>" \
--header "X-Auth-Key: <API_KEY>" \
--header "R2-Access-Key-Id: R2_ACCESS_KEY_ID" \
--header "R2-Secret-Access-Key: R2_SECRET_ACCESS_KEY"
```
@@ -46,7 +46,7 @@ query
datetime_gt:"2022-08-15T00:00:00Z",
destinationType:"s3",
status_neq:200
},
limit:10)
{
count,
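
The fragment above filters a GraphQL health query for Logpush jobs. For orientation, a minimal end-to-end sketch of such a call follows; the `logpushHealthAdaptiveGroups` node name, the `zoneTag` filter, and the selection set are assumptions, since this diff shows only the filter block:

```bash
# Find up to 10 Logpush pushes to S3 after the given time that did not
# return HTTP 200. <API_TOKEN> and <ZONE_ID> are placeholders; the
# logpushHealthAdaptiveGroups node is an assumption, not shown in this diff.
curl https://api.cloudflare.com/client/v4/graphql \
  --header "Authorization: Bearer <API_TOKEN>" \
  --header "Content-Type: application/json" \
  --data '{"query": "query { viewer { zones(filter: {zoneTag: \"<ZONE_ID>\"}) { logpushHealthAdaptiveGroups(filter: {datetime_gt: \"2022-08-15T00:00:00Z\", destinationType: \"s3\", status_neq: 200}, limit: 10) { count } } } }"}'
```
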
@@ -183,10 +183,10 @@ Response:

:::note[Note]

The Datadog destination is exclusive to new jobs and might not be backward compatible with older jobs. If you want to send your logs directly to Datadog, create a new job rather than modifying an existing one; repointing an existing job for another destination at Datadog may produce errors.
:::

:::note[Note]

To analyze and visualize Cloudflare metrics using the Cloudflare Integration tile for Datadog, follow the steps in the [Datadog Analytics integration page](/analytics/analytics-integrations/datadog/).
:::
@@ -57,5 +57,5 @@ To enable Logpush to GCS:

:::note[Note]

To analyze your Cloudflare Logs data using the Google Cloud Platform (GCP), follow the steps in the [Google Cloud Analytics integration page](/analytics/analytics-integrations/google-cloud/).
:::
@@ -13,7 +13,7 @@ Cloudflare Logpush now supports the ability to send logs to configurable HTTP en
Note that when using Logpush to HTTP endpoints, Cloudflare customers are expected to perform their own authentication of the pushed logs. For example, customers may specify a secret token in the URL or an HTTP header of the Logpush destination.

:::note[Endpoint requirements]
Cloudflare expects that the endpoint is available over HTTPS, using a trusted certificate. The endpoint must accept `POST` requests.
:::
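
As a concrete illustration of the secret-token approach, here is a hedged sketch of creating a job whose destination URL carries a secret; the `header_Authorization` query parameter and the `logs.example.com` hostname are assumptions, not taken from this page:

```bash
# Create a Logpush job that presents a secret to the HTTP endpoint via an
# HTTP header; the header_* query-parameter convention is an assumption.
curl "https://api.cloudflare.com/client/v4/zones/{zone_id}/logpush/jobs" \
  --header "Authorization: Bearer <API_TOKEN>" \
  --header "Content-Type: application/json" \
  --data '{
    "name": "http-endpoint-job",
    "destination_conf": "https://logs.example.com/cloudflare?header_Authorization=Bearer%20<SECRET_TOKEN>",
    "dataset": "http_requests"
  }'
```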

## Manage via the Cloudflare dashboard
@@ -62,7 +62,7 @@ The supported parameters are as follows:

:::note[Note]

The `ownership_challenge` parameter is not required to create a Logpush job to an HTTP endpoint. However, the destination must accept the validation upload: a gzipped file named `test.txt.gz` whose decompressed content is `{"content":"tests"}`. If the upload is rejected, job creation fails with an error such as `error validating destination: error writing object: error uploading`.

:::
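
If you want to verify your endpoint ahead of time, you can approximate the validation upload by hand. A minimal sketch, assuming the placeholder endpoint `logs.example.com` and that the body is sent gzip-compressed:

```bash
# Recreate the validation payload (printf avoids a trailing newline).
printf '%s' '{"content":"tests"}' | gzip > test.txt.gz

# POST it to the endpoint; a non-2xx response here suggests job creation
# would fail with the validation error described above.
curl --request POST "https://logs.example.com/cloudflare" \
  --header "Content-Encoding: gzip" \
  --data-binary @test.txt.gz
```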

@@ -9,7 +9,7 @@ head:

---

Cloudflare Logpush supports pushing logs directly to IBM Cloud Logs via API. Dashboard support will be added later.

## Manage via API

@@ -25,7 +25,7 @@ Ensure Log Share permissions are enabled, before attempting to read or configure
### 1. Create a job

To create a job, make a `POST` request to the Logpush jobs endpoint with the following fields:

- **name** (optional) - Use your domain name as the job name.
- **output_options** (optional) - This parameter is used to define the desired output format and structure. Below are the configurable fields:
- output_type
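
The field list above is truncated in this diff; for orientation, a hedged sketch of the full create-job request follows. The `ibmcl://` destination scheme and its parameters are assumptions about the format this page documents further down, not values shown in the diff:

```bash
# Create a Logpush job that sends HTTP request logs to IBM Cloud Logs.
# <INSTANCE_ID>, <REGION>, and <IBM_API_KEY> are placeholders, and the
# ibmcl:// URL shape is an assumption.
curl "https://api.cloudflare.com/client/v4/zones/{zone_id}/logpush/jobs" \
  --header "Authorization: Bearer <API_TOKEN>" \
  --header "Content-Type: application/json" \
  --data '{
    "name": "example.com",
    "destination_conf": "ibmcl://<INSTANCE_ID>.ingress.<REGION>.logs.cloud.ibm.com/logs/v1/singles?ibm_api_key=<IBM_API_KEY>",
    "output_options": {
      "output_type": "ndjson"
    },
    "dataset": "http_requests"
  }'
```
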
@@ -63,7 +63,7 @@ To create a job, make a `POST` request to the Logpush jobs endpoint with the fol

:::note[Note]

To query Cloudflare logs, New Relic requires fields to be sent as a UNIX Timestamp.

:::
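
One way to satisfy this is via `output_options.timestamp_format` when creating the job. A sketch, assuming the `unixnano` format value and New Relic's Log API URL (both are assumptions, not shown in this diff):

```bash
# Create a job whose timestamps New Relic can query; "unixnano" emits UNIX
# epoch nanoseconds instead of RFC 3339 strings (assumed format value).
curl "https://api.cloudflare.com/client/v4/zones/{zone_id}/logpush/jobs" \
  --header "Authorization: Bearer <API_TOKEN>" \
  --header "Content-Type: application/json" \
  --data '{
    "name": "example.com",
    "destination_conf": "https://log-api.newrelic.com/log/v1?Api-Key=<NR_LICENSE_KEY>&format=cloudflare",
    "output_options": {
      "timestamp_format": "unixnano"
    },
    "dataset": "http_requests"
  }'
```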

@@ -176,6 +176,6 @@ Response:

:::note[Note]

To analyze and visualize Cloudflare metrics using the Cloudflare Network Logs quickstart, follow the steps in the [New Relic Analytics integration page](/analytics/analytics-integrations/new-relic/).

:::
@@ -67,7 +67,7 @@ To create a job, make a `POST` request to the Logpush jobs endpoint with the fol

:::note[Note]

We recommend adding the `{DATE}` parameter in the `destination_conf` to separate your logs into daily subfolders.
:::

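The page's own example is collapsed in this diff; as a stand-in, here is a hedged sketch of a `destination_conf` that uses `{DATE}` (the `s3://` bucket path and region are placeholders, not taken from this page):

```bash
# {DATE} expands to the date of the pushed logs (e.g. 20220815), so files
# land in daily subfolders under http_requests/.
curl "https://api.cloudflare.com/client/v4/zones/{zone_id}/logpush/jobs" \
  --header "Authorization: Bearer <API_TOKEN>" \
  --header "Content-Type: application/json" \
  --data '{
    "name": "example.com",
    "destination_conf": "s3://<BUCKET>/http_requests/{DATE}?region=us-east-1",
    "dataset": "http_requests"
  }'
```
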
@@ -107,7 +107,7 @@ curl https://api.cloudflare.com/client/v4/zones/{zone_id}/logpush/jobs \
"output_options": {
"field_names": ["ClientIP", "ClientIP", "ClientRequestHost", "ClientRequestMethod", "ClientRequestURI","EdgeEndTimestamp", "EdgeResponseBytes", "EdgeResponseStatus", "EdgeStartTimestamp", "RayID"],
"timestamp_format": "rfc3339"
},
"dataset": "http_requests"
}'
```
@@ -56,5 +56,5 @@ To enable Logpush to Sumo Logic:

* Sumo Logic may impose throttling and caps on your log ingestion to prevent your account from using **On-Demand Capacity**. Refer to [manage ingestion](https://help.sumologic.com/docs/manage/ingestion-volume/log-ingestion/).

* To analyze and visualize Cloudflare Logs using the Cloudflare App for Sumo Logic, follow the steps in the Sumo Logic integration documentation to [install the Cloudflare App](https://help.sumologic.com/docs/integrations/saas-cloud/cloudflare/#installing-the-cloudflare-app) and [view the Cloudflare dashboards](https://help.sumologic.com/docs/integrations/saas-cloud/cloudflare/#viewing-the-cloudflare-dashboards).
:::
src/content/docs/logs/index.mdx (6 changes: 3 additions & 3 deletions)

@@ -13,7 +13,7 @@ import { CardGrid, Description, Feature, LinkTitleCard, RelatedProduct } from "~

<Description>

Detailed logs that contain metadata generated by our products.
</Description>

These logs are helpful for debugging, identifying configuration adjustments, and creating analytics, especially when combined with logs from other sources, such as your application server. For information about the types of data Cloudflare collects, refer to [Cloudflare's Types of analytics](/analytics/types-of-analytics/).
@@ -48,11 +48,11 @@ Use Logs Engine to store your logs in R2 and query them directly.
## Related products

<RelatedProduct header="Audit Logs" href="/fundamentals/setup/account/account-security/review-audit-logs/" product="fundamentals">
Summarize the history of changes made within your Cloudflare account.
</RelatedProduct>

<RelatedProduct header="Web Analytics" href="/web-analytics/" product="analytics">
Provides privacy-first analytics without changing your DNS or using Cloudflare's proxy.
</RelatedProduct>

***
src/content/docs/logs/log-explorer.mdx (6 changes: 3 additions & 3 deletions)

@@ -45,7 +45,7 @@ Note that these permissions exist at the account and zone level and you need the

Authentication with the API can be done via an authentication header or an API token. Append either of the following sets of parameters to your API call.

- **Authentication header**

- `X-Auth-Email` - the Cloudflare account email address associated with the domain
- `X-Auth-Key` - the Cloudflare API key
@@ -54,11 +54,11 @@ Authentication with the API can be done via an authentication header or API toke

- `Authorization: Bearer <API_TOKEN>`. To create an appropriately scoped API token, refer to the [Create API token](/fundamentals/api/get-started/create-token/) documentation. Copy and paste the token into the authorization parameter for your API call.
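
For reference, the two styles side by side; the endpoint shown is illustrative (any Cloudflare API call accepts either):

```bash
# Key-based authentication: both headers are required.
curl "https://api.cloudflare.com/client/v4/zones/{zone_id}/logpush/jobs" \
  --header "X-Auth-Email: <EMAIL>" \
  --header "X-Auth-Key: <API_KEY>"

# Token-based authentication: one scoped token replaces both headers.
curl "https://api.cloudflare.com/client/v4/zones/{zone_id}/logpush/jobs" \
  --header "Authorization: Bearer <API_TOKEN>"
```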

## Enable Log Explorer

In order for Log Explorer to begin storing logs, you need to enable the desired datasets. You can do this via the dashboard or the API.

<Tabs syncKey="dashPlusAPI"> <TabItem label="Dashboard">
