
Commit bc2a07f

Improve documentation of destination_conf parameter (cloudflare#17631)
* Improve documentation of destination_conf parameter

  This commit sheds more light on the `destination_conf` attribute of the Logpush job configuration by adding details on its format and putting more emphasis on the accepted schemes such as r2, sumo, etc.

* Improve description of {DATE} placeholder

  This commit provides a bit more detail on `{DATE}` and restructures the examples.
1 parent a0a8b17 commit bc2a07f

File tree

1 file changed: +41, -10 lines changed


src/content/docs/logs/get-started/api-configuration.mdx

Lines changed: 41 additions & 10 deletions
@@ -93,16 +93,47 @@ You can specify your cloud service provider destination via the required **desti
 As of May 2022, defining a unique destination for a Logpush job will no longer be required. As this constraint has been removed, you can now have more than one job writing to the same destination.
 :::
 
-* **Cloudflare R2**: bucket path + account ID + R2 access key ID + R2 secret access key; for example: `r2://<BUCKET_PATH>?account-id=<ACCOUNT_ID>&access-key-id=<R2_ACCESS_KEY_ID>&secret-access-key=<R2_SECRET_ACCESS_KEY>`
-* **AWS S3**: bucket + optional directory + region + optional encryption parameter (if required by your policy); for example: `s3://bucket/[dir]?region=<REGION>[&sse=AES256]`
-* **Datadog**: Datadog endpoint URL + Datadog API key + optional parameters; for example: `datadog://<DATADOG_ENDPOINT_URL>?header_DD-API-KEY=<DATADOG_API_KEY>&ddsource=cloudflare&service=<SERVICE>&host=<HOST>&ddtags=<TAGS>`
-* **Google Cloud Storage**: bucket + optional directory; for example: `gs://bucket/[dir]`
-* **Microsoft Azure**: service-level SAS URL with `https` replaced by `azure` + optional directory added before query string; for example: `azure://<BLOB_CONTAINER_PATH>/[dir]?<QUERY_STRING>`
-* **New Relic** New Relic endpoint URL which is `https://log-api.newrelic.com/log/v1` for US or `https://log-api.eu.newrelic.com/log/v1` for EU + a license key + a format; for example: for US `"https://log-api.newrelic.com/log/v1?Api-Key=<NR_LICENSE_KEY>&format=cloudflare"` and for EU `"https://log-api.eu.newrelic.com/log/v1?Api-Key=<NR_LICENSE_KEY>&format=cloudflare"`
-* **Splunk**: Splunk endpoint URL + Splunk channel ID + insecure-skip-verify flag + Splunk sourcetype + Splunk authorization token; for example: `splunk://<SPLUNK_ENDPOINT_URL>?channel=<SPLUNK_CHANNEL_ID>&insecure-skip-verify=<INSECURE_SKIP_VERIFY>&sourcetype=<SOURCE_TYPE>&header_Authorization=<SPLUNK_AUTH_TOKEN>`
-* **Sumo Logic**: HTTP source address URL with `https` replaced by `sumo`; for example: `sumo://<SUMO_ENDPOINT_URL>/receiver/v1/http/<UNIQUE_HTTP_COLLECTOR_CODE>`
-
-For R2, S3, Google Cloud Storage, and Azure, logs can be separated into daily subdirectories by using the special string `{DATE}` in the URL path; for example: `s3://mybucket/logs/{DATE}?region=us-east-1&sse=AES256` or `azure://myblobcontainer/logs/{DATE}?[QueryString]`. It will be substituted with the date in `YYYYMMDD` format, like `20180523`.
+The `destination_conf` parameter must follow this format:
+
+```
+<scheme>://<destination-address>
+```
+
+Supported schemes are listed below, each tailored to specific providers such as
+R2, S3, etc. Additionally, generic use cases like `https` are also covered:
+
+* `r2`,
+* `gs`,
+* `s3`,
+* `cos`,
+* `sumo`,
+* `https`,
+* `azure`,
+* `splunk`,
+* `datadog`.
+
+The `destination-address` should generally be provided by the destination
+provider. However, for certain providers, we require the `destination-address`
+to follow a specific format:
+
+* **Cloudflare R2** (scheme `r2`): bucket path + account ID + R2 access key ID + R2 secret access key; for example: `r2://<BUCKET_PATH>?account-id=<ACCOUNT_ID>&access-key-id=<R2_ACCESS_KEY_ID>&secret-access-key=<R2_SECRET_ACCESS_KEY>`
+* **AWS S3** (scheme `s3`): bucket + optional directory + region + optional encryption parameter (if required by your policy); for example: `s3://bucket/[dir]?region=<REGION>[&sse=AES256]`
+* **Datadog** (scheme `datadog`): Datadog endpoint URL + Datadog API key + optional parameters; for example: `datadog://<DATADOG_ENDPOINT_URL>?header_DD-API-KEY=<DATADOG_API_KEY>&ddsource=cloudflare&service=<SERVICE>&host=<HOST>&ddtags=<TAGS>`
+* **Google Cloud Storage** (scheme `gs`): bucket + optional directory; for example: `gs://bucket/[dir]`
+* **Microsoft Azure** (scheme `azure`): service-level SAS URL with `https` replaced by `azure` + optional directory added before query string; for example: `azure://<BLOB_CONTAINER_PATH>/[dir]?<QUERY_STRING>`
+* **New Relic** (use scheme `https`): New Relic endpoint URL which is `https://log-api.newrelic.com/log/v1` for US or `https://log-api.eu.newrelic.com/log/v1` for EU + a license key + a format; for example: for US `"https://log-api.newrelic.com/log/v1?Api-Key=<NR_LICENSE_KEY>&format=cloudflare"` and for EU `"https://log-api.eu.newrelic.com/log/v1?Api-Key=<NR_LICENSE_KEY>&format=cloudflare"`
+* **Splunk** (scheme `splunk`): Splunk endpoint URL + Splunk channel ID + insecure-skip-verify flag + Splunk sourcetype + Splunk authorization token; for example: `splunk://<SPLUNK_ENDPOINT_URL>?channel=<SPLUNK_CHANNEL_ID>&insecure-skip-verify=<INSECURE_SKIP_VERIFY>&sourcetype=<SOURCE_TYPE>&header_Authorization=<SPLUNK_AUTH_TOKEN>`
+* **Sumo Logic** (scheme `sumo`): HTTP source address URL with `https` replaced by `sumo`; for example: `sumo://<SUMO_ENDPOINT_URL>/receiver/v1/http/<UNIQUE_HTTP_COLLECTOR_CODE>`
+
+For **R2**, **S3**, **Google Cloud Storage**, and **Azure**, you can organize logs into daily subdirectories by including the special placeholder `{DATE}` in the URL path. This placeholder will automatically be replaced with the date in the `YYYYMMDD` format (for example, `20180523`).
+
+For example:
+
+* `s3://mybucket/logs/{DATE}?region=us-east-1&sse=AES256`
+* `azure://myblobcontainer/logs/{DATE}?[QueryString]`
+
+This approach is useful when you want your logs grouped by day.
+
 
 For more information on the value for your cloud storage provider, consult the following conventions:
 
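As a usage sketch (not drawn from this commit), the snippet below shows how a `destination_conf` value in the `<scheme>://<destination-address>` format could be passed when creating a Logpush job through the Cloudflare API. It assumes the zone-scoped `POST /zones/{zone_id}/logpush/jobs` endpoint and the `name`, `dataset`, `destination_conf`, and `enabled` request fields from the Logpush API; the job name, dataset choice, bucket, and environment variables are illustrative placeholders.

```python
import json
import os
import urllib.request

# Illustrative values -- replace with your own zone ID and API token.
ZONE_ID = os.environ["CF_ZONE_ID"]
API_TOKEN = os.environ["CF_API_TOKEN"]

# destination_conf follows <scheme>://<destination-address>.
# The {DATE} placeholder is expanded by Logpush into YYYYMMDD (for example,
# 20180523), so logs land in daily subdirectories of the bucket.
destination_conf = "s3://mybucket/logs/{DATE}?region=us-east-1&sse=AES256"

payload = {
    "name": "http-requests-to-s3",          # hypothetical job name
    "dataset": "http_requests",             # dataset to push
    "destination_conf": destination_conf,
    "enabled": True,
}

req = urllib.request.Request(
    f"https://api.cloudflare.com/client/v4/zones/{ZONE_ID}/logpush/jobs",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {API_TOKEN}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# Print the API response, which includes the created job on success.
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read()))
```

The same call shape applies to the other schemes listed above; only the `destination_conf` string changes.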
0 commit comments
