
Commit 0ee73ab

Merge pull request #1931 from fluent/lynettemiles/sc-136235/update-fluent-bit-docs-pipeline-outputs-datadog

2 parents cf53d48 + 0ef692e commit 0ee73ab

2 files changed: +28 -26 lines changed

pipeline/outputs/datadog.md

Lines changed: 26 additions & 26 deletions
@@ -4,41 +4,43 @@ description: Send logs to Datadog

# Datadog

-The Datadog output plugin allows to ingest your logs into [Datadog](https://app.datadoghq.com/signup).
+The _Datadog_ output plugin lets you ingest your logs into [Datadog](https://app.datadoghq.com/signup).

Before you begin, you need a [Datadog account](https://app.datadoghq.com/signup), a [Datadog API key](https://docs.datadoghq.com/account_management/api-app-keys/), and you need to [activate Datadog Logs Management](https://app.datadoghq.com/logs/activation).

-## Configuration Parameters
-
-| Key | Description | Default |
-| --------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------- | -------------------------------- |
-| Host | _Required_ - The Datadog server where you are sending your logs. | `http-intake.logs.datadoghq.com` |
-| TLS | _Required_ - End-to-end security communications security protocol. Datadog recommends setting this to `on`. | `off` |
-| compress | _Recommended_ - compresses the payload in GZIP format, Datadog supports and recommends setting this to `gzip`. | |
-| apikey | _Required_ - Your [Datadog API key](https://app.datadoghq.com/account/settings#api). | |
-| Proxy | _Optional_ - Specify an HTTP Proxy. The expected format of this value is [http://host:port](http://host/:port). Note that _https_ is **not** supported yet. | |
-| provider | To activate the remapping, specify configuration flag provider with value `ecs`. | |
-| json_date_key | Date key name for output. | `timestamp` |
-| include_tag_key | If enabled, a tag is appended to output. The key name is used `tag_key` property. | `false` |
-| tag_key | The key name of tag. If `include_tag_key` is false, This property is ignored. | `tagkey` |
-| dd_service | _Recommended_ - The human readable name for your service generating the logs (e.g. the name of your application or database). If unset, Datadog will look for the service using [Service Remapper](https://docs.datadoghq.com/logs/log_configuration/pipelines/?tab=service#service-attribute)." | |
-| dd_source | _Recommended_ - A human readable name for the underlying technology of your service (e.g. `postgres` or `nginx`). If unset, Datadog will look for the source in the [`ddsource` attribute](https://docs.datadoghq.com/logs/log_configuration/pipelines/?tab=source#source-attribute). | |
-| dd_tags | _Optional_ - The [tags](https://docs.datadoghq.com/tagging/) you want to assign to your logs in Datadog. If unset, Datadog will look for the tags in the [`ddtags` attribute](https://docs.datadoghq.com/api/latest/logs/#send-logs). | |
-| dd_message_key | By default, the plugin searches for the key 'log' and remap the value to the key 'message'. If the property is set, the plugin will search the property name key. | |
-| dd_hostname | The host the emitted logs should be associated with. If unset, Datadog expects the host to be set with `host`, `hostname`, or `syslog.hostname` attributes. See [Datadog Logs preprocessor documentation](https://docs.datadoghq.com/logs/log_configuration/pipelines/?tab=host#preprocessing) for recognized attributes. | _none_ |
+## Configuration parameters
+
+This plugin uses the following configuration parameters:
+
+| Key | Description | Default |
+| --- | ----------- | ------- |
+| `Host` | The Datadog server where you are sending your logs. | `http-intake.logs.datadoghq.com` |
+| `TLS` | End-to-end communications security protocol. Datadog recommends setting this to `on`. | `off` |
+| `compress` | Optional. Compresses the payload in GZIP format. Datadog supports and recommends setting this to `gzip`. | _none_ |
+| `apikey` | Your [Datadog API key](https://app.datadoghq.com/account/settings#api). | _none_ |
+| `Proxy` | Optional. Specifies an HTTP proxy. The expected format of this value is `http://host:port`. HTTPS isn't supported. | _none_ |
+| `provider` | To activate remapping, specify the configuration flag `provider` with the value `ecs`. | _none_ |
+| `json_date_key` | Date key name for output. | `timestamp` |
+| `include_tag_key` | If enabled, a tag is appended to the output. The key name is set by the `tag_key` property. | `false` |
+| `tag_key` | The key name of the tag. If `include_tag_key` is `false`, this property is ignored. | `tagkey` |
+| `dd_service` | Recommended. The human-readable name for your service generating the logs, such as the name of your application or database. If not set, Datadog looks for the service using the [service remapper](https://docs.datadoghq.com/logs/log_configuration/pipelines/?tab=service#service-attribute). | _none_ |
+| `dd_source` | Recommended. A human-readable name for the underlying technology of your service, such as `postgres` or `nginx`. If unset, Datadog looks for the source in the [`ddsource` attribute](https://docs.datadoghq.com/logs/log_configuration/pipelines/?tab=source#source-attribute). | _none_ |
+| `dd_tags` | Optional. The [tags](https://docs.datadoghq.com/tagging/) you want to assign to your logs in Datadog. If unset, Datadog looks for the tags in the [`ddtags` attribute](https://docs.datadoghq.com/api/latest/logs/#send-logs). | _none_ |
+| `dd_message_key` | By default, the plugin searches for the key `log` and remaps its value to the key `message`. If this property is set, the plugin searches for the key it names instead. | _none_ |
+| `dd_hostname` | The host the emitted logs should be associated with. If unset, Datadog expects the host to be set with the `host`, `hostname`, or `syslog.hostname` attributes. See the [Datadog Logs preprocessor documentation](https://docs.datadoghq.com/logs/log_configuration/pipelines/?tab=host#preprocessing) for recognized attributes. | _none_ |
| workers | The number of [workers](../../administration/multithreading.md#outputs) to perform flush operations for this output. | `0` |
| header | Add additional arbitrary HTTP header key/value pair. Multiple headers can be set. | _none_ |
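
For reference, here is a minimal sketch of how a few of the optional parameters from the table above could be combined in a YAML output block. The parameter names come from the table; the API key, tag values, and worker count are illustrative placeholders and are not part of this commit:

```yaml
# Illustrative sketch only: values below are placeholders.
pipeline:
  outputs:
    - name: datadog
      match: '*'
      host: http-intake.logs.datadoghq.com
      tls: on                          # Datadog recommends enabling TLS
      compress: gzip                   # gzip the payload, as Datadog recommends
      apikey: <your-datadog-api-key>
      dd_source: nginx
      dd_tags: team:logs,env:dev
      workers: 2                       # flush with two worker threads
```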

-### Configuration File
+### Configuration file

-Get started quickly with this configuration file:
+Get started with this configuration file:

{% tabs %}
{% tab title="fluent-bit.yaml" %}

```yaml
pipeline:
-
+
  outputs:
    - name: datadog
      match: '*'
@@ -49,7 +51,7 @@ pipeline:
      dd_service: <my-app-service>
      dd_source: <my-app-source>
      dd_tags: team:logs,foo:bar
-      dd_hostname: myhost
+      dd_hostname: myhost
```

{% endtab %}
@@ -74,6 +76,4 @@ pipeline:

## Troubleshooting

-### 403 Forbidden
-
-If you get a `403 Forbidden` error response, double check that you have a valid [Datadog API key](https://docs.datadoghq.com/account_management/api-app-keys/) and that you have [activated Datadog Logs Management](https://app.datadoghq.com/logs/activation).
+If you get a `403 Forbidden` error response, double check that you have a valid [Datadog API key](https://docs.datadoghq.com/account_management/api-app-keys/) and that you have [activated Datadog Logs Management](https://app.datadoghq.com/logs/activation).
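
A common cause of a `403 Forbidden` response is an empty or mistyped `apikey`. As a minimal sketch, assuming the key is exported as an environment variable named `DD_API_KEY` and relying on Fluent Bit's `${...}` variable substitution in configuration files, the key can be kept out of the configuration file itself:

```yaml
# Sketch only: DD_API_KEY is an assumed environment variable name.
pipeline:
  outputs:
    - name: datadog
      match: '*'
      apikey: ${DD_API_KEY}   # substituted from the environment at startup
```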

vale-styles/FluentBit/Spelling-exceptions.txt

Lines changed: 2 additions & 0 deletions
@@ -134,6 +134,7 @@ Podman
Postgres
PowerShell
prepopulate
+preprocessor
Profiler
Prometheus
PromQL
@@ -148,6 +149,7 @@ Queryable
Raspbian
rdkafka
Redpanda
+remapper
reindexed
rollup
Rollup
