
Commit 427691b

alexakreizinger authored and Tom committed
Data pipeline: update titles to sentence case + unwrap lines
Signed-off-by: Alexa Kreizinger <[email protected]>
Signed-off-by: Tom <[email protected]>
1 parent ed9a970 commit 427691b

37 files changed: +166 / -218 lines

SUMMARY.md

Lines changed: 47 additions & 48 deletions
@@ -90,95 +90,96 @@

  ## Data pipeline

- * [Pipeline Monitoring](pipeline/pipeline-monitoring.md)
+ * [Pipeline monitoring](pipeline/pipeline-monitoring.md)
  * [Inputs](pipeline/inputs/README.md)
  * [Collectd](pipeline/inputs/collectd.md)
- * [CPU Log Based Metrics](pipeline/inputs/cpu-metrics.md)
- * [Disk I/O Log Based Metrics](pipeline/inputs/disk-io-metrics.md)
- * [Docker Events](pipeline/inputs/docker-events.md)
- * [Docker Log Based Metrics](pipeline/inputs/docker-metrics.md)
+ * [CPU metrics](pipeline/inputs/cpu-metrics.md)
+ * [Disk I/O metrics](pipeline/inputs/disk-io-metrics.md)
+ * [Docker events](pipeline/inputs/docker-events.md)
+ * [Docker metrics](pipeline/inputs/docker-metrics.md)
  * [Dummy](pipeline/inputs/dummy.md)
+ * [Ebpf](pipeline/inputs/ebpf.md)
  * [Elasticsearch](pipeline/inputs/elasticsearch.md)
+ * [Exec WASI](pipeline/inputs/exec-wasi.md)
  * [Exec](pipeline/inputs/exec.md)
- * [Exec Wasi](pipeline/inputs/exec-wasi.md)
- * [Ebpf](pipeline/inputs/ebpf.md)
- * [Fluent Bit Metrics](pipeline/inputs/fluentbit-metrics.md)
+ * [Fluent Bit metrics](pipeline/inputs/fluentbit-metrics.md)
  * [Forward](pipeline/inputs/forward.md)
  * [Head](pipeline/inputs/head.md)
  * [Health](pipeline/inputs/health.md)
  * [HTTP](pipeline/inputs/http.md)
  * [Kafka](pipeline/inputs/kafka.md)
- * [Kernel Logs](pipeline/inputs/kernel-logs.md)
- * [Kubernetes Events](pipeline/inputs/kubernetes-events.md)
- * [Memory Metrics](pipeline/inputs/memory-metrics.md)
+ * [Kernel logs](pipeline/inputs/kernel-logs.md)
+ * [Kubernetes events](pipeline/inputs/kubernetes-events.md)
+ * [Memory metrics](pipeline/inputs/memory-metrics.md)
  * [MQTT](pipeline/inputs/mqtt.md)
- * [Network I/O Log Based Metrics](pipeline/inputs/network-io-metrics.md)
- * [NGINX Exporter Metrics](pipeline/inputs/nginx.md)
- * [Node Exporter Metrics](pipeline/inputs/node-exporter-metrics.md)
+ * [Network I/O metrics](pipeline/inputs/network-io-metrics.md)
+ * [NGINX exporter metrics](pipeline/inputs/nginx.md)
+ * [Node exporter metrics](pipeline/inputs/node-exporter-metrics.md)
  * [OpenTelemetry](pipeline/inputs/opentelemetry.md)
- * [Podman Metrics](pipeline/inputs/podman-metrics.md)
- * [Process Exporter Metrics](pipeline/inputs/process-exporter-metrics.md)
- * [Process Log Based Metrics](pipeline/inputs/process.md)
- * [Prometheus Remote Write](pipeline/inputs/prometheus-remote-write.md)
- * [Prometheus Scrape Metrics](pipeline/inputs/prometheus-scrape-metrics.md)
+ * [Podman metrics](pipeline/inputs/podman-metrics.md)
+ * [Process exporter metrics](pipeline/inputs/process-exporter-metrics.md)
+ * [Process metrics](pipeline/inputs/process.md)
+ * [Prometheus remote write](pipeline/inputs/prometheus-remote-write.md)
+ * [Prometheus scrape Metrics](pipeline/inputs/prometheus-scrape-metrics.md)
  * [Random](pipeline/inputs/random.md)
- * [Serial Interface](pipeline/inputs/serial-interface.md)
+ * [Serial interface](pipeline/inputs/serial-interface.md)
  * [Splunk](pipeline/inputs/splunk.md)
- * [Standard Input](pipeline/inputs/standard-input.md)
+ * [Standard input](pipeline/inputs/standard-input.md)
  * [StatsD](pipeline/inputs/statsd.md)
  * [Syslog](pipeline/inputs/syslog.md)
  * [Systemd](pipeline/inputs/systemd.md)
  * [Tail](pipeline/inputs/tail.md)
  * [TCP](pipeline/inputs/tcp.md)
  * [Thermal](pipeline/inputs/thermal.md)
  * [UDP](pipeline/inputs/udp.md)
- * [Windows Event Log](pipeline/inputs/windows-event-log.md)
- * [Windows Event Log (winevtlog)](pipeline/inputs/windows-event-log-winevtlog.md)
- * [Windows Exporter Metrics](pipeline/inputs/windows-exporter-metrics.md)
+ * [Windows Event logs (winevtlog)](pipeline/inputs/windows-event-log-winevtlog.md)
+ * [Windows Event logs (winlog)](pipeline/inputs/windows-event-log.md)
+ * [Windows exporter metrics](pipeline/inputs/windows-exporter-metrics.md)
  * [Parsers](pipeline/parsers/README.md)
- * [Configuring Parser](pipeline/parsers/configuring-parser.md)
+ * [Configuring parsers](pipeline/parsers/configuring-parser.md)
  * [JSON](pipeline/parsers/json.md)
- * [Regular Expression](pipeline/parsers/regular-expression.md)
+ * [Regular expression](pipeline/parsers/regular-expression.md)
  * [LTSV](pipeline/parsers/ltsv.md)
  * [Logfmt](pipeline/parsers/logfmt.md)
  * [Decoders](pipeline/parsers/decoders.md)
  * [Processors](pipeline/processors/README.md)
- * [Content Modifier](pipeline/processors/content-modifier.md)
+ * [Content modifier](pipeline/processors/content-modifier.md)
  * [Labels](pipeline/processors/labels.md)
- * [Metrics Selector](pipeline/processors/metrics-selector.md)
- * [OpenTelemetry Envelope](pipeline/processors/opentelemetry-envelope.md)
+ * [Metrics selector](pipeline/processors/metrics-selector.md)
+ * [OpenTelemetry envelope](pipeline/processors/opentelemetry-envelope.md)
  * [Sampling](pipeline/processors/sampling.md)
  * [SQL](pipeline/processors/sql.md)
  * [Filters as processors](pipeline/processors/filters.md)
  * [Conditional processing](pipeline/processors/conditional-processing.md)
  * [Filters](pipeline/filters/README.md)
- * [AWS Metadata](pipeline/filters/aws-metadata.md)
+ * [AWS metadata](pipeline/filters/aws-metadata.md)
  * [CheckList](pipeline/filters/checklist.md)
- * [ECS Metadata](pipeline/filters/ecs-metadata.md)
+ * [ECS metadata](pipeline/filters/ecs-metadata.md)
  * [Expect](pipeline/filters/expect.md)
- * [GeoIP2 Filter](pipeline/filters/geoip2-filter.md)
+ * [GeoIP2 filter](pipeline/filters/geoip2-filter.md)
  * [Grep](pipeline/filters/grep.md)
  * [Kubernetes](pipeline/filters/kubernetes.md)
- * [Log to Metrics](pipeline/filters/log_to_metrics.md)
+ * [Logs to metrics](pipeline/filters/log_to_metrics.md)
  * [Lua](pipeline/filters/lua.md)
- * [Parser](pipeline/filters/parser.md)
- * [Record Modifier](pipeline/filters/record-modifier.md)
  * [Modify](pipeline/filters/modify.md)
  * [Multiline](pipeline/filters/multiline-stacktrace.md)
  * [Nest](pipeline/filters/nest.md)
  * [Nightfall](pipeline/filters/nightfall.md)
- * [Rewrite Tag](pipeline/filters/rewrite-tag.md)
- * [Standard Output](pipeline/filters/standard-output.md)
+ * [Parser](pipeline/filters/parser.md)
+ * [Record modifier](pipeline/filters/record-modifier.md)
+ * [Rewrite tag](pipeline/filters/rewrite-tag.md)
+ * [Standard output](pipeline/filters/standard-output.md)
  * [Sysinfo](pipeline/filters/sysinfo.md)
+ * [Tensorflow](pipeline/filters/tensorflow.md)
  * [Throttle](pipeline/filters/throttle.md)
  * [Type Converter](pipeline/filters/type-converter.md)
- * [Tensorflow](pipeline/filters/tensorflow.md)
  * [Wasm](pipeline/filters/wasm.md)
  * [Outputs](pipeline/outputs/README.md)
  * [Amazon CloudWatch](pipeline/outputs/cloudwatch.md)
  * [Amazon Kinesis Data Firehose](pipeline/outputs/firehose.md)
  * [Amazon Kinesis Data Streams](pipeline/outputs/kinesis.md)
  * [Amazon S3](pipeline/outputs/s3.md)
+ * [Apache SkyWalking](pipeline/outputs/skywalking.md)
  * [Azure Blob](pipeline/outputs/azure_blob.md)
  * [Azure Data Explorer](pipeline/outputs/azure_kusto.md)
  * [Azure Log Analytics](pipeline/outputs/azure.md)

@@ -189,34 +190,32 @@
  * [Dynatrace](pipeline/outputs/dynatrace.md)
  * [Elasticsearch](pipeline/outputs/elasticsearch.md)
  * [File](pipeline/outputs/file.md)
- * [FlowCounter](pipeline/outputs/flowcounter.md)
+ * [Flow counter](pipeline/outputs/flowcounter.md)
  * [Forward](pipeline/outputs/forward.md)
- * [GELF](pipeline/outputs/gelf.md)
+ * [Graylog Extended Log Format (GELF)](pipeline/outputs/gelf.md)
  * [Google Chronicle](pipeline/outputs/chronicle.md)
  * [Google Cloud BigQuery](pipeline/outputs/bigquery.md)
  * [HTTP](pipeline/outputs/http.md)
  * [InfluxDB](pipeline/outputs/influxdb.md)
- * [Kafka](pipeline/outputs/kafka.md)
+ * [Kafka Producer](pipeline/outputs/kafka.md)
  * [Kafka REST Proxy](pipeline/outputs/kafka-rest-proxy.md)
  * [LogDNA](pipeline/outputs/logdna.md)
  * [Loki](pipeline/outputs/loki.md)
- * [Microsoft Fabric](pipeline/outputs/azure_kusto.md)
  * [NATS](pipeline/outputs/nats.md)
  * [New Relic](pipeline/outputs/new-relic.md)
- * [NULL](pipeline/outputs/null.md)
+ * [Null](pipeline/outputs/null.md)
  * [Observe](pipeline/outputs/observe.md)
  * [OpenObserve](pipeline/outputs/openobserve.md)
  * [OpenSearch](pipeline/outputs/opensearch.md)
  * [OpenTelemetry](pipeline/outputs/opentelemetry.md)
- * [Oracle Log Analytics](pipeline/outputs/oci-logging-analytics.md)
+ * [Oracle Cloud Infrastructure Logging Analytics](pipeline/outputs/oci-logging-analytics.md)
  * [PostgreSQL](pipeline/outputs/postgresql.md)
- * [Prometheus Exporter](pipeline/outputs/prometheus-exporter.md)
- * [Prometheus Remote Write](pipeline/outputs/prometheus-remote-write.md)
- * [SkyWalking](pipeline/outputs/skywalking.md)
+ * [Prometheus exporter](pipeline/outputs/prometheus-exporter.md)
+ * [Prometheus remote write](pipeline/outputs/prometheus-remote-write.md)
  * [Slack](pipeline/outputs/slack.md)
  * [Splunk](pipeline/outputs/splunk.md)
  * [Stackdriver](pipeline/outputs/stackdriver.md)
- * [Standard Output](pipeline/outputs/standard-output.md)
+ * [Standard output](pipeline/outputs/standard-output.md)
  * [Syslog](pipeline/outputs/syslog.md)
  * [TCP and TLS](pipeline/outputs/tcp-and-tls.md)
  * [Treasure Data](pipeline/outputs/treasure-data.md)

pipeline/filters/aws-metadata.md

Lines changed: 2 additions & 2 deletions
@@ -1,4 +1,4 @@
- # AWS Metadata
+ # AWS metadata

  The _AWS Filter_ enriches logs with AWS Metadata. The plugin adds the EC2 instance ID and availability zone to log records. To use this plugin, you must be running in EC2 and have the [instance metadata service enabled](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/configuring-instance-metadata-service.html).

@@ -188,4 +188,4 @@ The resulting logs might look like this:

  ```text
  {"log"=>"aws is awesome", "az"=>"us-east-1a", "ec2_instance_id"=>"i-0e66fc7f9809d7168", "Name"=>"fluent-bit-docs-example", "project"=>"fluentbit"}
- ```
+ ```

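For context, the filter documented in this file is normally enabled with a small configuration block. The following is a minimal classic-mode sketch, not taken from the commit above; the `imds_version` property and all values are illustrative assumptions.

```text
# Minimal sketch: enrich records with EC2 instance metadata.
# Assumes Fluent Bit runs on an EC2 instance with the instance
# metadata service (IMDS) enabled.
[INPUT]
    Name  dummy
    Tag   app.log

[FILTER]
    Name          aws
    Match         *
    imds_version  v2    # assumed property; v1 is also accepted by the plugin

[OUTPUT]
    Name   stdout
    Match  *
```
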
pipeline/filters/log_to_metrics.md

Lines changed: 2 additions & 3 deletions
@@ -2,7 +2,7 @@
  description: Generate metrics from logs
  ---

- # Log to metrics
+ # Logs to metrics

  ![](https://static.scarf.sh/a.png?x-pxid=768830f6-8d2d-4231-9e5e-259ce6797ba5)

@@ -536,5 +536,4 @@ The `+Inf` bucket will always be included regardless of the buckets you specify.

  {% endhint %}

- This filter also attaches Kubernetes labels to each metric, identical to the behavior
- of `label_field`. This results in two sets for the histogram.
+ This filter also attaches Kubernetes labels to each metric, identical to the behavior of `label_field`. This results in two sets for the histogram.

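The `label_field` behavior touched by the second hunk is easier to follow next to a configuration. The following is a minimal classic-mode sketch, assuming the filter's documented `metric_mode`, `metric_name`, `metric_description`, and `label_field` properties; the tag and field names are illustrative only.

```text
[FILTER]
    Name                log_to_metrics
    Match               kube.*
    Tag                 app_metrics
    Metric_mode         counter               # counter, gauge, or histogram
    Metric_name         count_all_requests
    Metric_description  Count of incoming request log records
    Label_field         kubernetes_namespace  # record field attached as a metric label
```
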
pipeline/filters/nest.md

Lines changed: 8 additions & 13 deletions
@@ -7,8 +7,7 @@ The _Nest_ filter plugin lets you operate on or with nested data. Its modes of o

  ## Example usage for `nest`

- As an example using JSON notation, to nest keys matching the `Wildcard` value `Key*`
- under a new key `NestKey` the transformation becomes:
+ As an example using JSON notation, to nest keys matching the `Wildcard` value `Key*` under a new key `NestKey` the transformation becomes:

  Input:

@@ -34,8 +33,7 @@ Output:

  ## Example usage for `lift`

- As an example using JSON notation, to lift keys nested under the `Nested_under` value
- `NestKey*` the transformation becomes:
+ As an example using JSON notation, to lift keys nested under the `Nested_under` value `NestKey*` the transformation becomes:

  Input:

@@ -86,9 +84,7 @@ To start filtering records, run the filter from the command line or through the

  Using the command line mode requires quotes to parse the wildcard properly. The use of a configuration file is recommended.

- The following command loads the _mem_ plugin. Then the _nest_ filter matches the
- wildcard rule to the keys and nests the keys matching `Mem.*` under the new key
- `NEST`.
+ The following command loads the _mem_ plugin. Then the _nest_ filter matches the wildcard rule to the keys and nests the keys matching `Mem.*` under the new key `NEST`.

  ```shell
  ./fluent-bit -i mem -p 'tag=mem.local' -F nest -p 'Operation=nest' -p 'Wildcard=Mem.*' -p 'Nest_under=Memstats' -p 'Remove_prefix=Mem.' -m '*' -o stdout

@@ -133,7 +129,7 @@ pipeline:
  Wildcard Mem.*
  Nest_under Memstats
  Remove_prefix Mem.
-
+
  [OUTPUT]
  Name stdout
  Match *

@@ -210,10 +206,10 @@ pipeline:
  Operation lift
  Nested_under Stats
  Remove_prefix NESTED
-
+
  [OUTPUT]
  Name stdout
- Match *
+ Match *
  ```

  {% endtab %}

@@ -228,8 +224,7 @@ pipeline:

  ## Example 3 - `nest` 3 levels deep

- This example takes the keys starting with `Mem.*` and nests them under `LAYER1`,
- which is then nested under `LAYER2`, which is nested under `LAYER3`.
+ This example takes the keys starting with `Mem.*` and nests them under `LAYER1`, which is then nested under `LAYER2`, which is nested under `LAYER3`.

  ### Deep `nest` configuration file

@@ -453,4 +448,4 @@ pipeline:
  "Lifted3_Lifted2_Lifted1_Mem.used"=>1253912,
  "Lifted3_Lifted2_Lifted1_Mem.free"=>2796996
  }
- ```
+ ```

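The "nest 3 levels deep" example unwrapped in one of the hunks above chains the same operation three times. A minimal classic-mode sketch of that idea follows, assuming the `mem` input shown earlier in the file; only the key names `LAYER1` through `LAYER3` come from the page, the rest is illustrative.

```text
[INPUT]
    Name mem
    Tag  mem.local

# Each nest filter wraps the previous result, yielding
# LAYER3 -> LAYER2 -> LAYER1 -> Mem.* keys.
[FILTER]
    Name       nest
    Match      *
    Operation  nest
    Wildcard   Mem.*
    Nest_under LAYER1

[FILTER]
    Name       nest
    Match      *
    Operation  nest
    Wildcard   LAYER1*
    Nest_under LAYER2

[FILTER]
    Name       nest
    Match      *
    Operation  nest
    Wildcard   LAYER2*
    Nest_under LAYER3

[OUTPUT]
    Name  stdout
    Match *
```
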
pipeline/filters/nightfall.md

Lines changed: 4 additions & 10 deletions
@@ -1,14 +1,8 @@
  # Nightfall

- The _Nightfall_ filter scans logs for sensitive data and redacts any sensitive
- portions. This filter supports scanning for various sensitive information, ranging
- from API keys and Personally Identifiable Information (PII) to custom regular
- expressions you define. You can configure what to scan for in the
- [Nightfall Dashboard](https://app.nightfall.ai).
+ The _Nightfall_ filter scans logs for sensitive data and redacts any sensitive portions. This filter supports scanning for various sensitive information, ranging from API keys and Personally Identifiable Information (PII) to custom regular expressions you define. You can configure what to scan for in the [Nightfall Dashboard](https://app.nightfall.ai).

- This filter isn't enabled by default in version 1.9.0 due to a typo. To enable it,
- set the flag ```-DFLB_FILTER_NIGHTFALL=ON``` when building. This is fixed for
- versions 1.9.1 and later.
+ This filter isn't enabled by default in version 1.9.0 due to a typo. To enable it, set the flag ```-DFLB_FILTER_NIGHTFALL=ON``` when building. This is fixed for versions 1.9.1 and later.

  ## Configuration parameters

@@ -44,7 +38,7 @@ pipeline:
  policy_id: 5991946b-1cc8-4c38-9240-72677029a3f7
  sampling_rate: 1
  tls.ca_path: /etc/ssl/certs
-
+
  outputs:
  - name: stdout
  match: '*'

@@ -104,4 +98,4 @@ Which results in output like:
  [0] app.log: [1644464790.280412000, {"A"=>"there is nothing sensitive here", "B"=>[{"A"=>"my credit card number is *******************"}, {"A"=>"*********** is my social security."}], "C"=>false, "D"=>"key ********************"}]
  [2022/02/09 19:47:25] [ info] [filter:nightfall:nightfall.0] Nightfall request http_do=0, HTTP Status: 200
  [0] app.log: [1644464845.675431000, {"A"=>"a very safe string"}]
- ```
+ ```

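For orientation, the YAML fragment in the middle hunk belongs to a `nightfall` filter entry. A minimal classic-mode sketch of the same pipeline follows; the `nightfall_api_key` property name and its placeholder value are assumptions, while `policy_id`, `sampling_rate`, and `tls.ca_path` are taken from the hunk above.

```text
[INPUT]
    Name  http
    Host  0.0.0.0
    Port  8000

[FILTER]
    Name               nightfall
    Match              *
    nightfall_api_key  <YOUR_NIGHTFALL_API_KEY>   # placeholder; assumed property name
    policy_id          5991946b-1cc8-4c38-9240-72677029a3f7
    sampling_rate      1
    tls.ca_path        /etc/ssl/certs

[OUTPUT]
    Name  stdout
    Match *
```
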
pipeline/filters/parser.md

Lines changed: 5 additions & 7 deletions
@@ -139,8 +139,7 @@ You can see the records `{"data":"100 0.5 true This is example"}` are parsed.

  By default, the parser plugin only keeps the parsed fields in its output.

- If you enable `Reserve_Data`, all other fields are preserved. First the contents of the corresponding parsers file,
- depending on the choice for YAML or classic configurations, would be as follows:
+ If you enable `Reserve_Data`, all other fields are preserved. First the contents of the corresponding parsers file, depending on the choice for YAML or classic configurations, would be as follows:

  {% tabs %}
  {% tab title="parsers.yaml" %}

@@ -210,7 +209,7 @@ pipeline:
  Key_Name data
  Parser dummy_test
  Reserve_Data On
-
+
  [OUTPUT]
  Name stdout
  Match *

@@ -255,8 +254,7 @@ ______ _ _ ______ _ _ ___ _____
  [0] dummy.data: [[1750325240.682903000, {}], {"INT"=>"100", "FLOAT"=>"0.5", "BOOL"=>"true", "STRING"=>"This is example", "key1"=>"value1", "key2"=>"value2"}]
  ```

- If you enable `Reserve_Data` and `Preserve_Key`, the original key field will also be preserved. First the contents of
- the corresponding parsers file, depending on the choice for YAML or classic configurations, would be as follows:
+ If you enable `Reserve_Data` and `Preserve_Key`, the original key field will also be preserved. First the contents of the corresponding parsers file, depending on the choice for YAML or classic configurations, would be as follows:

  {% tabs %}
  {% tab title="parsers.yaml" %}

@@ -328,7 +326,7 @@ pipeline:
  Parser dummy_test
  Reserve_Data On
  Preserve_Key On
-
+
  [OUTPUT]
  Name stdout
  Match *

@@ -371,4 +369,4 @@ ______ _ _ ______ _ _ ___ _____
  [0] dummy.data: [[1750325678.572817000, {}], {"INT"=>"100", "FLOAT"=>"0.5", "BOOL"=>"true", "STRING"=>"This is example", "data"=>"100 0.5 true This is example", "key1"=>"value1", "key2"=>"value2"}]
  [0] dummy.data: [[1750325679.574538000, {}], {"INT"=>"100", "FLOAT"=>"0.5", "BOOL"=>"true", "STRING"=>"This is example", "data"=>"100 0.5 true This is example", "key1"=>"value1", "key2"=>"value2"}]
  [0] dummy.data: [[1750325680.569750000, {}], {"INT"=>"100", "FLOAT"=>"0.5", "BOOL"=>"true", "STRING"=>"This is example", "data"=>"100 0.5 true This is example", "key1"=>"value1", "key2"=>"value2"}]
- ```
+ ```

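The hunks above reference a parser named `dummy_test` that yields the `INT`, `FLOAT`, `BOOL`, and `STRING` fields shown in the sample output. A classic-mode parsers file consistent with that output might look like the sketch below; the regular expression is inferred from the sample record `100 0.5 true This is example`, not taken from the commit.

```text
[PARSER]
    Name   dummy_test
    Format regex
    Regex  ^(?<INT>[^ ]+) (?<FLOAT>[^ ]+) (?<BOOL>[^ ]+) (?<STRING>.+)$
```
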
pipeline/filters/record-modifier.md

Lines changed: 2 additions & 4 deletions
@@ -16,8 +16,7 @@ The plugin supports the following configuration parameters:

  ## Get started

- To start filtering records, run the filter from the command line or through a
- configuration file.
+ To start filtering records, run the filter from the command line or through a configuration file.

  This is a sample `in_mem` record to filter.

@@ -27,8 +26,7 @@ This is a sample `in_mem` record to filter.

  ### Append fields

- The following configuration file appends a product name and hostname to a record
- using an environment variable:
+ The following configuration file appends a product name and hostname to a record using an environment variable:

  {% tabs %}
  {% tab title="fluent-bit.yaml" %}

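The "Append fields" paragraph unwrapped above normally pairs with a configuration along these lines. A minimal classic-mode sketch; the `Record` values and the `${HOSTNAME}` environment variable are illustrative assumptions.

```text
[INPUT]
    Name mem
    Tag  mem.local

[FILTER]
    Name   record_modifier
    Match  *
    Record hostname ${HOSTNAME}    # appends the HOSTNAME environment variable
    Record product  Awesome_Tool   # appends a static product name

[OUTPUT]
    Name  stdout
    Match *
```
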
pipeline/filters/rewrite-tag.md

Lines changed: 2 additions & 2 deletions
@@ -2,7 +2,7 @@
  description: Powerful and flexible routing
  ---

- # Rewrite Tag
+ # Rewrite tag

  Tags make [routing](../../concepts/data-pipeline/router.md) possible. Tags are set in the configuration of the `INPUT` definitions where the records are generated. There are scenarios where you might want to modify the tag in the pipeline to perform more advanced and flexible routing.

@@ -253,4 +253,4 @@ The records generated are handled by the internal emitter, so the new records ar

  The _Emitter_ is an internal Fluent Bit plugin that allows other components of the pipeline to emit custom records. On this case `rewrite_tag` creates an emitter instance to use it exclusively to emit records, allowing for granular control of who is emitting what.

- Change the Emitter name in the metrics by adding the `Emitter_Name` configuration property described previously.
+ Change the Emitter name in the metrics by adding the `Emitter_Name` configuration property described previously.

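As a companion to the `Emitter_Name` note in the last hunk, a minimal classic-mode sketch of a `rewrite_tag` filter follows; the rule pattern, tags, and emitter name are illustrative assumptions.

```text
[FILTER]
    Name          rewrite_tag
    Match         app.log
    # Rule format: <key> <regex> <new_tag> <keep original record: true|false>
    Rule          $level ^(error)$ error_logs false
    Emitter_Name  re_emitted
```
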