Commit a07428d (parent e360988)

Inputs: fix remaining vale/markdownlint errors

Signed-off-by: Alexa Kreizinger <[email protected]>

23 files changed: +135 -118 lines

README.md

Lines changed: 1 addition & 1 deletion

@@ -24,7 +24,7 @@ description: High Performance Telemetry Agent for Logs, Metrics and Traces
 - Connect nearly any source to nearly any destination using preexisting plugins
 - Extensibility:
   - Write input, filter, or output plugins in the C language
-  - WASM: [WASM Filter Plugins](development/wasm-filter-plugins.md) or [WASM Input Plugins](development/wasm-input-plugins.md)
+  - Wasm: [Wasm Filter Plugins](development/wasm-filter-plugins.md) or [Wasm Input Plugins](development/wasm-input-plugins.md)
 - [Monitoring](administration/monitoring.md): Expose internal metrics over HTTP in JSON and [Prometheus](https://prometheus.io/) format
 - [Stream Processing](stream-processing/introduction.md): Perform data selection and transformation using simple SQL queries

SUMMARY.md

Lines changed: 1 addition & 1 deletion

@@ -98,7 +98,7 @@
 * [Docker events](pipeline/inputs/docker-events.md)
 * [Docker metrics](pipeline/inputs/docker-metrics.md)
 * [Dummy](pipeline/inputs/dummy.md)
-* [Ebpf](pipeline/inputs/ebpf.md)
+* [eBPF](pipeline/inputs/ebpf.md)
 * [Elasticsearch](pipeline/inputs/elasticsearch.md)
 * [Exec WASI](pipeline/inputs/exec-wasi.md)
 * [Exec](pipeline/inputs/exec.md)

pipeline/inputs/disk-io-metrics.md

Lines changed: 1 addition & 1 deletion

@@ -78,4 +78,4 @@ pipeline:

 Total interval (sec) = `Interval_Sec` + (`Interval_Nsec` / 1000000000)

-For example: `1.5s` = `1s` + `500000000ns`
+For example: `1.5s` = `1s` + `500000000ns`
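The interval arithmetic above can be sketched as a configuration fragment. The following is a hypothetical example, assuming the `disk` input's lowercase YAML keys `interval_sec` and `interval_nsec`, that yields the 1.5 second interval from the example:

```yaml
# Sketch: total interval = 1 s + (500000000 ns / 1000000000) = 1.5 s
pipeline:
  inputs:
    - name: disk
      interval_sec: 1
      interval_nsec: 500000000

  outputs:
    - name: stdout
      match: '*'
```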

pipeline/inputs/ebpf.md

Lines changed: 1 addition & 1 deletion

@@ -1,4 +1,4 @@
-# Ebpf
+# eBPF

 {% hint style="info" %}
 This plugin is experimental and might be unstable. Use it in development or testing environments only. Its features and behavior are subject to change.

pipeline/inputs/elasticsearch.md

Lines changed: 4 additions & 4 deletions

@@ -11,12 +11,12 @@ The plugin supports the following configuration parameters:
 | `buffer_max_size` | Set the maximum size of buffer. | `4M` |
 | `buffer_chunk_size` | Set the buffer chunk size. | `512K` |
 | `tag_key` | Specify a key name for extracting as a tag. | `NULL` |
-| `meta_key` | Specify a key name for meta information. | "@meta" |
-| `hostname` | Specify hostname or fully qualified domain name. This parameter can be used for "sniffing" (auto-discovery of) cluster node information. | "localhost" |
-| `version` | Specify Elasticsearch server version. This parameter is effective for checking a version of Elasticsearch/OpenSearch server version. | "8.0.0" |
+| `meta_key` | Specify a key name for meta information. | `@meta` |
+| `hostname` | Specify the hostname or fully qualified domain name. This parameter can be used for "sniffing" (auto-discovery of) cluster node information. | `localhost` |
+| `version` | Specify the Elasticsearch server version. This parameter is used to check the version of the Elasticsearch/OpenSearch server. | `8.0.0` |
 | `threaded` | Indicates whether to run this input in its own [thread](../../administration/multithreading.md#inputs). | `false` |

-The Elasticsearch cluster uses "sniffing" to optimize the connections between its cluster and clients. Elasticsearch can build its cluster and dynamically generate a connection list which is called "sniffing". The `hostname` will be used for sniffing information and this is handled by the sniffing endpoint.
+The Elasticsearch cluster uses "sniffing" to optimize the connections between its cluster and clients, which means it builds its cluster and dynamically generates a connection list. The `hostname` is used for sniffing information, and this is handled by the sniffing endpoint.

 ## Get started
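A minimal sketch of how these parameters might appear in a YAML pipeline. The `listen` and `port` values are illustrative assumptions, not values from this commit:

```yaml
# Sketch: Elasticsearch input with the parameters discussed above
pipeline:
  inputs:
    - name: elasticsearch
      listen: 0.0.0.0      # assumed bind address
      port: 9200           # assumed port
      meta_key: '@meta'
      hostname: localhost
      version: '8.0.0'

  outputs:
    - name: stdout
      match: '*'
```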

pipeline/inputs/exec-wasi.md

Lines changed: 3 additions & 3 deletions

@@ -10,7 +10,7 @@ The plugin supports the following configuration parameters:
 |:-------------------|:---------------------------------------------------------------------------------------------------------------------------------------------|
 | `WASI_Path` | The location of a Wasm program file. |
 | `Parser` | Specify the name of a parser to interpret the entry as a structured message. |
-| `Accessible_Paths` | Specify the allowed list of paths to be able to access paths from WASM programs. |
+| `Accessible_Paths` | Specify the list of paths that Wasm programs are allowed to access. |
 | `Interval_Sec` | Polling interval (seconds). |
 | `Interval_NSec` | Polling interval (nanosecond). |
 | `Wasm_Heap_Size` | Size of the heap size of Wasm execution. Review [unit sizes](../../administration/configuring-fluent-bit/unit-sizes.md) for allowed values. |

@@ -23,11 +23,11 @@ The plugin supports the following configuration parameters:

 Here is a configuration example.

-`in_exec_wasi` can handle parsers. To retrieve from structured data from a WASM program, you must create a `parser.conf`:
+`in_exec_wasi` can handle parsers. To retrieve structured data from a Wasm program, you must create a `parser.conf`:

 The `Time_Format` should match the format of the timestamp you're using.

-This example assumes the WASM program writes JSON style strings to `stdout`.
+This example assumes the Wasm program writes JSON-style strings to `stdout`.

 {% tabs %}
 {% tab title="parsers.yaml" %}
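The parser setup described in this hunk might look like the following sketch. The parser name, time key, and time format are illustrative assumptions, not content from this commit:

```yaml
# parsers.yaml (sketch): parse JSON written to stdout by the Wasm program
parsers:
  - name: wasi            # assumed parser name, referenced by the input's Parser option
    format: json
    time_key: time        # assumed timestamp key in the JSON output
    time_format: '%Y-%m-%dT%H:%M:%S.%L %z'
```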

pipeline/inputs/exec.md

Lines changed: 1 addition & 1 deletion

@@ -194,4 +194,4 @@ The previous script would be safer if written with:
     -p command='echo '"$(printf '%q' "$@")" \
 ```

-It's generally best to avoid dynamically generating the command or handling untrusted arguments.
+It's generally best to avoid dynamically generating the command or handling untrusted arguments.

pipeline/inputs/forward.md

Lines changed: 2 additions & 2 deletions

@@ -57,7 +57,7 @@ pipeline:
       port: 24224
       buffer_chunk_size: 1M
       buffer_max_size: 6M
-
+
   outputs:
     - name: stdout
       match: '*'

@@ -82,7 +82,7 @@ pipeline:
 {% endtab %}
 {% endtabs %}

-## Fluent Bit and Secure Forward Setup
+## Fluent Bit and secure forward setup

 In Fluent Bit v3 or later, `in_forward` can handle secure forward protocol.
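A minimal sketch of a secure forward setup for `in_forward`. The `shared_key` and `self_hostname` parameter names are assumptions based on the secure forward protocol, not values from this commit:

```yaml
# Sketch: in_forward with an assumed secure forward handshake
pipeline:
  inputs:
    - name: forward
      listen: 0.0.0.0
      port: 24224
      shared_key: secret              # assumed parameter: key shared with the client
      self_hostname: fluentbit.local  # assumed parameter: hostname used in the handshake

  outputs:
    - name: stdout
      match: '*'
```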

pipeline/inputs/http.md

Lines changed: 3 additions & 3 deletions

@@ -51,7 +51,7 @@ pipeline:
     - name: http
       listen: 0.0.0.0
       port: 8888
-
+
   outputs:
     - name: stdout
       match: app.log

@@ -171,7 +171,7 @@ The `success_header` parameter lets you set multiple HTTP headers on success. Th
 pipeline:
   inputs:
     - name: http
-      success_header:
+      success_header:
        - X-Custom custom-answer
        - X-Another another-answer
 ```

@@ -233,4 +233,4 @@ pipeline:

 ```shell
 fluent-bit -i http -p port=8888 -o stdout
-```
+```

pipeline/inputs/kafka.md

Lines changed: 22 additions & 25 deletions

@@ -1,4 +1,4 @@
-# Kafka Consumer
+# Kafka

 The _Kafka_ input plugin enables Fluent Bit to consume messages directly from one or more [Apache Kafka](https://kafka.apache.org/) topics. By subscribing to specified topics, this plugin efficiently collects and forwards Kafka messages for further processing within your Fluent Bit pipeline.

@@ -22,7 +22,7 @@ This plugin uses the official [librdkafka C library](https://github.com/edenhill

 ## Get started

-To subscribe to or collect messages from Apache Kafka, run the plugin from the command line or through the configuration file as shown below.
+To subscribe to or collect messages from Apache Kafka, run the plugin from the command line or through the configuration file as shown in the following examples.

 ### Command line

@@ -132,41 +132,34 @@ Every message received is then processed with `kafka.lua` and sent back to the `

 The example can be executed locally with `make start` in the `examples/kafka_filter` directory (`docker/compose` is used).

-## AWS MSK IAM Authentication
+## AWS MSK IAM authentication

-*Available since Fluent Bit v4.0.4*
-
-Fluent Bit supports authentication to Amazon MSK (Managed Streaming for Apache Kafka) clusters using AWS IAM. This allows you to securely connect to MSK brokers with AWS credentials, leveraging IAM roles and policies for access control.
-
-### Prerequisites
-
-**Build Requirements**
+Fluent Bit v4.0.4 and later supports authentication to Amazon MSK (Managed Streaming for Apache Kafka) clusters using AWS IAM. This lets you securely connect to MSK brokers with AWS credentials, leveraging IAM roles and policies for access control.
+
+### Build requirements

 If you are compiling Fluent Bit from source, ensure the following requirements are met to enable AWS MSK IAM support:

 - The packages `libsasl2` and `libsasl2-dev` must be installed on your build environment.

-**Runtime Requirements**
+### Runtime requirements
+
 - **Network Access:** Fluent Bit must be able to reach your MSK broker endpoints (AWS VPC setup).
-- **AWS Credentials:** Provide credentials using any supported AWS method:
+- **AWS Credentials:** Provide these AWS credentials using any supported AWS method. These credentials are discovered by default when the `aws_msk_iam` flag is enabled.
   - IAM roles (recommended for EC2, ECS, or EKS)
   - Environment variables (`AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`)
   - AWS credentials file (`~/.aws/credentials`)
   - Instance metadata service (IMDS)
+- **IAM Permissions:** The credentials must allow access to the target MSK cluster, as shown in the following example policy.

-Note these credentials are discovery by default when `aws_msk_iam` flag is enabled.
-
-- **IAM Permissions:** The credentials must allow access to the target MSK cluster (see example policy below).
-
-### Configuration Parameters
+### Configuration parameters [#config-aws]

-| Property | Description | Type | Required |
-|---------------------------|---------------------------------------------------|---------|--------------------------------|
-| `aws_msk_iam` | Enable AWS MSK IAM authentication | Boolean | No (default: false) |
-| `aws_msk_iam_cluster_arn` | Full ARN of the MSK cluster for region extraction | String | Yes (if `aws_msk_iam` is true) |
+| Property | Description | Default |
+| -------- | ----------- | ------- |
+| `aws_msk_iam` | If `true`, enables AWS MSK IAM authentication. Possible values: `true`, `false`. | `false` |
+| `aws_msk_iam_cluster_arn` | Full ARN of the MSK cluster for region extraction. This value is required if `aws_msk_iam` is `true`. | _none_ |

-### Configuration Example
+### Configuration example

 ```yaml
 pipeline:

@@ -182,9 +175,13 @@ pipeline:
     match: '*'
 ```

-### Example AWS IAM Policy
+### Example AWS IAM policy
+
+{% hint style="info" %}

-> **Note:** IAM policies and permissions can be complex and may vary depending on your organization's security requirements. If you are unsure about the correct permissions or best practices, please consult with your AWS administrator or an AWS expert who is familiar with MSK and IAM security.
+IAM policies and permissions can be complex and might vary depending on your organization's security requirements. If you are unsure about the correct permissions or best practices, consult your AWS administrator or an AWS expert who is familiar with MSK and IAM security.
+
+{% endhint %}

 The AWS credentials used by Fluent Bit must have permission to connect to your MSK cluster. Here is a minimal example policy:

@@ -206,4 +203,4 @@ The AWS credentials used by Fluent Bit must have permission to connect to your M
     }
   ]
 }
-```
+```
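The configuration example in this hunk is truncated, so here is a hedged sketch of a Kafka input using the MSK IAM parameters from the table above. The broker endpoint, topic name, and cluster ARN are placeholders, not values from this commit:

```yaml
# Sketch: Kafka input authenticating to MSK via IAM (placeholder values)
pipeline:
  inputs:
    - name: kafka
      brokers: b-1.mycluster.abc123.kafka.us-east-1.amazonaws.com:9098
      topics: my-topic
      aws_msk_iam: true
      aws_msk_iam_cluster_arn: arn:aws:kafka:us-east-1:111122223333:cluster/my-cluster/uuid

  outputs:
    - name: stdout
      match: '*'
```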
