**pipeline/inputs/ebpf.md** (+1 -1)
```diff
@@ -1,4 +1,4 @@
-# Ebpf
+# eBPF
 
 {% hint style="info" %}
 This plugin is experimental and might be unstable. Use it in development or testing environments only. Its features and behavior are subject to change.
```
**pipeline/inputs/elasticsearch.md** (+4 -4)
```diff
@@ -11,12 +11,12 @@ The plugin supports the following configuration parameters:
 |`buffer_max_size`| Set the maximum size of buffer. |`4M`|
 |`buffer_chunk_size`| Set the buffer chunk size. |`512K`|
 |`tag_key`| Specify a key name for extracting as a tag. |`NULL`|
-|`meta_key`| Specify a key name for meta information. |"@meta"|
-|`hostname`| Specify hostname or fully qualified domain name. This parameter can be used for "sniffing" (auto-discovery of) cluster node information. |"localhost"|
-|`version`| Specify Elasticsearch server version. This parameter is effective for checking a version of Elasticsearch/OpenSearch server version. |"8.0.0"|
+|`meta_key`| Specify a key name for meta information. |`@meta`|
+|`hostname`| Specify the hostname or fully qualified domain name. This parameter can be used for "sniffing" (auto-discovery) of cluster node information. |`localhost`|
+|`version`| Specify the Elasticsearch server version. This parameter is used when checking the Elasticsearch/OpenSearch server version. |`8.0.0`|
 |`threaded`| Indicates whether to run this input in its own [thread](../../administration/multithreading.md#inputs). |`false`|
 
-The Elasticsearch cluster uses "sniffing" to optimize the connections between its cluster and clients. Elasticsearch can build its cluster and dynamically generate a connection list which is called "sniffing". The `hostname` will be used for sniffing information and this is handled by the sniffing endpoint.
+The Elasticsearch cluster uses "sniffing" to optimize the connections between its cluster and clients, meaning it builds its cluster and dynamically generates a connection list. The `hostname` is used for sniffing information, which is handled by the sniffing endpoint.
```
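The parameters above can be combined into a pipeline configuration. A minimal sketch follows; the `listen` address, `port`, and `stdout` output are illustrative assumptions, not values taken from this diff:

```yaml
# Hypothetical example: Fluent Bit's elasticsearch input acting as a bulk endpoint.
# listen/port and the stdout output are assumptions for illustration.
pipeline:
  inputs:
    - name: elasticsearch
      listen: 0.0.0.0
      port: 9200
      meta_key: '@meta'
      hostname: localhost
      version: 8.0.0

  outputs:
    - name: stdout
      match: '*'
```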
|`Wasm_Heap_Size`| Size of the heap for Wasm execution. Review [unit sizes](../../administration/configuring-fluent-bit/unit-sizes.md) for allowed values. |
```diff
@@ -23,11 +23,11 @@ The plugin supports the following configuration parameters:
 
 Here is a configuration example.
 
-`in_exec_wasi` can handle parsers. To retrieve from structured data from a WASM program, you must create a `parser.conf`:
+`in_exec_wasi` can handle parsers. To retrieve structured data from a Wasm program, you must create a `parser.conf`:
 
 The `Time_Format` should match the format of the timestamp you're using.
 
-This example assumes the WASM program writes JSON style strings to `stdout`.
+This example assumes the Wasm program writes JSON-style strings to `stdout`.
```
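The `parser.conf` itself falls outside the hunks shown. A minimal sketch might look like the following; the parser name and time key/format are assumptions for illustration:

```text
[PARSER]
    # Hypothetical JSON parser for the Wasm program's stdout output.
    # Name, Time_Key, and Time_Format are illustrative values.
    Name        wasi
    Format      json
    Time_Key    time
    Time_Format %Y-%m-%dT%H:%M:%S.%L %z
```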
**pipeline/inputs/kafka.md** (+22 -25)
```diff
@@ -1,4 +1,4 @@
-# Kafka Consumer
+# Kafka
 
 The _Kafka_ input plugin enables Fluent Bit to consume messages directly from one or more [Apache Kafka](https://kafka.apache.org/) topics. By subscribing to specified topics, this plugin efficiently collects and forwards Kafka messages for further processing within your Fluent Bit pipeline.
```
```diff
@@ -22,7 +22,7 @@ This plugin uses the official [librdkafka C library](https://github.com/edenhill
 
 ## Get started
 
-To subscribe to or collect messages from Apache Kafka, run the plugin from the command line or through the configuration file as shown below.
+To subscribe to or collect messages from Apache Kafka, run the plugin from the command line or through the configuration file as shown in the following examples.
 
 ### Command line
```
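The configuration-file example referenced here falls outside the hunks shown. A minimal sketch could look like this; the broker address and topic name are placeholders:

```yaml
# Hypothetical minimal Kafka input pipeline; broker and topic are placeholders.
pipeline:
  inputs:
    - name: kafka
      brokers: 192.168.1.3:9092
      topics: some-topic

  outputs:
    - name: stdout
      match: '*'
```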
````diff
@@ -132,41 +132,34 @@ Every message received is then processed with `kafka.lua` and sent back to the `
 
 The example can be executed locally with `make start` in the `examples/kafka_filter` directory (`docker/compose` is used).
 
-## AWS MSK IAM Authentication
+## AWS MSK IAM authentication
 
-*Available since Fluent Bit v4.0.4*
-
-Fluent Bit supports authentication to Amazon MSK (Managed Streaming for Apache Kafka) clusters using AWS IAM. This allows you to securely connect to MSK brokers with AWS credentials, leveraging IAM roles and policies for access control.
-
-### Prerequisites
-
-**Build Requirements**
+Fluent Bit v4.0.4 and later supports authentication to Amazon MSK (Managed Streaming for Apache Kafka) clusters using AWS IAM. This lets you securely connect to MSK brokers with AWS credentials, leveraging IAM roles and policies for access control.
+
+### Build requirements
 
 If you are compiling Fluent Bit from source, ensure the following requirements are met to enable AWS MSK IAM support:
 
 - The packages `libsasl2` and `libsasl2-dev` must be installed on your build environment.
 
-**Runtime Requirements**
+### Runtime requirements
 
-**Network Access:** Fluent Bit must be able to reach your MSK broker endpoints (AWS VPC setup).
-
-**AWS Credentials:** Provide credentials using any supported AWS method:
+**AWS Credentials:** Provide these AWS credentials using any supported AWS method. These credentials are discovered by default when the `aws_msk_iam` flag is enabled.
 
-|`aws_msk_iam`| Enable AWS MSK IAM authentication| Boolean | No (default: false) |
-|`aws_msk_iam_cluster_arn`| Full ARN of the MSK cluster for region extraction| String | Yes (if `aws_msk_iam` is true)|
+| Property | Description | Default |
+|--------|-----------|--------|
+|`aws_msk_iam`| If `true`, enables AWS MSK IAM authentication. Possible values: `true`, `false`. |`false`|
+|`aws_msk_iam_cluster_arn`| Full ARN of the MSK cluster for region extraction. This value is required if `aws_msk_iam` is `true`. |_none_|
 
-### Configuration Example
+### Configuration example
 
 ```yaml
 pipeline:
````
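The configuration example above is split across diff hunks, so its middle is not shown. A complete hedged sketch might look like the following; the broker host, topic, and cluster ARN are placeholders:

```yaml
# Hypothetical Kafka input with AWS MSK IAM authentication enabled.
# Broker endpoint, topic, and ARN are placeholder values.
pipeline:
  inputs:
    - name: kafka
      brokers: my-cluster.example.us-east-1.amazonaws.com:9098
      topics: my-topic
      aws_msk_iam: true
      aws_msk_iam_cluster_arn: arn:aws:kafka:us-east-1:123456789012:cluster/my-cluster/abc-123

  outputs:
    - name: stdout
      match: '*'
```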
````diff
@@ -182,9 +175,13 @@ pipeline:
       match: '*'
 ```
 
-### Example AWS IAM Policy
+### Example AWS IAM policy
+
+{% hint style="info" %}
 
-> **Note:** IAM policies and permissions can be complex and may vary depending on your organization's security requirements. If you are unsure about the correct permissions or best practices, please consult with your AWS administrator or an AWS expert who is familiar with MSK and IAM security.
+IAM policies and permissions can be complex and might vary depending on your organization's security requirements. If you are unsure about the correct permissions or best practices, consult your AWS administrator or an AWS expert who is familiar with MSK and IAM security.
+
+{% endhint %}
 
 The AWS credentials used by Fluent Bit must have permission to connect to your MSK cluster. Here is a minimal example policy:
````
```diff
@@ -206,4 +203,4 @@ The AWS credentials used by Fluent Bit must have permission to connect to your M
```