206 changes: 121 additions & 85 deletions administration/configuring-fluent-bit/yaml/pipeline-section.md
The `pipeline` section defines the flow of how data is collected, processed, and delivered to its final destination.

## Example configuration

{% hint style="info" %}

**Note:** Processors can be enabled only in the YAML configuration format. The classic configuration format
doesn't support processors.

{% endhint %}

Here's an example of a pipeline configuration:

{% tabs %}
{% tab title="fluent-bit.yaml" %}

```yaml
pipeline:
  inputs:
    - name: tail
      path: /var/log/example.log
      parser: json

      processors:
        logs:
          - name: record_modifier

  filters:
    - name: grep
      match: '*'
      regex: key pattern

  outputs:
    - name: stdout
      match: '*'
```

{% endtab %}
{% endtabs %}

## Pipeline processors

Processors operate on specific signals such as logs, metrics, and traces. They're attached to an input plugin and must specify the signal type they will process.

In the following example, the `content_modifier` processor inserts or updates (upserts) the key `my_new_key` with the value `123` for all log records generated by the tail plugin. This processor is only applied to log signals:

{% tabs %}
{% tab title="fluent-bit.yaml" %}

```yaml
parsers:
  - name: json
    format: json

pipeline:
  inputs:
    - name: tail
      path: /var/log/example.log
      parser: json

      processors:
        logs:
          - name: content_modifier
            action: upsert
            key: my_new_key
            value: 123

  filters:
    - name: grep
      match: '*'
      regex: key pattern

  outputs:
    - name: stdout
      match: '*'
```

{% endtab %}
{% endtabs %}

Here is a more complete example with multiple processors:

{% tabs %}
{% tab title="fluent-bit.yaml" %}

```yaml
service:
  log_level: info
  http_server: on
  http_listen: 0.0.0.0
  http_port: 2021

pipeline:
  inputs:
    - name: random
      tag: test-tag
      interval_sec: 1

      processors:
        logs:
          - name: modify
            add: hostname monox

          - name: lua
            call: append_tag
            code: |
              function append_tag(tag, timestamp, record)
                new_record = record
                new_record["tag"] = tag
                return 1, timestamp, new_record
              end

  outputs:
    - name: stdout
      match: '*'

      processors:
        logs:
          - name: lua
            call: add_field
            code: |
              function add_field(tag, timestamp, record)
                new_record = record
                new_record["output"] = "new data"
                return 1, timestamp, new_record
              end
```

{% endtab %}
{% endtabs %}

Processors can be attached to inputs and outputs.
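
Processors also aren't limited to logs. Here's a minimal sketch of a processor attached to the `metrics` signal instead; it assumes the `fluentbit_metrics` input and the `labels` processor are available in your Fluent Bit build, and that `labels` takes `insert: <key> <value>` pairs (check the processor's reference page for the exact syntax):

{% tabs %}
{% tab title="fluent-bit.yaml" %}

```yaml
pipeline:
  inputs:
    # Scrape Fluent Bit's own internal metrics every two seconds.
    - name: fluentbit_metrics
      tag: internal_metrics
      scrape_interval: 2

      processors:
        metrics:
          # Add a static label to every scraped metric.
          - name: labels
            insert: source fluentbit

  outputs:
    - name: stdout
      match: '*'
```

{% endtab %}
{% endtabs %}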

### How Processors are different from Filters
You can configure existing [Filters](https://docs.fluentbit.io/manual/pipeline/filters) to run as processors.

In the following example, the `grep` filter is used as a processor to filter log events based on a pattern:

{% tabs %}
{% tab title="fluent-bit.yaml" %}

```yaml
parsers:
  - name: json
    format: json

pipeline:
  inputs:
    - name: tail
      path: /var/log/example.log
      parser: json

      processors:
        logs:
          - name: grep
            regex: log aa

  outputs:
    - name: stdout
      match: '*'
```

{% endtab %}
{% endtabs %}
9 changes: 8 additions & 1 deletion pipeline/processors/README.md
Unlike filters, processors are tightly coupled to inputs, which means they execute immediately and avoid creating a performance bottleneck.
Additionally, filters can be implemented in a way that mimics the behavior of
processors, but processors can't be implemented in a way that mimics filters.

{% hint style="info" %}

**Note:** Processors can be enabled only in the YAML configuration format. The classic configuration format
doesn't support processors.

{% endhint %}

## Available processors

Fluent Bit offers the following processors:
Compatible processors include the following features:

- [Conditional Processing](conditional-processing.md): Selectively apply processors
  to logs based on the value of fields that those logs contain, as sketched below.
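
A minimal sketch of conditional processing, assuming the `content_modifier` processor's `condition` syntax with `op` and `rules` (the `dummy` input, field names, and values here are purely illustrative):

{% tabs %}
{% tab title="fluent-bit.yaml" %}

```yaml
pipeline:
  inputs:
    # Emit a fixed JSON record once per second for demonstration.
    - name: dummy
      dummy: '{"request": {"method": "POST", "path": "/api/v1/resource"}}'

      processors:
        logs:
          # Insert the key only when request.method equals POST.
          - name: content_modifier
            action: insert
            key: flagged
            value: 'true'
            condition:
              op: and
              rules:
                - field: "$request['method']"
                  op: eq
                  value: "POST"

  outputs:
    - name: stdout
      match: '*'
```

{% endtab %}
{% endtabs %}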