15 changes: 15 additions & 0 deletions pipeline/processors/README.md
@@ -1,2 +1,17 @@
# Processors

Processors are components that can modify, transform, or enhance data records as they flow through the Fluent Bit pipeline.

## Available processors

Fluent Bit offers the following processors:

- [Content Modifier](content-modifier.md): Manipulate the content, metadata, and attributes of log and trace records
- [Labels](labels.md): Add, update, or delete labels in records
- [Metrics Selector](metrics-selector.md): Select specific metrics
- [OpenTelemetry Envelope](opentelemetry-envelope.md): Convert logs to OpenTelemetry format
- [SQL](sql.md): Process records using SQL queries

## Features

- [Conditional Processing](conditional-processing.md): Apply processors only to records that meet specific conditions
207 changes: 207 additions & 0 deletions pipeline/processors/conditional-processing.md
@@ -0,0 +1,207 @@
# Conditional processing

Conditional processing allows you to selectively apply processors to log records based on field values. This feature enables you to create processing pipelines that apply processors only to records that match specific criteria.

## Configuration format

Conditional processing is available for processors in the YAML configuration format. To apply a processor conditionally, add a `condition` block to the processor's configuration:

```yaml
- name: processor_name
  # Regular processor configuration...
  condition:
    op: and|or
    rules:
      - field: "$field_name"
        op: comparison_operator
        value: comparison_value
      # Additional rules...
```

### Condition operators

The `op` field in the condition block specifies the logical operator to apply across all rules:

| Operator | Description |
| --- | --- |
| `and` | All rules must evaluate to true for the condition to be true |
| `or` | At least one rule must evaluate to true for the condition to be true |

### Rules

Each rule consists of:

- `field`: The field to evaluate. Must use [record accessor syntax](/administration/configuring-fluent-bit/classic-mode/record-accessor.md) with a `$` prefix.
- `op`: The comparison operator
- `value`: The value to compare against

### Comparison operators

The following comparison operators are supported:

| Operator | Description |
| --- | --- |
| `eq` | Equal to |
| `neq` | Not equal to |
| `gt` | Greater than |
| `lt` | Less than |
| `gte` | Greater than or equal to |
| `lte` | Less than or equal to |
| `regex` | Matches regular expression |
| `not_regex` | Does not match regular expression |
| `in` | Value is in the specified array |
| `not_in` | Value is not in the specified array |

### Field access

You can access record fields using [record accessor syntax](/administration/configuring-fluent-bit/classic-mode/record-accessor.md):

- Simple fields: `$field`
- Nested fields: `$parent['child']['subchild']`
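
For example, a rule can match on a nested field by using the bracketed accessor path in `field`. The following sketch (the `app_meta` record layout in the `dummy` input is hypothetical, chosen only to illustrate nested access) flags records whose nested `region` field equals `us-east-1`:

```yaml
pipeline:
  inputs:
    - name: dummy
      dummy: '{"app_meta": {"region": "us-east-1", "env": "prod"}}'
      tag: meta.log
      processors:
        logs:
          - name: content_modifier
            action: insert
            key: in_primary_region
            value: true
            condition:
              op: and
              rules:
                - field: "$app_meta['region']"
                  op: eq
                  value: "us-east-1"
```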

## Examples

### Simple condition

Process records only when the HTTP method is POST:

```yaml
pipeline:
  inputs:
    - name: dummy
      dummy: '{"request": {"method": "GET", "path": "/api/v1/resource"}}'
      tag: request.log
      processors:
        logs:
          - name: content_modifier
            action: insert
            key: modified_if_post
            value: true
            condition:
              op: and
              rules:
                - field: "$request['method']"
                  op: eq
                  value: "POST"
```

### Multiple conditions with AND

Apply a processor only when both conditions are met:

```yaml
pipeline:
  inputs:
    - name: dummy
      dummy: '{"request": {"method": "POST", "path": "/api/v1/sensitive-data"}}'
      tag: request.log
      processors:
        logs:
          - name: content_modifier
            action: insert
            key: requires_audit
            value: true
            condition:
              op: and
              rules:
                - field: "$request['method']"
                  op: eq
                  value: "POST"
                - field: "$request['path']"
                  op: regex
                  value: "\/sensitive-.*"
```

### OR condition example

Flag records that meet any of multiple criteria:

```yaml
pipeline:
  inputs:
    - name: dummy
      dummy: '{"request": {"method": "GET", "path": "/api/v1/resource", "status_code": 200, "response_time": 150}}'
      tag: request.log
      processors:
        logs:
          - name: content_modifier
            action: insert
            key: requires_performance_check
            value: true
            condition:
              op: or
              rules:
                - field: "$request['response_time']"
                  op: gt
                  value: 100
                - field: "$request['status_code']"
                  op: gte
                  value: 400
```

### Using the `in` operator

Apply a processor when a value matches one of multiple options:

```yaml
pipeline:
  inputs:
    - name: dummy
      dummy: '{"request": {"method": "GET", "path": "/api/v1/resource"}}'
      tag: request.log
      processors:
        logs:
          - name: content_modifier
            action: insert
            key: high_priority_method
            value: true
            condition:
              op: and
              rules:
                - field: "$request['method']"
                  op: in
                  value: ["POST", "PUT", "DELETE"]
```

## Multiple processors with conditions

You can chain multiple conditional processors to create advanced processing pipelines:

```yaml
pipeline:
  inputs:
    - name: dummy
      dummy: '{"log": "Error: Connection refused", "level": "error", "service": "api-gateway"}'
      tag: app.log
      processors:
        logs:
          - name: content_modifier
            action: insert
            key: alert
            value: true
            condition:
              op: and
              rules:
                - field: "$level"
                  op: eq
                  value: "error"
                - field: "$service"
                  op: in
                  value: ["api-gateway", "authentication", "database"]

          - name: content_modifier
            action: insert
            key: paging_required
            value: true
            condition:
              op: and
              rules:
                - field: "$log"
                  op: regex
                  value: "(?i)(connection refused|timeout|crash)"
                - field: "$level"
                  op: in
                  value: ["error", "fatal"]
```

This configuration adds the `alert` field to error logs from critical services, and adds the `paging_required` field to errors containing specific critical patterns.