@@ -7,21 +7,15 @@ description: Learn about the YAML configuration file used by Fluent Bit
<img referrerpolicy="no-referrer-when-downgrade"
src="https://static.scarf.sh/a.png?x-pxid=864c6f0e-8977-4838-8772-84416943548e" alt="" />

One of the ways to configure Fluent Bit is using a YAML configuration file that works at a global scope.
One of the ways to configure Fluent Bit is using a YAML configuration file that works at a global scope. These YAML configuration files support the following sections:

The YAML configuration file supports the following sections:

- `Env`
- `Includes`
- `Service`
- `Pipeline`
- `Inputs`
- `Filters`
- `Outputs`

The YAML configuration file doesn't support the following sections:

- `Parsers`
- `env`
- `includes`
- `service`
- `pipeline`
- `inputs`
- `outputs`
- `filters`
Comment on lines +12 to +18
Contributor


Parsers can be handled in YAML-style configurations.

This is an example YAML configuration, and it's valid for the format:

env:
  FLB_ENV: "1"

parsers:
  - name: json
    format: json

plugins:
  - /opt/fluent-bit/plugins/custom.so

service:
  flush: 1
  log_level: info

pipeline:
  inputs:
    - name: tail
      path: /var/log/syslog
  outputs:
    - name: stdout

Member Author


🤔 what does an example configuration like this do? does it apply a JSON parser to all inputs?

Contributor

@cosmo0920 Nov 5, 2025


I mean that parsers, multiline_parsers, and plugins can be handled in the same YAML configuration.
In the classic format, parsers, multiline_parsers, and plugins have to be referenced with parsers_file and plugins_file and loaded as separate files.
This could be an advantage of the YAML format for Fluent Bit configurations.
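For contrast, here's a minimal sketch of how the classic format would reference the same definitions through separate files (the file names are only illustrative):

```text
# fluent-bit.conf (classic format): parsers and plugins live in separate files
[SERVICE]
    Flush        1
    Log_Level    info
    Parsers_File parsers.conf
    Plugins_File plugins.conf

[INPUT]
    Name tail
    Path /var/log/syslog

[OUTPUT]
    Name stdout
```

```text
# parsers.conf: referenced from Parsers_File above
[PARSER]
    Name   json
    Format json
```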

Member Author


ah, that makes sense :) so those are optional sections you can use to define custom parsers/plugins/etc

of the other sections that you can include in a YAML config, are any of them required? I'm wondering if it would be helpful to specify which ones you always need to include versus which ones are optional.

Contributor


The suggested sections are all optional, so they don't need to be included in every configuration.


{% hint style="info" %}
YAML configuration is used in the smoke tests for containers, so an up-to-date, working example is always available here: <https://github.com/fluent/fluent-bit/blob/master/packaging/testing/smoke/container/fluent-bit.yaml>.
@@ -71,7 +65,7 @@ The `service` section defines the global properties of the service. The Service
| `dns.mode` | Sets the primary transport layer protocol used by the asynchronous DNS resolver, which can be overridden on a per plugin basis | `UDP` |
| `log_file` | Absolute path for an optional log file. By default, all logs are redirected to the standard error interface (`stderr`). | _none_ |
| `log_level` | Set the logging verbosity level. Allowed values are: `off`, `error`, `warn`, `info`, `debug`, and `trace`. Values are cumulative. For example, if `debug` is set, it will include `error`, `warn`, `info`, and `debug`. `trace` mode is only available if Fluent Bit was built with the `WITH_TRACE` option enabled. | `info` |
| `parsers_file` | Path for a `parsers` configuration file. Only a single entry is supported. | _none_ |
| `parsers_file` | Path for a file that defines custom parsers. Only a single entry is supported. | _none_ |
| `plugins_file` | Path for a `plugins` configuration file. A `plugins` configuration file allows the definition of paths for external plugins; for an example, [see here](https://github.com/fluent/fluent-bit/blob/master/conf/plugins.conf). | _none_ |
| `streams_file` | Path for the Stream Processor configuration file. Learn more about [Stream Processing configuration](../../../stream-processing/introduction.md). | _none_ |
| `http_server` | Enable built-in HTTP server. | `Off` |
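As an illustration, a minimal `service` section in YAML using a few of the keys above might look like this (paths and values are examples only):

```yaml
service:
  flush: 1
  log_level: info
  log_file: /var/log/fluent-bit.log
  parsers_file: parsers.yaml
  http_server: on
```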
@@ -1,22 +1,21 @@
# Pipeline

The `pipeline` section defines the flow of how data is collected, processed, and sent to its final destination. It encompasses the following core concepts:
The `pipeline` section defines the flow of how data is collected, processed, and sent to its final destination. This section contains three subsections, `inputs`, `filters`, and `outputs`, along with `processors`, which are attached directly to individual plugins rather than defined as a standalone subsection:

| Name | Description |
| ---- | ----------- |
| `inputs` | Specifies the name of the plugin responsible for collecting or receiving data. This component serves as the data source in the pipeline. Examples of input plugins include `tail`, `http`, and `random`. |
| `processors` | **Unique to YAML configuration**, processors are specialized plugins that handle data processing directly attached to input plugins. Unlike filters, processors aren't dependent on tag or matching rules. Instead, they work closely with the input to modify or enrich the data before it reaches the filtering or output stages. Processors are defined within an input plugin section. |
| `filters` | Filters are used to transform, enrich, or discard events based on specific criteria. They allow matching tags using strings or regular expressions, providing a more flexible way to manipulate data. Filters run as part of the main event loop and can be applied across multiple inputs and filters. Examples of filters include `modify`, `grep`, and `nest`. |
| `outputs` | Defines the destination for processed data. Outputs specify where the data will be sent, such as to a remote server, a file, or another service. Each output plugin is configured with matching rules to determine which events are sent to that destination. Common output plugins include `stdout`, `elasticsearch`, and `kafka`. |

## Example configuration

{% hint style="info" %}

**Note:** Processors can be enabled only by using the YAML configuration format. Classic mode configuration format doesn't support processors.
Unlike filters, processors and parsers aren't defined within a unified section of YAML configuration files and don't use tag matching. Instead, each input or output defined in the configuration file can have a `parsers` key and `processors` key to configure the parsers and processors for that specific plugin.

{% endhint %}
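As a concrete illustration of the per-plugin keys described above, here's a minimal sketch of an input with an attached processor (the `content_modifier` processor and its parameters are shown only as one possible example):

```yaml
pipeline:
  inputs:
    - name: tail
      path: /var/log/syslog
      processors:
        logs:
          - name: content_modifier
            action: insert
            key: source
            value: syslog
  outputs:
    - name: stdout
      match: '*'
```

The processor runs as part of the `tail` input itself, so records are modified before tag matching, filters, or outputs are involved.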

## Example configuration

Here's an example of a pipeline configuration:

{% tabs %}