diff --git a/administration/configuring-fluent-bit/yaml/configuration-file.md b/administration/configuring-fluent-bit/yaml/configuration-file.md
index 8edf5407e..682366b00 100644
--- a/administration/configuring-fluent-bit/yaml/configuration-file.md
+++ b/administration/configuring-fluent-bit/yaml/configuration-file.md
@@ -7,21 +7,15 @@ description: Learn about the YAML configuration file used by Fluent Bit
-One of the ways to configure Fluent Bit is using a YAML configuration file that works at a global scope.
+One of the ways to configure Fluent Bit is using a YAML configuration file that works at a global scope. These YAML configuration files support the following sections (see the sketch after this list):
-The YAML configuration file supports the following sections:
-
-- `Env`
-- `Includes`
-- `Service`
-- `Pipeline`
- - `Inputs`
- - `Filters`
- - `Outputs`
-
-The YAML configuration file doesn't support the following sections:
-
-- `Parsers`
+- `env`
+- `includes`
+- `service`
+- `pipeline`
+ - `inputs`
+ - `filters`
+ - `outputs`
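+
+Here's a minimal sketch of a YAML configuration file that uses each of these sections. The plugin names, file paths, and values are placeholders, not recommendations:
+
+```yaml
+# Illustrative sketch; plugin names, paths, and values are placeholders.
+env:
+  # Variables defined here can be referenced elsewhere with ${...}
+  flush_interval: 1
+
+includes:
+  # Additional YAML files merged into this configuration
+  - extra-pipeline.yaml
+
+service:
+  flush: ${flush_interval}
+  log_level: info
+
+pipeline:
+  inputs:
+    - name: random
+  filters:
+    - name: modify
+      match: '*'
+      add: hostname example-host
+  outputs:
+    - name: stdout
+      match: '*'
+```
+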
{% hint style="info" %}
YAML configuration is used in the smoke tests for containers. An always-correct up-to-date example is here: .
@@ -71,7 +65,7 @@ The `service` section defines the global properties of the service. The Service
| `dns.mode` | Sets the primary transport layer protocol used by the asynchronous DNS resolver, which can be overridden on a per plugin basis | `UDP` |
| `log_file` | Absolute path for an optional log file. By default, all logs are redirected to the standard error interface (`stderr`). | _none_ |
| `log_level` | Set the logging verbosity level. Allowed values are: `off`, `error`, `warn`, `info`, `debug`, and `trace`. Values are cumulative. For example, if `debug` is set, it also includes `error`, `warn`, and `info`. `trace` mode is only available if Fluent Bit was built with the `WITH_TRACE` option enabled. | `info` |
-| `parsers_file` | Path for a `parsers` configuration file. Only a single entry is supported. | _none_ |
+| `parsers_file` | Path for a file that defines custom parsers. Only a single entry is supported. | _none_ |
| `plugins_file` | Path for a `plugins` configuration file. A `plugins` configuration file allows the definition of paths for external plugins; for an example, [see here](https://github.com/fluent/fluent-bit/blob/master/conf/plugins.conf). | _none_ |
| `streams_file` | Path for the Stream Processor configuration file. Learn more about [Stream Processing configuration](../../../stream-processing/introduction.md). | _none_ |
| `http_server` | Enable built-in HTTP server. | `Off` |
diff --git a/administration/configuring-fluent-bit/yaml/pipeline-section.md b/administration/configuring-fluent-bit/yaml/pipeline-section.md
index 21a0f7381..73dd9374d 100644
--- a/administration/configuring-fluent-bit/yaml/pipeline-section.md
+++ b/administration/configuring-fluent-bit/yaml/pipeline-section.md
@@ -1,22 +1,21 @@
# Pipeline
-The `pipeline` section defines the flow of how data is collected, processed, and sent to its final destination. It encompasses the following core concepts:
+The `pipeline` section defines how data is collected, processed, and sent to its final destination. This section contains the following subsections:
| Name | Description |
| ---- | ----------- |
| `inputs` | Specifies the name of the plugin responsible for collecting or receiving data. This component serves as the data source in the pipeline. Examples of input plugins include `tail`, `http`, and `random`. |
-| `processors` | **Unique to YAML configuration**, processors are specialized plugins that handle data processing directly attached to input plugins. Unlike filters, processors aren't dependent on tag or matching rules. Instead, they work closely with the input to modify or enrich the data before it reaches the filtering or output stages. Processors are defined within an input plugin section. |
| `filters` | Filters are used to transform, enrich, or discard events based on specific criteria. They allow matching tags using strings or regular expressions, providing a more flexible way to manipulate data. Filters run as part of the main event loop and can be applied across multiple inputs and filters. Examples of filters include `modify`, `grep`, and `nest`. |
| `outputs` | Defines the destination for processed data. Outputs specify where the data will be sent, such as to a remote server, a file, or another service. Each output plugin is configured with matching rules to determine which events are sent to that destination. Common output plugins include `stdout`, `elasticsearch`, and `kafka`. |
-## Example configuration
-
{% hint style="info" %}
-**Note:** Processors can be enabled only by using the YAML configuration format. Classic mode configuration format doesn't support processors.
+Unlike filters, processors and parsers aren't defined in a unified section of the YAML configuration file and don't use tag matching. Instead, each input or output defined in the configuration file can have a `parsers` key and a `processors` key to configure the parsers and processors for that specific plugin (see the sketch after this note).
{% endhint %}
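+
+For instance, here's a minimal sketch of what attaching processors to a single input might look like. The `tail` input, its `path`, and the `content_modifier` processor settings are illustrative only:
+
+```yaml
+# Illustrative sketch; the input, path, and processor settings are placeholders.
+pipeline:
+  inputs:
+    - name: tail
+      path: /var/log/example.log
+      # Processors attach directly to this input and run before the data
+      # reaches the filtering or output stages
+      processors:
+        logs:
+          - name: content_modifier
+            action: insert
+            key: source
+            value: example
+  outputs:
+    - name: stdout
+      match: '*'
+```
+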
+## Example configuration
+
Here's an example of a pipeline configuration:
{% tabs %}