Description
Describe the question/issue
Currently the init script treats downloaded or local config files containing SQL stream configurations as normal files to include. As with parsers, this is the wrong approach. Although the container does not crash, @INCLUDE does not treat any [STREAM_TASK] section as usable configuration: stream files have to be passed to Fluent Bit as a startup argument or referenced via Streams_File in the [SERVICE] section.
Ideally the init Go script should treat files containing the string [STREAM_TASK] as an exception, similar to [PARSER], and reference them from the [SERVICE] section as a Streams_File entry.
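A minimal sketch of the proposed detection logic, assuming the init script scans file contents the way it already does for [PARSER] (the function name and return values here are illustrative, not the script's actual API):

```go
package main

import (
	"fmt"
	"strings"
)

// classifyConfigFile routes a downloaded config file by content:
// parser and stream definitions must be referenced from the [SERVICE]
// section, while everything else can safely go through @INCLUDE.
// This is a hypothetical sketch, not code from aws-for-fluent-bit.
func classifyConfigFile(content string) string {
	switch {
	case strings.Contains(content, "[PARSER]"):
		return "parsers_file" // already handled today via Parsers_File
	case strings.Contains(content, "[STREAM_TASK]"):
		return "streams_file" // proposed: reference via Streams_File
	default:
		return "include" // plain config, safe to @INCLUDE
	}
}

func main() {
	fmt.Println(classifyConfigFile("[STREAM_TASK]\nName s3_logs"))
	fmt.Println(classifyConfigFile("[FILTER]\nName modify"))
}
```

With this kind of check in place, a stream file would end up in the [SERVICE] section automatically instead of being silently ignored by @INCLUDE.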
Configuration
Original configuration before migration to S3 loading:
```
[SERVICE]
    Daemon Off
    Flush 5
    Log_Level error
    Parsers_File custom-parsers.conf
    Streams_File custom-streams.conf

[FILTER]
    name multiline
    Match *
    Buffer True
    multiline.key_content log
    mode partial_message

[FILTER]
    Name modify
    Match *
    Add system_name myserv

[FILTER]
    Name parser
    Match *
    Key_Name log
    Parser json_parser
    Preserve_Key On
    Reserve_Data True

[OUTPUT]
    Name s3
    Match logs.s3
    region ${AWS_REGION}
    bucket ${BUCKET_NAME}
    s3_key_format /${AWS_ACCOUNT_ID}/${AWS_REGION}/${SERVICE}/%Y/%m/%d/%H/$UUID.json
    canned_acl bucket-owner-full-control
    auto_retry_requests true
    upload_timeout ${UPLOAD_TIMEOUT}
    total_file_size ${TOTAL_FILE_SIZE}
    use_put_object On

[OUTPUT]
    Name datadog
    Match logs.datadog
    Host http-intake.logs.datadoghq.eu
    TLS on
    compress gzip
    apikey ${DD_API_KEY}
    dd_service ${DD_SERVICE}
    dd_source ${SOURCE}
    dd_message_key log
    dd_tags ${DD_TAG_LIST}
```
custom-streams.conf:
```
[STREAM_TASK]
    Name datadog_logs
    Exec CREATE STREAM datadog WITH (tag='logs.datadog') AS SELECT * from TAG:'*myserv*' WHERE extra['target'] = 'datadog';

[STREAM_TASK]
    Name s3_logs
    Exec CREATE STREAM s3 WITH (tag='logs.s3') AS SELECT * from TAG:'*myserv*' WHERE @record.contains(container_id) AND NOT @record.contains(extra['target']);
```
custom-parsers.conf:
```
[PARSER]
    Name docker
    Format json
    Time_Keep On
    Time_Key time
    Time_Format %Y-%m-%dT%H:%M:%S.%L

[PARSER]
    Name json
    Format json

[PARSER]
    Name json_parser
    Format json
    Time_Key time
    Time_Format %Y-%m-%dT%H:%M:%S.%L
    Time_Keep On
    Decode_Field escaped_utf8 log_processed do_next
    Decode_Field_As json log_processed
```
In the new setup I uploaded all the files to S3 and set the aws_fluent_bit_init_s3_1 through aws_fluent_bit_init_s3_3 variables to the full object ARNs.
The parsers file is correctly added to the command, but the streams file is only pulled in via @INCLUDE. I changed the main config file like this:
```
[SERVICE]
    Daemon Off
    Flush 5
    Log_Level error
    #Parsers_File custom-parsers.conf
    #Streams_File custom-streams.conf
...
```
The setup works when I uncomment Streams_File and point it at /init/fluent-bit-init-s3-files/custom-streams.conf. However, this is a guessing game: the path depends on where the init script happens to place downloaded files. Ideally the file should be handled by the init script itself.
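For reference, the workaround that currently works for me (assuming the init script keeps placing downloaded files under /init/fluent-bit-init-s3-files/, which is an implementation detail and may change) is:

```
[SERVICE]
    Daemon Off
    Flush 5
    Log_Level error
    Streams_File /init/fluent-bit-init-s3-files/custom-streams.conf
```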
Fluent Bit Version Info
AWS for FluentBit init-2.34.1
FluentBit v1.19
Cluster Details
ECS, sidecar deployment of FluentBit