Replies: 1 comment

> Any update?
Hi,
I have configured fluentd with the kafka output plugin. Fluentd's configuration is below:

```
<store ignore_error>
  <buffer topic>
    @type file
    @log_level debug
    path /data/fluentdlogs/logskafka
    timekey 1d
    flush_thread_count 4
    chunk_limit_size 2MB
    overflow_action drop_oldest_chunk
    flush_mode interval
    flush_interval 5s
    total_limit_size 2GB
    max_send_limit_bytes 5000000
  </buffer>
  @type kafka2
  @log_level debug
  get_kafka_client_log true
  brokers <ip>:<port>
  topic_key topic
  default_topic lalala
  <format>
    @type json
  </format>
  use_event_time false
  required_acks 1
  request_timeout 30s
  reconnect_on_error true
  reload_on_failure true
  reload_connections false
</store>
```

On the Kafka side I have set `message.max.bytes=2000000`.
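To compare these settings it helps to express them all in bytes. The sketch below is illustrative only: it assumes fluentd parses the `MB`/`GB` size suffixes as binary units (1 MB = 1024**2 bytes), and the `parse_size` helper is a hypothetical stand-in, not fluentd's actual parser.

```python
# Hypothetical helper: convert a size string like "2MB" to bytes,
# assuming binary units (1 KB = 1024 bytes), as fluentd does.
def parse_size(s: str) -> int:
    units = {"KB": 1024, "MB": 1024**2, "GB": 1024**3}
    for suffix, mult in units.items():
        if s.upper().endswith(suffix):
            return int(s[: -len(suffix)]) * mult
    return int(s)

chunk_limit_size = parse_size("2MB")   # fluentd buffer chunk limit
max_send_limit_bytes = 5_000_000       # kafka2 plugin setting above
message_max_bytes = 2_000_000          # broker-side message.max.bytes

# A chunk may legally grow up to chunk_limit_size, so a configuration
# is only safe if chunk_limit_size fits under the broker's limit.
print(chunk_limit_size)                       # 2097152
print(chunk_limit_size <= message_max_bytes)  # False: chunks can exceed the broker limit
```

With these assumptions, "2MB" is 2097152 bytes, which is already larger than the broker's 2000000-byte `message.max.bytes`.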
I get the below error in fluentd's logs:

```
2023-09-28 07:42:22 +0000 [warn]: #1 failed to flush the buffer. retry_times=1 next_retry_time=2023-09-28 07:42:24 +0000 chunk="60666729618260cf29737b0db4803708" error_class=Kafka::MessageSizeTooLarge error="Kafka::MessageSizeTooLarge"
```

but:

```
[td-agent@ztsl-bssc-fluentd-statefulset-2 /]$ ls -lRt /data/fluentdlogs/logskafka/ | grep 60666729618260cf29737b0db4803708
-rw-r--r--. 1 td-agent 2006 2043098 Sep 28 07:42 buffer.q60666729618260cf29737b0db4803708.log
-rw-r--r--. 1 td-agent 2006      84 Sep 28 07:42 buffer.q60666729618260cf29737b0db4803708.log.meta
```
It seems the chunk is too big for Kafka: the buffer file is 2043098 bytes, which is over `message.max.bytes` (2000000) even though it is under `chunk_limit_size` (2MB = 2097152 bytes). Why is the message rejected when `chunk_limit_size` is 2MB?
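The file size in the `ls` output above is consistent with this reading; a quick arithmetic check (assuming, as above, that fluentd's "2MB" means 2 * 1024**2 bytes):

```python
# Arithmetic check of the sizes quoted above (assumption: fluentd's "2MB"
# is parsed as binary megabytes, i.e. 2 * 1024**2 = 2097152 bytes).
chunk_limit_size = 2 * 1024**2   # fluentd chunk_limit_size 2MB
message_max_bytes = 2_000_000    # Kafka broker message.max.bytes
observed_chunk = 2_043_098       # buffer.q6066... file size from ls

# The chunk respects the fluentd limit...
print(observed_chunk <= chunk_limit_size)   # True
# ...but is still larger than what the broker accepts,
# matching the Kafka::MessageSizeTooLarge error.
print(observed_chunk > message_max_bytes)   # True
```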