Kafka input plugin skips reading from a single partition of an assigned topic #338

@patelprakashp

Description

Logstash information:

  1. Logstash version: 6.8.14
  2. Logstash installation source: Docker
  3. How is Logstash being run: Docker
  4. How was the Logstash plugin installed: Default
  5. Kafka input plugin version: 9.1.0
  6. Kafka client version: 2.3.0

I have Logstash running on Kubernetes, consuming events from 100+ topics. Each topic has 3+ partitions, so a single consumer group is subscribed to 100+ topics and consumes events from 500+ partitions. I am running 10 Logstash instances on Kubernetes, for a total of 100 consumer threads consuming from the 500+ partitions. Below is the Kafka input configuration.

auto_offset_reset => "earliest"
enable_auto_commit => "true"
group_id => "test_logstash"
consumer_threads => "10"
max_poll_records => "500"
heartbeat_interval_ms => "9000"
session_timeout_ms => "30000"
fetch_max_bytes => "10485760"
max_partition_fetch_bytes => "524288"
client_id => "test_logstash"
decorate_events => true
partition_assignment_strategy => "org.apache.kafka.clients.consumer.RoundRobinAssignor"
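
For reference, a minimal sketch of how these settings would sit inside a complete kafka input block. The `bootstrap_servers` value and the `topics` list are placeholders added for illustration; they are not values from the original report.

```
input {
  kafka {
    # Placeholder broker list and topics; substitute the real values.
    bootstrap_servers => "kafka-broker-1:9092,kafka-broker-2:9092"
    topics => ["topic-a", "topic-b"]

    auto_offset_reset => "earliest"
    enable_auto_commit => "true"
    group_id => "test_logstash"
    consumer_threads => "10"
    max_poll_records => "500"
    heartbeat_interval_ms => "9000"
    session_timeout_ms => "30000"
    fetch_max_bytes => "10485760"
    max_partition_fetch_bytes => "524288"
    client_id => "test_logstash"
    decorate_events => true
    partition_assignment_strategy => "org.apache.kafka.clients.consumer.RoundRobinAssignor"
  }
}
```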

The Kafka consumer stops reading data from a single partition, even though that partition is still subscribed to and assigned to the consumer group.
