
Excessive Kafka client logging in WebSphere Application Server with OpenTelemetry Java Agent #13833

@tinnapat

Description

After instrumenting our application running on WebSphere Application Server with the OpenTelemetry Java Agent, we're experiencing excessive logging related to Kafka clients. These logs are generated continuously (multiple times per second), quickly filling up disk space and overwhelming other important log information in the SystemOut.log file.

The logs appear to come from the Kafka client instrumentation, showing repeated consumer create/configure/close cycles; a possible logging-level workaround is sketched after the list below. This is causing:

  1. Rapid disk space consumption
  2. Difficulty in finding relevant application logs
  3. Potential performance impact due to excessive I/O operations
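
As a possible interim workaround (not yet verified in our environment), raising the log level of the org.apache.kafka loggers to WARN should suppress the repeated INFO-level config dumps while still surfacing warnings and errors. A minimal sketch, assuming the Kafka client's SLF4J output is routed through java.util.logging on this WebSphere setup (the class name and the point at which apply() is called are hypothetical):

    import java.util.logging.Level;
    import java.util.logging.Logger;

    // Possible workaround sketch (assumes Kafka's SLF4J logging is bound to java.util.logging):
    // raise the org.apache.kafka logger threshold to WARNING so the per-consumer INFO
    // config dumps no longer reach SystemOut.log.
    public final class KafkaClientLogLevelWorkaround {

        // Hold a strong reference so the JUL LogManager does not garbage-collect the
        // logger and lose the level override.
        private static final Logger KAFKA_LOGGER = Logger.getLogger("org.apache.kafka");

        private KafkaClientLogLevelWorkaround() {}

        public static void apply() {
            KAFKA_LOGGER.setLevel(Level.WARNING);
        }
    }

If the application routes its logging through Log4j instead, the equivalent change would be a log4j.logger.org.apache.kafka=WARN entry in the Log4j configuration.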

Environment

  • Application Server: IBM WebSphere Application Server 9.0.5.17
  • OpenTelemetry Java Agent Version: 2.15.0
  • Java Version: 1.8.0_381
  • OS: Red Hat Linux 8
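
For reference, the agent is attached via the server's generic JVM arguments, roughly as follows (the jar path and service name are placeholders, not the actual values used):

    -javaagent:/opt/opentelemetry/opentelemetry-javaagent.jar
    -Dotel.service.name=<service-name>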

Log Sample

[5/2/25 15:05:25:720 ICT] 0000021b SystemOut     O  WARN | [Consumer clientId=consumer-decision_monitoring-112429, groupId=decision_monitoring] Error while fetching metadata with correlation id 72 : {decision_monitoring=UNKNOWN_TOPIC_OR_PARTITION}
[5/2/25 15:05:25:759 ICT] 000002ee SystemOut     O  WARN | The configuration 'auto.commit.interval.ms' was supplied but isn't a known config.
[5/2/25 15:05:25:760 ICT] 000002ee SystemOut     O  INFO | Kafka version: 3.1.0
[5/2/25 15:05:25:760 ICT] 000002ee SystemOut     O  INFO | Kafka commitId: 37edeed0777bacb3
[5/2/25 15:05:25:760 ICT] 000002ee SystemOut     O  INFO | Kafka startTimeMs: 1746173125759
[5/2/25 15:05:25:760 ICT] 000002ee SystemOut     O  INFO | [Consumer clientId=consumer-null-112607, groupId=null] Subscribed to partition(s): PYBIBATCHINDEXPROCESSOR-3
[5/2/25 15:05:25:760 ICT] 000002ee SystemOut     O  INFO | [Consumer clientId=consumer-null-112607, groupId=null] Seeking to offset 0 for partition PYBIBATCHINDEXPROCESSOR-3
[5/2/25 15:05:25:760 ICT] 000002e0 SystemOut     O  INFO | Metrics scheduler closed
[5/2/25 15:05:25:760 ICT] 000002e0 SystemOut     O  INFO | Closing reporter io.opentelemetry.javaagent.shaded.instrumentation.kafkaclients.common.v0_11.internal.OpenTelemetryMetricsReporter
[5/2/25 15:05:25:760 ICT] 000002e0 SystemOut     O  INFO | Closing reporter org.apache.kafka.common.metrics.JmxReporter
[5/2/25 15:05:25:761 ICT] 000002e0 SystemOut     O  INFO | Metrics reporters closed
[5/2/25 15:05:25:762 ICT] 000002da SystemOut     O  INFO | Metrics scheduler closed
[5/2/25 15:05:25:762 ICT] 000002da SystemOut     O  INFO | Closing reporter io.opentelemetry.javaagent.shaded.instrumentation.kafkaclients.common.v0_11.internal.OpenTelemetryMetricsReporter
[5/2/25 15:05:25:762 ICT] 000002da SystemOut     O  INFO | Closing reporter org.apache.kafka.common.metrics.JmxReporter
[5/2/25 15:05:25:762 ICT] 000002da SystemOut     O  INFO | Metrics reporters closed
[5/2/25 15:05:25:763 ICT] 000002da SystemOut     O  INFO | App info kafka.consumer for consumer-null-112583 unregistered
[5/2/25 15:05:25:764 ICT] 000002da SystemOut     O  INFO | ConsumerConfig values:
        allow.auto.create.topics = true
        auto.commit.interval.ms = 1000
        auto.offset.reset = earliest
        bootstrap.servers = [10.225.100.77:9092, 10.225.100.76:9092]
        check.crcs = true
        client.dns.lookup = use_all_dns_ips
        client.id = consumer-null-112608
        client.rack =
        connections.max.idle.ms = 540000
        default.api.timeout.ms = 60000
        enable.auto.commit = false
        exclude.internal.topics = true
        fetch.max.bytes = 52428800
        fetch.max.wait.ms = 500
        fetch.min.bytes = 1
        group.id = null
        group.instance.id = null
        heartbeat.interval.ms = 3000
        interceptor.classes = []
        internal.leave.group.on.close = true
        internal.throw.on.fetch.stable.offset.unsupported = false
        isolation.level = read_uncommitted
        key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
        max.partition.fetch.bytes = 1048576
        max.poll.interval.ms = 300000
        max.poll.records = 500
        metadata.max.age.ms = 300000
        metric.reporters = [class io.opentelemetry.javaagent.shaded.instrumentation.kafkaclients.common.v0_11.internal.OpenTelemetryMetricsReporter]
        metrics.num.samples = 2
        metrics.recording.level = INFO
        metrics.sample.window.ms = 30000
        partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor, class org.apache.kafka.clients.consumer.CooperativeStickyAssignor]
        receive.buffer.bytes = 65536
        reconnect.backoff.max.ms = 1000
        reconnect.backoff.ms = 50
        request.timeout.ms = 30000
        retry.backoff.ms = 100
        sasl.client.callback.handler.class = null
        sasl.jaas.config = null
        sasl.kerberos.kinit.cmd = /usr/bin/kinit
        sasl.kerberos.min.time.before.relogin = 60000
        sasl.kerberos.service.name = null
        sasl.kerberos.ticket.renew.jitter = 0.05
        sasl.kerberos.ticket.renew.window.factor = 0.8
        sasl.login.callback.handler.class = null
        sasl.login.class = null
        sasl.login.connect.timeout.ms = null
        sasl.login.read.timeout.ms = null
        sasl.login.refresh.buffer.seconds = 300
        sasl.login.refresh.min.period.seconds = 60
        sasl.login.refresh.window.factor = 0.8
        sasl.login.refresh.window.jitter = 0.05
        sasl.login.retry.backoff.max.ms = 10000
        sasl.login.retry.backoff.ms = 100
        sasl.mechanism = GSSAPI
        sasl.oauthbearer.clock.skew.seconds = 30
        sasl.oauthbearer.expected.audience = null
        sasl.oauthbearer.expected.issuer = null
        sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
        sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
        sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
        sasl.oauthbearer.jwks.endpoint.url = null
        sasl.oauthbearer.scope.claim.name = scope
        sasl.oauthbearer.sub.claim.name = sub
        sasl.oauthbearer.token.endpoint.url = null
        security.protocol = PLAINTEXT
        security.providers = null
        send.buffer.bytes = 131072
        session.timeout.ms = 30000
        socket.connection.setup.timeout.max.ms = 30000
        socket.connection.setup.timeout.ms = 10000
        ssl.cipher.suites = null
        ssl.enabled.protocols = [TLSv1.2]
        ssl.endpoint.identification.algorithm = https
        ssl.engine.factory.class = null
        ssl.key.password = null
        ssl.keymanager.algorithm = IbmX509
        ssl.keystore.certificate.chain = null
        ssl.keystore.key = null
        ssl.keystore.location = null
        ssl.keystore.password = null
        ssl.keystore.type = JKS
        ssl.protocol = TLSv1.2
        ssl.provider = null
        ssl.secure.random.implementation = null
        ssl.trustmanager.algorithm = PKIX
        ssl.truststore.certificates = null
        ssl.truststore.location = null
        ssl.truststore.password = null
        ssl.truststore.type = JKS
        value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer

[5/2/25 15:05:25:765 ICT] 000002da SystemOut     O  WARN | The configuration 'auto.commit.interval.ms' was supplied but isn't a known config.
[5/2/25 15:05:25:766 ICT] 000002da SystemOut     O  INFO | Kafka version: 3.1.0
[5/2/25 15:05:25:766 ICT] 000002da SystemOut     O  INFO | Kafka commitId: 37edeed0777bacb3
[5/2/25 15:05:25:766 ICT] 000002da SystemOut     O  INFO | Kafka startTimeMs: 1746173125765
[5/2/25 15:05:25:766 ICT] 000002da SystemOut     O  INFO | [Consumer clientId=consumer-null-112608, groupId=null] Subscribed to partition(s): PXINTERACTIONAGGREGATOR-4
[5/2/25 15:05:25:766 ICT] 000002da SystemOut     O  INFO | [Consumer clientId=consumer-null-112608, groupId=null] Seeking to offset 0 for partition PXINTERACTIONAGGREGATOR-4
[5/2/25 15:05:25:771 ICT] 000002e0 SystemOut     O  INFO | App info kafka.consumer for consumer-null-112581 unregistered
[5/2/25 15:05:25:772 ICT] 000002e0 SystemOut     O  INFO | ConsumerConfig values:
        allow.auto.create.topics = true
        auto.commit.interval.ms = 1000
        auto.offset.reset = earliest
        bootstrap.servers = [10.225.100.77:9092, 10.225.100.76:9092]
        check.crcs = true
        client.dns.lookup = use_all_dns_ips
        client.id = consumer-null-112609
        client.rack =
        connections.max.idle.ms = 540000
        default.api.timeout.ms = 60000
        enable.auto.commit = false
        exclude.internal.topics = true
        fetch.max.bytes = 52428800
        fetch.max.wait.ms = 500
        fetch.min.bytes = 1
        group.id = null
        group.instance.id = null
        heartbeat.interval.ms = 3000
        interceptor.classes = []
        internal.leave.group.on.close = true
        internal.throw.on.fetch.stable.offset.unsupported = false
        isolation.level = read_uncommitted
        key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
        max.partition.fetch.bytes = 1048576
        max.poll.interval.ms = 300000
        max.poll.records = 500
        metadata.max.age.ms = 300000
        metric.reporters = [class io.opentelemetry.javaagent.shaded.instrumentation.kafkaclients.common.v0_11.internal.OpenTelemetryMetricsReporter]
        metrics.num.samples = 2
        metrics.recording.level = INFO
        metrics.sample.window.ms = 30000
        partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor, class org.apache.kafka.clients.consumer.CooperativeStickyAssignor]
        receive.buffer.bytes = 65536
        reconnect.backoff.max.ms = 1000
        reconnect.backoff.ms = 50
        request.timeout.ms = 30000
        retry.backoff.ms = 100
        sasl.client.callback.handler.class = null
        sasl.jaas.config = null
        sasl.kerberos.kinit.cmd = /usr/bin/kinit
        sasl.kerberos.min.time.before.relogin = 60000
        sasl.kerberos.service.name = null
        sasl.kerberos.ticket.renew.jitter = 0.05
        sasl.kerberos.ticket.renew.window.factor = 0.8
        sasl.login.callback.handler.class = null
        sasl.login.class = null
        sasl.login.connect.timeout.ms = null
        sasl.login.read.timeout.ms = null
        sasl.login.refresh.buffer.seconds = 300
        sasl.login.refresh.min.period.seconds = 60
        sasl.login.refresh.window.factor = 0.8
        sasl.login.refresh.window.jitter = 0.05
        sasl.login.retry.backoff.max.ms = 10000
        sasl.login.retry.backoff.ms = 100
        sasl.mechanism = GSSAPI
        sasl.oauthbearer.clock.skew.seconds = 30
        sasl.oauthbearer.expected.audience = null
        sasl.oauthbearer.expected.issuer = null
        sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
        sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
        sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
        sasl.oauthbearer.jwks.endpoint.url = null
        sasl.oauthbearer.scope.claim.name = scope
        sasl.oauthbearer.sub.claim.name = sub
        sasl.oauthbearer.token.endpoint.url = null
        security.protocol = PLAINTEXT
        security.providers = null
        send.buffer.bytes = 131072
        session.timeout.ms = 30000
        socket.connection.setup.timeout.max.ms = 30000
        socket.connection.setup.timeout.ms = 10000
        ssl.cipher.suites = null
        ssl.enabled.protocols = [TLSv1.2]
        ssl.endpoint.identification.algorithm = https
        ssl.engine.factory.class = null
        ssl.key.password = null
        ssl.keymanager.algorithm = IbmX509
        ssl.keystore.certificate.chain = null
        ssl.keystore.key = null
        ssl.keystore.location = null
        ssl.keystore.password = null
        ssl.keystore.type = JKS
        ssl.protocol = TLSv1.2
        ssl.provider = null
        ssl.secure.random.implementation = null
        ssl.trustmanager.algorithm = PKIX
        ssl.truststore.certificates = null
        ssl.truststore.location = null
        ssl.truststore.password = null
        ssl.truststore.type = JKS
        value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer

[5/2/25 15:05:25:774 ICT] 000002e0 SystemOut     O  WARN | The configuration 'auto.commit.interval.ms' was supplied but isn't a known config.
[5/2/25 15:05:25:774 ICT] 000002e0 SystemOut     O  INFO | Kafka version: 3.1.0
[5/2/25 15:05:25:774 ICT] 000002e0 SystemOut     O  INFO | Kafka commitId: 37edeed0777bacb3
[5/2/25 15:05:25:774 ICT] 000002e0 SystemOut     O  INFO | Kafka startTimeMs: 1746173125774
[5/2/25 15:05:25:775 ICT] 000002e0 SystemOut     O  INFO | [Consumer clientId=consumer-null-112609, groupId=null] Subscribed to partition(s): PYSASBATCHINDEXCLASSESPROCESSOR-1, PYSASBATCHINDEXCLASSESPROCESSOR-5
[5/2/25 15:05:25:775 ICT] 000002e0 SystemOut     O  INFO | [Consumer clientId=consumer-null-112609, groupId=null] Seeking to offset 0 for partition PYSASBATCHINDEXCLASSESPROCESSOR-1
[5/2/25 15:05:25:775 ICT] 000002e0 SystemOut     O  INFO | [Consumer clientId=consumer-null-112609, groupId=null] Seeking to offset 0 for partition PYSASBATCHINDEXCLASSESPROCESSOR-5
[5/2/25 15:05:25:804 ICT] 000002ee SystemOut     O  INFO | [Consumer clientId=consumer-null-112607, groupId=null] Cluster ID: fb3aH165RHylsZv94gcO7Q
[5/2/25 15:05:25:805 ICT] 000002e0 SystemOut     O  INFO | [Consumer clientId=consumer-null-112609, groupId=null] Cluster ID: fb3aH165RHylsZv94gcO7Q
[5/2/25 15:05:25:819 ICT] 000002e3 SystemOut     O  INFO | Metrics scheduler closed
[5/2/25 15:05:25:820 ICT] 000002e3 SystemOut     O  INFO | Closing reporter io.opentelemetry.javaagent.shaded.instrumentation.kafkaclients.common.v0_11.internal.OpenTelemetryMetricsReporter
[5/2/25 15:05:25:820 ICT] 000002e3 SystemOut     O  INFO | Closing reporter org.apache.kafka.common.metrics.JmxReporter
[5/2/25 15:05:25:820 ICT] 000002e3 SystemOut     O  INFO | Metrics reporters closed
[5/2/25 15:05:25:821 ICT] 000002e3 SystemOut     O  INFO | App info kafka.consumer for consumer-null-112585 unregistered
[5/2/25 15:05:25:822 ICT] 000002e3 SystemOut     O  INFO | ConsumerConfig values:
        allow.auto.create.topics = true
        auto.commit.interval.ms = 1000
        auto.offset.reset = earliest
        bootstrap.servers = [10.225.100.77:9092, 10.225.100.76:9092]
        check.crcs = true
        client.dns.lookup = use_all_dns_ips
        client.id = consumer-null-112610
        client.rack =
        connections.max.idle.ms = 540000
        default.api.timeout.ms = 60000
        enable.auto.commit = false
        exclude.internal.topics = true
        fetch.max.bytes = 52428800
        fetch.max.wait.ms = 500
        fetch.min.bytes = 1
        group.id = null
        group.instance.id = null
        heartbeat.interval.ms = 3000
        interceptor.classes = []
        internal.leave.group.on.close = true
        internal.throw.on.fetch.stable.offset.unsupported = false
        isolation.level = read_uncommitted
        key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
        max.partition.fetch.bytes = 1048576
        max.poll.interval.ms = 300000
        max.poll.records = 500
        metadata.max.age.ms = 300000
        metric.reporters = [class io.opentelemetry.javaagent.shaded.instrumentation.kafkaclients.common.v0_11.internal.OpenTelemetryMetricsReporter]
        metrics.num.samples = 2
        metrics.recording.level = INFO
        metrics.sample.window.ms = 30000
        partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor, class org.apache.kafka.clients.consumer.CooperativeStickyAssignor]
        receive.buffer.bytes = 65536
        reconnect.backoff.max.ms = 1000
        reconnect.backoff.ms = 50
        request.timeout.ms = 30000
        retry.backoff.ms = 100
        sasl.client.callback.handler.class = null
        sasl.jaas.config = null
        sasl.kerberos.kinit.cmd = /usr/bin/kinit
        sasl.kerberos.min.time.before.relogin = 60000
        sasl.kerberos.service.name = null
        sasl.kerberos.ticket.renew.jitter = 0.05
        sasl.kerberos.ticket.renew.window.factor = 0.8
        sasl.login.callback.handler.class = null
        sasl.login.class = null
        sasl.login.connect.timeout.ms = null
        sasl.login.read.timeout.ms = null
        sasl.login.refresh.buffer.seconds = 300
        sasl.login.refresh.min.period.seconds = 60
        sasl.login.refresh.window.factor = 0.8
        sasl.login.refresh.window.jitter = 0.05
        sasl.login.retry.backoff.max.ms = 10000
        sasl.login.retry.backoff.ms = 100
        sasl.mechanism = GSSAPI
        sasl.oauthbearer.clock.skew.seconds = 30
        sasl.oauthbearer.expected.audience = null
        sasl.oauthbearer.expected.issuer = null
        sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
        sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
        sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
        sasl.oauthbearer.jwks.endpoint.url = null
        sasl.oauthbearer.scope.claim.name = scope
        sasl.oauthbearer.sub.claim.name = sub
        sasl.oauthbearer.token.endpoint.url = null
        security.protocol = PLAINTEXT
        security.providers = null
        send.buffer.bytes = 131072
        session.timeout.ms = 30000
        socket.connection.setup.timeout.max.ms = 30000
        socket.connection.setup.timeout.ms = 10000
        ssl.cipher.suites = null
        ssl.enabled.protocols = [TLSv1.2]
        ssl.endpoint.identification.algorithm = https
        ssl.engine.factory.class = null
        ssl.key.password = null
        ssl.keymanager.algorithm = IbmX509
        ssl.keystore.certificate.chain = null
        ssl.keystore.key = null
        ssl.keystore.location = null
        ssl.keystore.password = null
        ssl.keystore.type = JKS
        ssl.protocol = TLSv1.2
        ssl.provider = null
        ssl.secure.random.implementation = null
        ssl.trustmanager.algorithm = PKIX
        ssl.truststore.certificates = null
        ssl.truststore.location = null
        ssl.truststore.password = null
        ssl.truststore.type = JKS
        value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer

[5/2/25 15:05:25:824 ICT] 000002e3 SystemOut     O  WARN | The configuration 'auto.commit.interval.ms' was supplied but isn't a known config.
[5/2/25 15:05:25:825 ICT] 000002e3 SystemOut     O  INFO | Kafka version: 3.1.0
[5/2/25 15:05:25:825 ICT] 000002e3 SystemOut     O  INFO | Kafka commitId: 37edeed0777bacb3
[5/2/25 15:05:25:825 ICT] 000002e3 SystemOut     O  INFO | Kafka startTimeMs: 1746173125824
[5/2/25 15:05:25:825 ICT] 000002e3 SystemOut     O  INFO | [Consumer clientId=consumer-null-112610, groupId=null] Subscribed to partition(s): PYNLPREPORTINGDATAPROCESSOR-3, PYNLPREPORTINGDATAPROCESSOR-5
[5/2/25 15:05:25:825 ICT] 000002e3 SystemOut     O  INFO | [Consumer clientId=consumer-null-112610, groupId=null] Seeking to offset 0 for partition PYNLPREPORTINGDATAPROCESSOR-3
[5/2/25 15:05:25:825 ICT] 000002e3 SystemOut     O  INFO | [Consumer clientId=consumer-null-112610, groupId=null] Seeking to offset 0 for partition PYNLPREPORTINGDATAPROCESSOR-5
[5/2/25 15:05:25:827 ICT] 000002e1 SystemOut     O  INFO | Metrics scheduler closed
[5/2/25 15:05:25:827 ICT] 000002e1 SystemOut     O  INFO | Closing reporter io.opentelemetry.javaagent.shaded.instrumentation.kafkaclients.common.v0_11.internal.OpenTelemetryMetricsReporter
[5/2/25 15:05:25:827 ICT] 000002e1 SystemOut     O  INFO | Closing reporter org.apache.kafka.common.metrics.JmxReporter
[5/2/25 15:05:25:827 ICT] 000002e1 SystemOut     O  INFO | Metrics reporters closed
[5/2/25 15:05:25:829 ICT] 000002e1 SystemOut     O  INFO | App info kafka.consumer for consumer-null-112584 unregistered
[5/2/25 15:05:25:830 ICT] 000002e1 SystemOut     O  INFO | ConsumerConfig values:
        allow.auto.create.topics = true
        auto.commit.interval.ms = 1000
        auto.offset.reset = earliest
        bootstrap.servers = [10.225.100.77:9092, 10.225.100.76:9092]
        check.crcs = true
        client.dns.lookup = use_all_dns_ips
        client.id = consumer-null-112611
        client.rack =
        connections.max.idle.ms = 540000
        default.api.timeout.ms = 60000
        enable.auto.commit = false
        exclude.internal.topics = true
        fetch.max.bytes = 52428800
        fetch.max.wait.ms = 500
        fetch.min.bytes = 1
        group.id = null
        group.instance.id = null
        heartbeat.interval.ms = 3000
        interceptor.classes = []
        internal.leave.group.on.close = true
        internal.throw.on.fetch.stable.offset.unsupported = false
        isolation.level = read_uncommitted
        key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
        max.partition.fetch.bytes = 1048576
        max.poll.interval.ms = 300000
        max.poll.records = 500
        metadata.max.age.ms = 300000
        metric.reporters = [class io.opentelemetry.javaagent.shaded.instrumentation.kafkaclients.common.v0_11.internal.OpenTelemetryMetricsReporter]
        metrics.num.samples = 2
        metrics.recording.level = INFO
        metrics.sample.window.ms = 30000
        partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor, class org.apache.kafka.clients.consumer.CooperativeStickyAssignor]
        receive.buffer.bytes = 65536
        reconnect.backoff.max.ms = 1000
        reconnect.backoff.ms = 50
        request.timeout.ms = 30000
        retry.backoff.ms = 100
        sasl.client.callback.handler.class = null
        sasl.jaas.config = null
        sasl.kerberos.kinit.cmd = /usr/bin/kinit
        sasl.kerberos.min.time.before.relogin = 60000
        sasl.kerberos.service.name = null
        sasl.kerberos.ticket.renew.jitter = 0.05
        sasl.kerberos.ticket.renew.window.factor = 0.8
        sasl.login.callback.handler.class = null
        sasl.login.class = null
        sasl.login.connect.timeout.ms = null
        sasl.login.read.timeout.ms = null
        sasl.login.refresh.buffer.seconds = 300
        sasl.login.refresh.min.period.seconds = 60
        sasl.login.refresh.window.factor = 0.8
        sasl.login.refresh.window.jitter = 0.05
        sasl.login.retry.backoff.max.ms = 10000
        sasl.login.retry.backoff.ms = 100
        sasl.mechanism = GSSAPI
        sasl.oauthbearer.clock.skew.seconds = 30
        sasl.oauthbearer.expected.audience = null
        sasl.oauthbearer.expected.issuer = null
        sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
        sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
        sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
        sasl.oauthbearer.jwks.endpoint.url = null
        sasl.oauthbearer.scope.claim.name = scope
        sasl.oauthbearer.sub.claim.name = sub
        sasl.oauthbearer.token.endpoint.url = null
        security.protocol = PLAINTEXT
        security.providers = null
        send.buffer.bytes = 131072
        session.timeout.ms = 30000
        socket.connection.setup.timeout.max.ms = 30000
        socket.connection.setup.timeout.ms = 10000
        ssl.cipher.suites = null
        ssl.enabled.protocols = [TLSv1.2]
        ssl.endpoint.identification.algorithm = https
        ssl.engine.factory.class = null
        ssl.key.password = null
        ssl.keymanager.algorithm = IbmX509
        ssl.keystore.certificate.chain = null
        ssl.keystore.key = null
        ssl.keystore.location = null
        ssl.keystore.password = null
        ssl.keystore.type = JKS
        ssl.protocol = TLSv1.2
        ssl.provider = null
        ssl.secure.random.implementation = null
        ssl.trustmanager.algorithm = PKIX
        ssl.truststore.certificates = null
        ssl.truststore.location = null
        ssl.truststore.password = null
        ssl.truststore.type = JKS
        value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer

[5/2/25 15:05:25:831 ICT] 000002e1 SystemOut     O  WARN | The configuration 'auto.commit.interval.ms' was supplied but isn't a known config.
[5/2/25 15:05:25:832 ICT] 000002e1 SystemOut     O  INFO | Kafka version: 3.1.0
[5/2/25 15:05:25:832 ICT] 000002e1 SystemOut     O  INFO | Kafka commitId: 37edeed0777bacb3
