
max_buffer_bytesize for async_producer #913

@alan-sapaad

Description

I am able to set `max_buffer_bytesize` for a synchronous producer, and as expected I receive the error `Kafka::BufferOverflow: Cannot produce to topic, max buffer bytesize (960000 bytes) reached`.

But I need to use an `async_producer` with `delivery_interval` set to 50 ms (0.05). I've set `max_buffer_bytesize` when initializing the producer, but I can still push as many messages as I like onto the queue, and the queue size keeps growing every second (even when I don't produce any messages). I am guessing it must be the `[:deliver_message, nil]` packets that are being pushed onto the queue every 50 ms.
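For reference, the setup described above looks roughly like this. This is a minimal configuration sketch using ruby-kafka's documented options; the broker address and topic name are placeholders, and it needs a live broker to actually run:

```ruby
require "kafka"

# Placeholder broker address.
kafka = Kafka.new(["kafka://broker:9092"])

producer = kafka.async_producer(
  delivery_interval: 0.05,      # flush every 50 ms
  max_buffer_bytesize: 960_000  # intended cap on buffered bytes
)

# Messages are enqueued immediately; the cap above is checked by the
# underlying sync producer, not at enqueue time.
producer.produce("payload", topic: "events")
```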

I am not able to receive any errors from the underlying sync producer. How do I make sure the request size stays under 1 MB? I am using Heroku Kafka, which will not accept any request over 1 MB. I want `async_producer.produce` to error out if the buffer size would be more than 950 KB with the current message added.
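One possible workaround, until the queue itself enforces a byte limit: track the pending byte count yourself and refuse to enqueue when the next message would push it past the cap. A minimal sketch (the `BufferByteGuard` class and its method names are hypothetical, not part of the ruby-kafka API):

```ruby
# Hypothetical guard that tracks bytes of messages handed to the async
# producer but not yet delivered, raising before the cap is exceeded.
class BufferByteGuard
  class OverflowError < StandardError; end

  def initialize(max_bytesize:)
    @max_bytesize = max_bytesize
    @pending_bytes = 0
    @mutex = Mutex.new
  end

  # Call before async_producer.produce; raises if the message won't fit.
  def reserve(message)
    @mutex.synchronize do
      size = message.bytesize
      if @pending_bytes + size > @max_bytesize
        raise OverflowError,
              "pending #{@pending_bytes} + #{size} bytes exceeds #{@max_bytesize}"
      end
      @pending_bytes += size
    end
  end

  # Call once the message has been delivered (e.g. from instrumentation).
  def release(message)
    @mutex.synchronize { @pending_bytes -= message.bytesize }
  end

  def pending_bytes
    @mutex.synchronize { @pending_bytes }
  end
end
```

You would call `reserve` before every `produce` and `release` after delivery, for instance from a `deliver_messages.producer.kafka` ActiveSupport notification subscriber; the exact hook depends on your setup.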

Thanks!
