bug: try_build_batch miscalculates serialized size #1746

@Oppen

Description

From cantina#40. Description transcript:

try_build_batch() selects the entries to include in the next batch uploaded to S3 by iterating over the batch queue (from lowest to highest fee) and removing entries until the batch satisfies certain conditions. The issue is that the check of batch_size against max_batch_byte_size only accounts for the sum of the individually CBOR-serialized entries, while the final batch uploaded to S3 is serialized as a vector of VerificationData in finalize_batch(). A CBOR-serialized vector/array is always larger than the sum of its individually serialized elements, because the length of the array/vector is prepended to the serialized entries.

If the sum of the valid proofs' sizes is exactly max_batch_byte_size, the batcher would submit a batch that exceeds max_batch_byte_size as specified in config-files/*.yaml. Such a batch would be consistently rejected by operators running with the default config.
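One possible fix direction, sketched below with hypothetical helpers (the names and structure are assumptions for illustration, not code from the repository), is to include the size of the CBOR array head for the prospective batch length when checking against max_batch_byte_size:

```python
def cbor_array_header_len(n_entries: int) -> int:
    """Size in bytes of the CBOR array head (major type 4) for n_entries
    elements, per the RFC 8949 head encoding."""
    if n_entries < 24:
        return 1  # length packed into the head byte
    if n_entries < 0x100:
        return 2  # head byte + 1-byte length
    if n_entries < 0x10000:
        return 3  # head byte + 2-byte length
    if n_entries < 0x1_0000_0000:
        return 5  # head byte + 4-byte length
    return 9      # head byte + 8-byte length

def batch_fits(entry_sizes: list[int], max_batch_byte_size: int) -> bool:
    """Hypothetical size check: sum of the individually serialized entries
    plus the array-head overhead the final serialization will add."""
    total = sum(entry_sizes) + cbor_array_header_len(len(entry_sizes))
    return total <= max_batch_byte_size
```

With a check like this, a set of entries whose individual sizes sum to exactly max_batch_byte_size is correctly rejected, since the array head adds at least one byte.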

To understand the root cause of the issue, here is a simplified example of CBOR encoding (you can try it at https://cbor.me):

42   -> CBOR: 18 2A
[42] -> CBOR: 81 18 2A

The 81 byte is prepended to indicate an array of length 1. Our issue is that try_build_batch() computes the total size of the batch as in the first case, while the batch uploaded to S3 is serialized as in the second.
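The discrepancy can be reproduced with a minimal hand-rolled encoder for the two relevant CBOR heads (major type 0 for unsigned integers, major type 4 for arrays, per RFC 8949); this is an illustrative sketch, not code from the batcher:

```python
def cbor_uint(n: int) -> bytes:
    """Encode an unsigned integer (CBOR major type 0). Widths beyond
    one payload byte are omitted in this sketch."""
    if n < 24:
        return bytes([n])        # value packed into the head byte
    if n < 0x100:
        return bytes([0x18, n])  # head 0x18, then one payload byte
    raise NotImplementedError("larger widths omitted in this sketch")

def cbor_array_header(length: int) -> bytes:
    """Encode an array head (CBOR major type 4) for `length` elements."""
    if length < 24:
        return bytes([0x80 | length])
    if length < 0x100:
        return bytes([0x98, length])
    if length < 0x10000:
        return bytes([0x99, *length.to_bytes(2, "big")])
    raise NotImplementedError("larger widths omitted in this sketch")

entries = [cbor_uint(42)]                      # individually serialized entries
sum_of_entries = sum(len(e) for e in entries)  # what the size check measures: 2 bytes (18 2A)
batch = cbor_array_header(len(entries)) + b"".join(entries)
# what actually gets uploaded: 3 bytes (81 18 2A), one byte larger
```

Any batch serialized as an array therefore carries at least one extra byte of overhead beyond the sum of its entries, and the overhead grows with the entry count.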

Metadata

Labels

audit · batcher (issues within aligned-batcher) · cantina (Audit report from Cantina)
