
fix(ssz): Merkleizer list Feed now uses correct chunk limit and actual length for mixin #10795

Open
kevaundray wants to merge 2 commits into NethermindEth:master from kevaundray:kw/merkleizer-fix

Conversation


@kevaundray kevaundray commented Mar 12, 2026


I used List[uint8, 64] containing [0xAB] as a minimal example.

Here are the manual steps to make sure I didn't miss anything:

1) Pack

This is where we pack the data into bytes. The data is 0xAB; it is already a byte, so this step is a no-op.

2) Chunk data

A chunk/leaf is 32 bytes. We now split our data into 32-byte chunks; here that gives a single leaf.

We get a single leaf that looks like: [AB 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00]

It's essentially just our byte, zero-padded.
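The pack-and-chunk step can be sketched in Python, following the SSZ spec's pseudocode style (a minimal sketch for the non-empty case, not the C# code this PR touches):

```python
BYTES_PER_CHUNK = 32

def pack_and_chunk(data: bytes) -> list[bytes]:
    # Right-pad the packed bytes with zeroes up to a multiple of 32,
    # then split into 32-byte chunks (the merkle leaves).
    n_chunks = max(1, (len(data) + BYTES_PER_CHUNK - 1) // BYTES_PER_CHUNK)
    padded = data.ljust(n_chunks * BYTES_PER_CHUNK, b"\x00")
    return [padded[i:i + BYTES_PER_CHUNK]
            for i in range(0, len(padded), BYTES_PER_CHUNK)]

chunks = pack_and_chunk(bytes([0xAB]))
# single leaf: 0xAB followed by 31 zero bytes
```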

3) Chunk Limit

We now compute how many 32-byte leaves we need to represent List[uint8, 64]. This list can hold at most 64 bytes of information, so we need 2 leaves/chunks.

Or more precisely, we can compute it as chunk_limit = ceil(64 * 1 / 32) = 2
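That calculation is just ceiling division; for a List[uint8, N] each element packs to one byte (a sketch, with `bytes_per_element` generalised for other basic types):

```python
def chunk_limit(element_limit: int, bytes_per_element: int = 1) -> int:
    # ceil(element_limit * bytes_per_element / 32), in integer arithmetic
    return (element_limit * bytes_per_element + 31) // 32

print(chunk_limit(64))  # 2 chunks for List[uint8, 64]
```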

4) Merkleize with chunk limit

In this step we want to create our merkle root. We create a merkle tree with 2 leaves, since our chunk limit was 2.

The first leaf will be our chunked data (from step 2):

[AB 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00]

The second leaf will be zeroes. Visually this looks like:

     merkle_root
      /          \
  Leaf 0        Leaf 1
[AB 00 ...]   [00 00 ...]
(our data)    (zero padding)

The merkle root is then computed as merkle_root = SHA256(Leaf0 || Leaf1)
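Steps 3 and 4 together can be sketched as follows (spec-style Python, not the Merkleizer implementation; real implementations use precomputed zero-subtree hashes instead of materialising zero leaves):

```python
import hashlib

ZERO_CHUNK = b"\x00" * 32

def merkleize(chunks: list[bytes], chunk_limit: int) -> bytes:
    # Pad the leaves with zero chunks up to the next power of two at or
    # above chunk_limit, then hash pairwise up to the root.
    width = 1
    while width < max(chunk_limit, 1):
        width *= 2
    layer = chunks + [ZERO_CHUNK] * (width - len(chunks))
    while len(layer) > 1:
        layer = [hashlib.sha256(layer[i] + layer[i + 1]).digest()
                 for i in range(0, len(layer), 2)]
    return layer[0]

leaf0 = bytes([0xAB]).ljust(32, b"\x00")
root = merkleize([leaf0], chunk_limit=2)
assert root == hashlib.sha256(leaf0 + ZERO_CHUNK).digest()
```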

5) Mix in Length

In this step, we mix in the actual length of the data. We only had 1 byte 0xAB so the length is 1.

The "final_root" or the output value is SHA256(merkle_root || [0x01, 0x00, ..., 0x00])

Bugs found

In step 4, we were not using chunk_limit to form the merkle tree; the code used the size of the data instead, so the tree had just one leaf.

In step 5, we were using the chunk_limit (64) to compute the final root instead of the length of the data (1).
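Putting the pieces together, the two bugs can be illustrated end to end (illustrative Python only; the actual fix lives in Merkleizer.cs, and the buggy case is modelled here as merkleizing over a single leaf and mixing in the limit):

```python
import hashlib

ZERO_CHUNK = b"\x00" * 32

def merkleize(chunks, chunk_limit):
    # Pad to the next power of two >= chunk_limit, hash pairwise to the root.
    width = 1
    while width < max(chunk_limit, 1):
        width *= 2
    layer = chunks + [ZERO_CHUNK] * (width - len(chunks))
    while len(layer) > 1:
        layer = [hashlib.sha256(layer[i] + layer[i + 1]).digest()
                 for i in range(0, len(layer), 2)]
    return layer[0]

def mix_in_length(root, length):
    # The length is serialized as a 32-byte little-endian integer.
    return hashlib.sha256(root + length.to_bytes(32, "little")).digest()

leaf = bytes([0xAB]).ljust(32, b"\x00")

# Correct: merkleize over the chunk limit (2), mix in the actual length (1).
correct = mix_in_length(merkleize([leaf], 2), 1)

# Bug 1: tree built from the data size alone -> a single leaf, no padding.
# Bug 2: the limit (64) mixed in instead of the element count (1).
buggy = mix_in_length(merkleize([leaf], 1), 64)

assert correct != buggy
```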

Changes

  • This PR fixes the merkleization logic.

Related to #10793

Types of changes

What types of changes does your code introduce?

  • Bugfix (a non-breaking change that fixes an issue)
  • New feature (a non-breaking change that adds functionality)
  • Breaking change (a change that causes existing functionality not to work as expected)
  • Optimization
  • Refactoring
  • Documentation update
  • Build-related changes
  • Other: Description

Testing

Requires testing

  • Yes
  • No

If yes, did you write tests?

  • Yes
  • No


Documentation

Requires documentation update

  • Yes
  • No

If yes, link the PR to the docs update or the issue with the details labeled docs. Remove if not applicable.

Requires explanation in Release Notes

  • Yes
  • No

If yes, fill in the details here. Remove if not applicable.


Contributor

Copilot AI left a comment


Pull request overview

This PR fixes SSZ list merkleization in Merkleizer.Feed(...) to use the correct chunk limit during merkleization and the actual element count when mixing in length, aligning behavior with SSZ list semantics.

Changes:

  • Refactors Merkleizer.Feed(ReadOnlySpan<T>, limit) overloads to share a single implementation that computes chunkCount from the list limit and mixes in elementCount.
  • Updates list feed behavior to mix in the actual length rather than the limit.
  • Adds a regression test covering List[uint8, 64] with a single byte value.

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 2 comments.

| File | Description |
| --- | --- |
| src/Nethermind/Nethermind.Serialization.Ssz.Test/MerkleTests.cs | Adds a regression test ensuring list feeding uses chunk limit and actual length mix-in. |
| src/Nethermind/Nethermind.Merkleization/Merkleizer.cs | Fixes list Feed logic by computing chunk limit correctly and mixing in actual element count; refactors to reduce duplication. |


