
Conversation

@marioevz (Member) commented Dec 4, 2025

πŸ—’οΈ Description

Adds a tracker issue template for EIPs to better monitor specification and testing progress.
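
For reference, a minimal sketch of the kind of checklist body such a template could render, assembled from the snippets quoted in the review below (section names and ordering are assumptions; the `${{ form.fork }}` and `${{ form.eip_number }}` placeholders appear in the reviewed diff):

```markdown
## Implementation Status

- [ ] Specification implementation merged to `eips/${{ form.fork }}/eip-${{ form.eip_number }}`.
- [ ] Specification updates merged to the corresponding `forks/${{ form.fork }}` branch.
- [ ] Required testing framework modifications implemented.
- [ ] Testing checklist complete.

## Process Status

- [ ] Hive tests passing on all implementations.
```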

πŸ”— Related Issues or PRs

N/A.

βœ… Checklist

  • All: Ran fast tox checks to avoid unnecessary CI fails, see also Code Standards and Enabling Pre-commit Checks:
    uvx tox -e static
  • All: PR title adheres to the repo standard - it will be used as the squash commit message and should start with `type(scope):`.
  • All: Considered adding an entry to CHANGELOG.md.
  • All: Considered updating the online docs in the ./docs/ directory.
  • All: Set appropriate labels for the changes (only maintainers can apply labels).
  • Tests: Ran mkdocs serve locally and verified the auto-generated docs for new tests in the Test Case Reference are correctly formatted.
  • Tests: For PRs implementing a missed test case, update the post-mortem document to add an entry to the list.
  • Ported Tests: All converted JSON/YML tests from ethereum/tests or tests/static have been assigned @ported_from marker.

Cute Animal Picture

Put a link to a cute animal picture inside the parentheses.

codecov bot commented Dec 4, 2025

Codecov Report

βœ… All modified and coverable lines are covered by tests.
βœ… Project coverage is 87.31%. Comparing base (2b7dc12) to head (c19ad02).
⚠️ Report is 3 commits behind head on forks/osaka.

Additional details and impacted files
@@             Coverage Diff              @@
##           forks/osaka    #1847   +/-   ##
============================================
  Coverage        87.31%   87.31%           
============================================
  Files              541      541           
  Lines            32832    32832           
  Branches          3015     3015           
============================================
  Hits             28668    28668           
  Misses            3557     3557           
  Partials           607      607           
| Flag      | Coverage Δ     |
|-----------|----------------|
| unittests | 87.31% <ø> (ø) |

Flags with carried forward coverage won't be shown.

β˜” View full report in Codecov by Sentry.
πŸ“’ Have feedback on the report? Share it here.

πŸš€ New features to boost your workflow:
  • ❄️ Test Analytics: Detect flaky tests, report on failures, and find test suite problems.
  • πŸ“¦ JS Bundle Analysis: Save yourself from yourself by tracking and limiting bundle sizes in JS merges.

@Carsons-Eels (Contributor) left a comment


Some initial thoughts/suggestions

Comment on lines 65 to 67
- [ ] Specification implementation merged to `eips/${{ form.fork }}/eip-${{ form.eip_number }}`.
- [ ] Specification updates merged to the corresponding `forks/${{ form.fork }}` branch.
- [ ] Required testing framework modifications implemented.

Contributor:

I think it's important to document cases where we, either individually or as a team, have had to make architectural decisions about an implementation that are not obvious/straightforward, and to explain why those choices were made. I'm thinking of cases where we structurally altered the implementation due to code review comments, or where there were multiple good options for implementation.

Suggested change
- [ ] Specification implementation merged to `eips/${{ form.fork }}/eip-${{ form.eip_number }}`.
- [ ] Specification updates merged to the corresponding `forks/${{ form.fork }}` branch.
- [ ] Required testing framework modifications implemented.
- [ ] Specification implementation merged to `eips/${{ form.fork }}/eip-${{ form.eip_number }}`.
- [ ] Specification updates merged to the corresponding `forks/${{ form.fork }}` branch.
- [ ] Important architectural implementation choices documented.
- [ ] Required testing framework modifications implemented.

Member Author:

I paraphrased a bit:

> EIP updates proposed in case of architectural choices surfaced during implementation.

I used the word "proposed" here because we cannot guarantee a change makes it into the EIP (although in practice the EIP process welcomes implementation feedback very well), and to avoid implying that we have to force the change, since that is outside of the EIP process.
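
Applied to the block this thread targets, the item would presumably slot in like so (placement is an assumption):

```markdown
- [ ] Specification implementation merged to `eips/${{ form.fork }}/eip-${{ form.eip_number }}`.
- [ ] EIP updates proposed in case of architectural choices surfaced during implementation.
- [ ] Specification updates merged to the corresponding `forks/${{ form.fork }}` branch.
- [ ] Required testing framework modifications implemented.
```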

Comment on lines 70 to 71
- [ ] Testing checklist complete
(https://github.com/ethereum/execution-specs/blob/HEAD/docs/writing_tests/checklist_templates/eip_testing_checklist_template.md).

Contributor:

Suggested change
- [ ] Testing checklist complete
(https://github.com/ethereum/execution-specs/blob/HEAD/docs/writing_tests/checklist_templates/eip_testing_checklist_template.md).
- [ ] [Testing checklist](https://github.com/ethereum/execution-specs/blob/HEAD/docs/writing_tests/checklist_templates/eip_testing_checklist_template.md) complete
.

@spencer-tb (Contributor) left a comment

I think adding a phase 1 (CFI'd) and a phase 2 (SFI'd) to the Process Status section could be nice! Phase 1 would cover initial tests and inclusion in the devnet. Phase 2 would cover spec/test coverage reviewed by 2 EELS / 2 EEST / 1 EIP author, extra tests added and passing, and the EIP successful in the first testnet.

We could add a regressions section too, for significant changes: whether issues come up during a devnet, missed test coverage, or even whether the EIP is pulled out of the fork (and the reasons why).

Let me know what you think! Happy to merge now and get the process going nonetheless.
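
A sketch of what the phased section could look like, assembled from the comment above (the exact wording of each item is an assumption):

```markdown
## Process Status

### Phase 1 (CFI'd)

- [ ] Initial tests implemented.
- [ ] EIP included in the devnet.

### Phase 2 (SFI'd)

- [ ] Spec/test coverage reviewed by 2 EELS, 2 EEST, and 1 EIP author.
- [ ] Extra tests added and passing.
- [ ] EIP successful in the first testnet.
```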


## Process Status

- [ ] Hive tests passing on all implementations.

Contributor:

It might be nice to make this more granular.

Suggested change
- [ ] Hive tests passing on all implementations.
- [ ] Hive tests passing on all implementations:
| Client | Engine | RLP | Sync |
|-------------|--------|-----|------|
| Geth | [ ] | [ ] | [ ] |
| Nethermind | [ ] | [ ] | [ ] |
| Besu | [ ] | [ ] | [ ] |
| Erigon | [ ] | [ ] | [ ] |
| Reth | [ ] | [ ] | [ ] |
| Nimbus-EL | [ ] | [ ] | [ ] |
| Ethrex | [ ] | [ ] | [ ] |

Member Author:

I feel like this is too much; we could instead create another template.

But I'm not completely convinced we should include even this single bullet point, because it describes such a fluctuating state. E.g., we could introduce new tests that break implementations, or a client might have a regression. Would either of those mean we have to come back here and uncheck this checkbox?

Member Author:

After thinking about it, I feel we can make this point a bit more flexible:

> Hive tests passing on at least two implementations.

Two implementations feels to me like the right balance to mark this as complete. My reasoning is:

  • Removing this requirement altogether would leave our specs/tests prone to error, simply because the wording of an EIP can sometimes allow multiple interpretations, and these only surface after clients consume the generated tests.
  • Requiring only a single passing implementation is too few, since the second implementation often brings better confirmation that the specs are correctly implemented.
  • While it's true that a third or fourth implementation can still uncover issues, the problems discovered at that stage go beyond the EIP itself; rather, they are part of the completion of the fork.

cc @fselmo @danceratopz

Contributor:

I'm good with "at least two", I think. Are we talking about a full implementation here, where we believe the specs are now correct, we have enough tests for coverage, and at least two clients pass all of the tests? Or is this still within the development cycle?

There will be cases where the specs are implemented, some clients implement them as well, and you start going back and forth toward a consensus on what is implemented correctly and what is not. Then you tweak the specs and tests, tweak the client implementations, and so on... so:

  • At what point in the process are we here?
  • With late additions to the EIP, do we then come back here and uncheck this until it's implemented again? I imagine this is how it works, but I'm just curious.

Also, if we go with this approach, we should have a section that tracks which clients have implemented it so we can signal this; ideally we would link to the branch in the relevant repos.
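
For example, something like the following (client list borrowed from the table suggested earlier; the branch column is a hypothetical placeholder):

```markdown
## Client Implementations

| Client     | Implemented | Branch |
|------------|-------------|--------|
| Geth       | [ ]         | TBD    |
| Nethermind | [ ]         | TBD    |
| Besu       | [ ]         | TBD    |
| Erigon     | [ ]         | TBD    |
| Reth       | [ ]         | TBD    |
| Nimbus-EL  | [ ]         | TBD    |
| Ethrex     | [ ]         | TBD    |
```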

> [!IMPORTANT]
> A specifications specialist and a testing specialist should ideally share ownership of the EIP.

- [ ] Add the issue to the target fork milestone if applicable (i.e., the EIP is at least in the (CFI stage)[https://eips.ethereum.org/EIPS/eip-7723#considered-for-inclusion]).

Contributor:

Broken link styling, I think.

Suggested change
- [ ] Add the issue to the target fork milestone if applicable (i.e., the EIP is at least in the (CFI stage)[https://eips.ethereum.org/EIPS/eip-7723#considered-for-inclusion]).
- [ ] Add the issue to the target fork milestone if applicable (i.e., the EIP is at least in the [CFI stage](https://eips.ethereum.org/EIPS/eip-7723#considered-for-inclusion)).


