79 changes: 79 additions & 0 deletions .github/ISSUE_TEMPLATE/eip-tracker.yaml
@@ -0,0 +1,79 @@
name: "EIP Implementation Tracker"
description: "Track specification and testing progress for an EIP"
title: "EIP ${{ form.eip_number }} Progress Tracker"
labels:
  - A-spec-specs
  - A-spec-tests
  - C-eip
  - C-test

body:
  - type: input
    id: eip_number
    attributes:
      label: "EIP Number"
      description: "Enter the EIP number (digits only)."
      placeholder: "e.g., 8024"
    validations:
      required: true

  - type: input
    id: eip_title
    attributes:
      label: "EIP Title"
      description: "Copy the title from the EIP."
      placeholder: "e.g., Backwards compatible SWAPN, DUPN, EXCHANGE"
    validations:
      required: true

  - type: dropdown
    id: fork
    attributes:
      label: "Fork"
      description: |
        Specify the target fork **only if the EIP has reached the CFI stage**.
        More info: https://eips.ethereum.org/EIPS/eip-7723#considered-for-inclusion
      options:
        - TBD
        - amsterdam
        - bogota
    validations:
      required: true

  - type: markdown
    attributes:
      value: |
        ## EIP ${{ form.eip_number }}: ${{ form.eip_title }}

        **Link:** https://eips.ethereum.org/EIPS/eip-${{ form.eip_number }}

        ## Target Fork

        Fork **${{ form.fork }}**

        - [ ] Add issue to fork milestone (if applicable).

        ## Ownership

        Owner(s): **TBD**

        **Important note:** A specifications specialist and a testing specialist should ideally share ownership of the EIP.

        ## Specification + Testing Status

        - [ ] Testing complexity assessed and documented.
        - [ ] Specification implementation merged to `eips/${{ form.fork }}/eip-${{ form.eip_number }}`.
        - [ ] Specification updates merged to the corresponding `forks/${{ form.fork }}` branch.
        - [ ] Required testing framework modifications implemented.
Contributor:

I think it's important to document when we, either individually or as a team, have had to make non-obvious architectural decisions about an implementation, and to explain why those choices were made. I'm thinking of cases where we have structurally altered the implementation due to code-review comments, or where there were multiple good options for the implementation.

Suggested change
- [ ] Specification implementation merged to `eips/${{ form.fork }}/eip-${{ form.eip_number }}`.
- [ ] Specification updates merged to the corresponding `forks/${{ form.fork }}` branch.
- [ ] Required testing framework modifications implemented.
- [ ] Specification implementation merged to `eips/${{ form.fork }}/eip-${{ form.eip_number }}`.
- [ ] Specification updates merged to the corresponding `forks/${{ form.fork }}` branch.
- [ ] Important architectural implementation choices documented
- [ ] Required testing framework modifications implemented.

Member Author:

I paraphrased a bit:

EIP updates proposed in case of architectural choices surfaced during implementation.

I used the word "proposed" because we cannot guarantee a change will make it into the EIP (though in practice the EIP process usually welcomes implementation feedback). The wording avoids implying that we have to force a change through, since that would be outside the EIP process.

        - [ ] Test suite implemented.
        - [ ] Full code coverage for all changes.
        - [ ] Testing checklist complete (https://github.com/ethereum/execution-specs/blob/HEAD/docs/writing_tests/checklist_templates/eip_testing_checklist_template.md).
Contributor:

Suggested change
- [ ] Testing checklist complete
(https://github.com/ethereum/execution-specs/blob/HEAD/docs/writing_tests/checklist_templates/eip_testing_checklist_template.md).
- [ ] [Testing checklist](https://github.com/ethereum/execution-specs/blob/HEAD/docs/writing_tests/checklist_templates/eip_testing_checklist_template.md) complete.

        - [ ] No regressions or failures in tests from prior forks (including static tests).
        - [ ] Hardening session completed.
        - [ ] Benchmarking performed and results documented.

        ## Process Status

        - [ ] Hive tests passing on all implementations.
Contributor:

It might be nice to make this more granular.

Suggested change
- [ ] Hive tests passing on all implementations.
- [ ] Hive tests passing on all implementations:
| Client | Engine | RLP | Sync |
|-------------|--------|-----|------|
| Geth | [ ] | [ ] | [ ] |
| Nethermind | [ ] | [ ] | [ ] |
| Besu | [ ] | [ ] | [ ] |
| Erigon | [ ] | [ ] | [ ] |
| Reth | [ ] | [ ] | [ ] |
| Nimbus-EL | [ ] | [ ] | [ ] |
| Ethrex | [ ] | [ ] | [ ] |

Member Author:

I feel like this is too much; we could instead create another template.

Even this single bullet point I'm not completely convinced we should include here, because it describes such a fluctuating state. E.g., we could introduce new tests that break implementations, or a client might have a regression. Do either of those mean we have to come back here and uncheck this checkbox?

Member Author:

After thinking about it, I feel we can make this point a bit more flexible:

Hive tests passing on at least two implementations.

Two implementations feels to me like the right balance for marking this as complete. My reasoning:

- Removing this requirement altogether would be error-prone for our specs/tests, simply because the wording of an EIP can sometimes allow multiple interpretations, and these only surface after clients consume the generated tests.
- Requiring only a single passing implementation is too few, since the second implementation often provides better confirmation that the specs are correctly implemented.
- While it's true that a third or fourth implementation can still uncover issues, the problems discovered at that stage are beyond the scope of the EIP; rather, they are part of completing the fork itself.

cc @fselmo @danceratopz

Contributor:

I'm good with "at least two", I think. Are we talking about a full implementation here, where we believe the specs are correct, we have enough test coverage, and at least two clients pass all of the tests? Or is this still within the development cycle?

It will be the case that the specs are implemented, some clients implement them as well, and you start going back and forth toward a consensus on what is implemented correctly and what is not. Then you tweak the specs and tests, tweak the client implementations, and so on. So:

- At what point in the process are we here?
- With late additions to the EIP, do we then come back and uncheck this until it's implemented again? I imagine this is how it works, but I'm just curious.

Also, if we go with this approach, we should have a section that tracks which clients have implemented it so we can signal this; ideally we should link to the relevant branch in each client repo.
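
If such a per-client section were added, one option would be a `checkboxes` element in the same issue form. A minimal sketch (the `id`, labels, and client list here are illustrative, taken from the table proposed above, and are not part of this PR):

```yaml
  # Hypothetical addition to the form body; not part of the reviewed diff.
  - type: checkboxes
    id: client_implementations
    attributes:
      label: "Client Implementations"
      description: "Check each client with a passing implementation; link its branch in a comment."
      options:
        - label: "Geth"
        - label: "Nethermind"
        - label: "Besu"
        - label: "Erigon"
        - label: "Reth"
        - label: "Nimbus-EL"
        - label: "Ethrex"
```

Unlike `[ ]` markers inside a markdown table cell, `checkboxes` options render as interactive checkboxes in the filed issue.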

        - [ ] EIP included in a devnet.