
Commit fab6b4d

authored
revert: typo
1 parent 4ad1456 commit fab6b4d

File tree

1 file changed: +1 -1 lines changed

  • public/content/developers/docs/data-availability


public/content/developers/docs/data-availability/index.md

Lines changed: 1 addition & 1 deletion
@@ -26,7 +26,7 @@ Data availability is also a critical concern for future ["stateless"](/roadmap/s
Data Availability Sampling (DAS) is a way for the network to check that data is available without putting too much strain on any individual node. Each node (including non-staking nodes) downloads some small, randomly selected subset of the total data. Successfully downloading the samples confirms with high confidence that all of the data is available. This relies upon data erasure coding, which expands a given dataset with redundant information (this is done by fitting a function known as a _polynomial_ over the data and evaluating that polynomial at additional points). This allows the original data to be recovered from the redundant data when necessary. A consequence of this construction is that if _any_ of the original data is unavailable, _half_ of the expanded data will be missing! The number of data samples downloaded by each node can be tuned so that it is _extremely_ likely that at least one of the data fragments sampled by each client will be missing _if_ less than half the data is really available.

- DAS will to ensure rollup operators make their transaction data available after [Full Danksharding](/roadmap/danksharding/#what-is-danksharding) has been implemented. Ethereum nodes will randomly sample the transaction data provided in blobs using the redundancy scheme explained above to ensure that all the data exists. The same technique could also be employed to ensure block producers are making all their data available to secure light clients. Similarly, under [proposer-builder separation](/roadmap/pbs), only the block builder would be required to process an entire block - other validators would verify using data availability sampling.
+ DAS will be used to ensure rollup operators make their transaction data available after [Full Danksharding](/roadmap/danksharding/#what-is-danksharding) has been implemented. Ethereum nodes will randomly sample the transaction data provided in blobs using the redundancy scheme explained above to ensure that all the data exists. The same technique could also be employed to ensure block producers are making all their data available to secure light clients. Similarly, under [proposer-builder separation](/roadmap/pbs), only the block builder would be required to process an entire block - other validators would verify using data availability sampling.

### Data availability committees {#data-availability-committees}
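
The polynomial erasure-coding idea described in the context paragraph above can be illustrated with a minimal sketch. This is a toy illustration only, assuming a small prime field, plain Lagrange interpolation, and a hypothetical four-chunk "blob"; it is not the scheme or the parameters Ethereum actually uses.

```python
# Toy polynomial erasure coding: fit a polynomial over the original chunks and
# evaluate it at extra points, so any half of the extended data recovers the rest.
P = 2**31 - 1  # small prime modulus for illustration; real schemes use a much larger field

def lagrange_eval(points, x):
    """Evaluate, at x, the unique polynomial through the given (xi, yi) points (mod P)."""
    total = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

def extend(data):
    """Double the data: the original chunks sit at x = 0..n-1, the redundant ones at x = n..2n-1."""
    points = list(enumerate(data))
    extra = [lagrange_eval(points, x) for x in range(len(data), 2 * len(data))]
    return data + extra

def recover(samples, n):
    """Recover the n original chunks from any n known (index, value) samples of the extension."""
    return [lagrange_eval(samples, x) for x in range(n)]

chunks = [5, 11, 42, 7]                            # hypothetical 4-chunk "blob"
extended = extend(chunks)                          # 8 chunks; any 4 suffice to rebuild everything
survivors = [(i, extended[i]) for i in (1, 3, 4, 6)]  # pretend only these 4 chunks were published
assert recover(survivors, len(chunks)) == chunks
```

The point of the construction is visible in the last two lines: because any half of the extended chunks determines the polynomial, a producer who wants to hide even one original chunk must withhold more than half of the extension.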
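
The sampling argument in the changed line also has a simple back-of-the-envelope form. The sample count below is a hypothetical illustration, not a protocol parameter.

```python
# Illustrative only: if any original data is withheld, at least half of the
# extended chunks must be missing, so each uniformly random sample is available
# with probability at most 1/2. Assume (hypothetically) 30 samples per node.
samples_per_node = 30
p_fooled = 0.5 ** samples_per_node  # chance every sample happens to be available
print(f"chance a node misses the withheld data: {p_fooled:.1e}")  # ~9.3e-10
```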

0 commit comments
