## Why Does Krux Say the Entropy of My Fifty Dice Rolls Does Not Contain 128 Bits of Entropy?

We want Krux to help users understand the concepts involved in the process, present statistics and indicators, and encourage users to experiment and evaluate results. This way, users learn about best practices in key generation. Below, we delve deeper into the concepts of entropy to better support users' knowledge of sovereign self-custody.

## Entropy in Dice Rolls

Rolling dice and collecting the resulting values can be an effective method for generating cryptographic keys due to the inherent randomness and unpredictability of each roll. Each roll of a die produces a random number within a specific range, and when multiple rolls are combined, they create a sequence that is difficult to predict or reproduce. This sequence can be used to generate cryptographic keys that are robust against attacks. By ensuring that the dice rolls are conducted in a controlled and secure environment, and by using a sufficient number of rolls to achieve the desired level of randomness, one can create cryptographic keys that are highly secure and resistant to brute-force attacks or other forms of cryptanalysis.

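To make this concrete, here is a minimal sketch (not Krux's actual code; the function name is illustrative) of how much theoretical entropy a sequence of fair six-sided dice rolls carries:

```python
import math

# Each fair six-sided die roll selects one of 6 equally likely outcomes,
# so it carries log2(6) ~= 2.585 bits of theoretical entropy.
BITS_PER_ROLL = math.log2(6)

def theoretical_bits(num_rolls: int) -> float:
    """Upper bound on the entropy carried by `num_rolls` fair D6 rolls."""
    return num_rolls * BITS_PER_ROLL

# Fifty rolls give 50 * log2(6) ~= 129.25 bits, just above the 128 bits
# of entropy behind a 12-word BIP-39 mnemonic.
print(round(theoretical_bits(50), 2))  # → 129.25
```

This is only the theoretical upper bound: it assumes every roll is perfectly fair and independent, which a measurement of the actual rolls may not confirm.
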
### Entropy Definitions

Entropy, a fundamental concept in various scientific disciplines, is most commonly associated with a state of disorder, randomness, or uncertainty within a system. We use the concepts from [thermodynamic entropy](https://en.wikipedia.org/wiki/Entropy_(classical_thermodynamics)), [Shannon's entropy](https://en.wikipedia.org/wiki/Entropy_(information_theory)), and [cryptographic entropy](https://en.wikipedia.org/wiki/Entropy_(computing)).

- **Thermodynamic entropy** deals with heat and work. It describes how energy is distributed among the particles in a system, reflecting the system's tendency towards equilibrium and maximum disorder.

- **Shannon's entropy**, from information theory, measures the uncertainty or information content in a message or data source. It quantifies the average amount of information produced by a stochastic source of data, indicating how unpredictable the data is.

- **Cryptographic entropy**, crucial in security, refers to the unpredictability and randomness required for secure cryptographic keys and processes. High cryptographic entropy ensures that keys are difficult to predict or reproduce, providing robustness against attacks.

While thermodynamic entropy deals with physical systems, Shannon's entropy focuses on information content, and cryptographic entropy emphasizes security through randomness.

### Measuring Dice Rolls Entropy

Entropy is a theoretical measure and is not directly measurable from a single roll but rather from the probability distribution of outcomes over many rolls. We can use Shannon's formula for theoretical and empirical calculations. Entropy $S$ can be quantified with:

$$ S = -\sum_{i} p_i \log_2 p_i $$

where $p_i$ is the probability of the $i$-th outcome.
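As an illustrative sketch (not the estimator Krux actually ships; the function name is hypothetical), the empirical Shannon entropy of a roll sequence can be computed directly from the observed frequencies. Any deviation from a perfectly uniform sample pulls the measured per-roll entropy below log2(6), which is why fifty real rolls can be reported as carrying fewer than 128 bits:

```python
from collections import Counter
import math

def shannon_entropy_bits(rolls):
    """Empirical Shannon entropy, in bits per roll, of a sequence of die faces."""
    counts = Counter(rolls)
    n = len(rolls)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A perfectly uniform sample reaches the maximum, log2(6) ~= 2.585 bits/roll:
uniform = [1, 2, 3, 4, 5, 6] * 10
print(round(shannon_entropy_bits(uniform), 3))       # → 2.585

# A slightly biased 50-roll sample measures below the maximum, so an
# estimate of total entropy (rolls * bits/roll) falls short of 128 bits:
biased = [1, 1, 2, 3, 4, 5, 6, 1, 2, 3] * 5
print(round(50 * shannon_entropy_bits(biased), 1))   # → 122.3
```

Note that this empirical estimate is itself only an indicator: even a perfectly fair die will rarely produce an exactly uniform 50-roll sample, so a sub-maximal reading does not by itself prove the die is biased.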