docs/src/index.md: 4 additions & 2 deletions
@@ -34,9 +34,11 @@ Thus, any of the implemented [probabilities estimators](@ref estimators) can be
 
 
 !!! tip "There aren't many entropies, really."
-A crucial thing to clarify, is that many quantities that are named as entropies (e.g., permutation entropy ([`entropy_permutation`](@ref)), the wavelet entropy [`entropy_wavelet`](@ref), etc.), are _not really new entropies_. They are in fact new probability estimators. They simply devise a new way to calculate probabilities from data, and then plug those probabilities into formal entropy formulas such as the Shannon entropy. While in Entropies.jl we provide convenience functions like [`entropy_wavelet`](@ref), they really aren't anything more than 3-lines-of-code wrappers that call [`entropy_shannon`](@ref) with the appropriate [`ProbabilityEstimator`](@ref).
+A crucial thing to clarify is that many quantities that are named as entropies (e.g., permutation entropy [`entropy_permutation`](@ref), wavelet entropy [`entropy_wavelet`](@ref), etc.), are _not really new entropies_. They are new probability estimators. They simply devise a new way to calculate probabilities from data, and then plug those probabilities into formal entropy formulas such as the Shannon entropy. The probability estimators are smartly created so that they elegantly highlight important aspects of the data relevant to complexity.
 
-There are only a few exceptions to this rule, which are quantities that are able to compute Shannon entropies via alternate means, without explicitly computing some probability distributions, such as [TODO ADD EXAMPLE].
+While in Entropies.jl we provide convenience functions like [`entropy_wavelet`](@ref), they really aren't anything more than 2-lines-of-code wrappers that call [`entropy_shannon`](@ref) with the appropriate [`ProbabilityEstimator`](@ref).
+
+There are only a few exceptions to this rule, which are quantities that are able to compute Shannon entropies via alternate means, without explicitly computing some probability distributions, such as [`entropy_kraskov`](@ref).
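
To make the "2-lines-of-code wrapper" claim concrete, here is a minimal sketch of the idea: choose a probability estimator, then hand its probabilities to the formal Shannon entropy. The estimator name `WaveletOverlap` and the exact call signatures below are assumptions for illustration, not a quote of the package source; the real names and keyword arguments are in the package docstrings.

```julia
using Entropies

x = rand(1000)                        # an example timeseries

# The convenience wrapper (signature assumed for illustration):
h_wrapper = entropy_wavelet(x)

# ...conceptually reduces to picking a wavelet-based probability estimator
# and plugging the resulting probabilities into Shannon's formula:
est = WaveletOverlap()                # a ProbabilityEstimator (name assumed)
h_explicit = entropy_shannon(x, est)

h_wrapper ≈ h_explicit                # expected to hold with matching defaults
```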