The **density-from-moments** problem seeks to reconstruct an unknown probability density from empirical moment measurements. When the density is supported on the unit interval, the problem is known as the _Hausdorff moment problem_ [gzyl_HausdorffMomentProblem_2010, gzyl_SuperresolutionMaximumEntropy_2017, gzyl_SearchBestApproximant_2007](@cite).
The approach implemented in `DualPerspective.jl` and its submodule `DensityEstimation` is based on the **maximum entropy principle**: among all densities matching the observed moments, select the one with maximal entropy. This yields a robust, principled estimator that converges to the true density as the number of moments increases [borwein_ConvergenceBestEntropy_1991](@cite).
## Formulation
Here we describe the discrete case, where the density is supported on a finite set of locations $x = (x_1, ..., x_n)$. Given an unknown density $p = (p_1, ..., p_n)$ (with $p_j \geq 0$, $\sum_j p_j = 1$), the first $m$ moments are given by
```math
\mu_i = \sum_{j=1}^n x_j^i p_j, \quad i = 1,\ldots, m.
```
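As a concrete illustration of this definition (not the package's API), the moments of a discrete density can be computed directly; the grid `x` and density `p` below are assumed values chosen for the example:

```python
import numpy as np

# Hypothetical example: a discrete density on n = 5 grid points in (0, 1).
x = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
p = np.array([0.1, 0.2, 0.4, 0.2, 0.1])  # nonnegative, sums to 1

# First m = 3 moments: mu_i = sum_j x_j^i * p_j.
m = 3
mu = np.array([np.sum(x**i * p) for i in range(1, m + 1)])
print(mu)  # mu[0] is the mean of the density
```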
In practice, we do not observe the true moments $\mu_i$; instead, we estimate them from samples $\{X^{(k)}\}_{k=1}^N$:
```math
\hat{\mu}_i = \frac{1}{N} \sum_{k=1}^N \left(X^{(k)}\right)^i, \quad i = 1,\ldots, m.
```
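For instance, the empirical estimator above can be computed in a few lines; this is an illustrative sketch with an assumed synthetic sample (here drawn from a Beta distribution so the true moments are known):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.beta(2.0, 5.0, size=10_000)  # assumed sample from an unknown density on [0, 1]

# Empirical moments: hat_mu_i = (1/N) * sum_k (X^(k))^i.
m = 3
mu_hat = np.array([np.mean(X**i) for i in range(1, m + 1)])

# For Beta(2, 5) the true first moment is 2 / (2 + 5), so mu_hat[0]
# should be close to it for a sample this large.
print(mu_hat[0], 2 / 7)
```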
We then solve the **maximum entropy** problem
```math
\min_{p\in\Delta} \left\{
\textstyle\sum_j p_j \log(p_j/q_j)
\mid
\ A p \approx b
\right\}
```
where $\Delta$ is the set of discrete densities of length $n$, $A$ is the **moment operator** with entries $A_{ij} = x_j^i$, the $m$-vector $b$ collects the empirical moments $\hat{\mu}_i$, and $q\in\Delta$ is a reference density, i.e., a _prior_ on the unknown density $p$.
The function `reconstruct` solves this problem and returns the estimated density.