Simulating and fitting determinantally-thinned Poisson point processes.
A determinantally-thinned Poisson point process is a discrete determinantal point process whose underlying state space is a single realization of a Poisson point process defined on some bounded continuous space. This is a repulsive point process, where the repulsion depends on the kernel and average density of points.
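As a sketch of the first step of this construction, the underlying Poisson point process can be simulated by drawing a Poisson number of points and placing them uniformly on the window. A minimal NumPy sketch (the unit-square window and the intensity value are illustrative assumptions, not values from the repository):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters (assumptions, not repository values)
lambdaIntensity = 50.0   # intensity (mean number of points per unit area)
areaWindow = 1.0         # area of the unit-square simulation window

# A Poisson point process on a bounded window: the number of points is
# Poisson distributed, and their locations are independently uniform.
numbPoints = rng.poisson(lambdaIntensity * areaWindow)
xx = rng.uniform(0, 1, numbPoints)
yy = rng.uniform(0, 1, numbPoints)

# This single realization is the finite state space on which the
# determinantal thinning is subsequently defined.
print(numbPoints)
```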
This new type of point process was originally proposed in the paper by Blaszczyszyn and Keeler:
B. Blaszczyszyn and H.P. Keeler, "Determinantal thinning of point processes with network learning applications," 2018. https://arxiv.org/abs/1810.08672
A natural question is whether a determinantally-thinned Poisson point process is itself a determinantal point process. We believe the answer is no, but it is far from obvious.
detpoisson/
├── detpoisson_matlab/ # MATLAB implementation (full workflow)
├── detpoisson_python/ # Python implementation
├── detpoisson_julia/ # Julia implementation
├── detpoisson_r/ # R implementation
├── detpoisson-article.pdf # Paper with full details
└── LICENSE # Apache 2.0
Run DemoDetPoisson in your preferred language to simulate a single determinantally-thinned Poisson point process:
MATLAB:

```matlab
cd detpoisson_matlab
DemoDetPoisson
```

Python:

```shell
cd detpoisson_python
python DemoDetPoisson.py
```

Julia:

```julia
include("detpoisson_julia/DemoDetPoisson.jl")
```

R:

```r
source("detpoisson_r/DemoDetPoisson.R")
```

The MATLAB files can reproduce the numerical results from the paper. The workflow fits determinantally-thinned Poisson point processes to dependently-thinned Poisson point processes such as Matern hard-core point processes.
- Generate training data: Run `SubsetGenerate.m` to create Matern I/II or Triangle hard-core point processes. Saves results to `Subset.mat`.
- Fit the model: Run `SubsetDetPoissonFit.m` to fit a determinantal thinning model using maximum likelihood. Saves parameters to `SubsetFitParam.mat`.
- Validate the fit: Run `SubsetDetPoissonGenerate.m` to compare the fitted model against the original process using nearest-neighbour and contact distributions.
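The fitting step maximizes the standard L-ensemble likelihood: for an observed retained subset S of the underlying points, P(S) = det(L_S) / det(I + L), where L_S is the submatrix of L indexed by S. A hedged Python sketch of evaluating this log-likelihood (the random L matrix below is purely illustrative, not a fitted model):

```python
import numpy as np

def detpoisson_loglik(L, subset):
    """Log-likelihood of observing `subset` (an index array) under the
    L-ensemble: P(S) = det(L_S) / det(I + L)."""
    L_S = L[np.ix_(subset, subset)]
    _, logdet_num = np.linalg.slogdet(L_S)
    _, logdet_den = np.linalg.slogdet(np.eye(L.shape[0]) + L)
    return logdet_num - logdet_den

# Illustrative positive semi-definite L matrix (not from the repository)
rng = np.random.default_rng(0)
A = rng.normal(size=(6, 6))
L = A @ A.T
print(detpoisson_loglik(L, np.array([0, 2, 5])))
```

Because det(I + L) is the sum of det(L_T) over all subsets T, the probabilities of all subsets sum to one, which makes the normalizing constant cheap to evaluate.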
To reproduce the exact results from the paper, set the random seed to one by uncommenting `seedRand=1; rng(seedRand)` in `SubsetGenerate.m` and `SubsetDetPoissonGenerate.m`.
The determinantal point process is defined via an L-ensemble matrix where:
- L matrix: Positive semi-definite matrix defining the point process
- K kernel: Related to L by K = L(I + L)^(-1), with eigenvalues in [0,1]
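This relation can be checked numerically in a few lines; the random L matrix below is illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(5, 5))
L = A @ A.T                      # illustrative positive semi-definite L matrix

# K = L (I + L)^(-1). L and (I + L)^(-1) commute (they share eigenvectors),
# so a linear solve against (I + L) gives the same product.
K = np.linalg.solve(np.eye(5) + L, L)
K = (K + K.T) / 2                # symmetrize away floating-point asymmetry

eigL = np.linalg.eigvalsh(L)
eigK = np.linalg.eigvalsh(K)
print(eigK)                      # all eigenvalues lie in [0, 1)
```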
The L matrix decomposes as L[x,y] = q_x * S[x,y] * q_y where:
- Quality (q_x): Measures the "goodness" of point x (higher = more likely to be retained)
- Similarity (S[x,y]): Measures repulsion between points (higher = less likely both appear)
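A sketch of building such an L matrix from a point pattern, using a Gaussian-type similarity kernel; the specific quality values and length scale are illustrative assumptions, not the repository's fitted parameters:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 8
points = rng.uniform(0, 1, (n, 2))   # underlying point realization (illustrative)

q = np.exp(rng.normal(size=n))       # quality of each point (illustrative choice)
sigma = 0.2                          # similarity length scale (assumed)

# Gaussian similarity: S[x,y] is close to 1 for nearby points, so nearby
# pairs are less likely to both be retained.
dists2 = np.sum((points[:, None, :] - points[None, :, :])**2, axis=-1)
S = np.exp(-dists2 / (2 * sigma**2))

# L[x,y] = q_x * S[x,y] * q_y
L = q[:, None] * S * q[None, :]
```

Since the Gaussian kernel is positive semi-definite and scaling by positive qualities preserves that property, the resulting L is a valid L-ensemble matrix.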
The code uses the algorithm from Hough, Krishnapur, Peres and Virag (popularized by Kulesza and Taskar):
- Eigendecompose L to get eigenvalues and eigenvectors
- Transform eigenvalues: λ_K = λ_L / (1 + λ_L)
- Perform Bernoulli trials with eigenvalues as success probabilities
- Iteratively select points and orthonormalize the remaining subspace
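The steps above can be sketched in Python as follows; this is a hedged re-implementation of the generic eigendecomposition-based sampler, not the repository's own code:

```python
import numpy as np

def sample_dpp_L(L, rng):
    """Sample a subset of indices from the discrete determinantal point
    process defined by the (symmetric, positive semi-definite)
    L-ensemble matrix L."""
    # Step 1: eigendecompose L.
    eigVals, eigVecs = np.linalg.eigh(L)
    # Step 2: transform eigenvalues to those of K and run Bernoulli trials.
    probs = eigVals / (1 + eigVals)
    keep = rng.uniform(size=probs.size) < probs
    V = eigVecs[:, keep]          # basis of the randomly selected eigenspace
    sample = []
    # Steps 3-4: iteratively select points, then restrict the basis to the
    # orthogonal complement of the selected coordinate.
    while V.shape[1] > 0:
        rowProbs = np.sum(V**2, axis=1)
        rowProbs /= rowProbs.sum()
        i = rng.choice(rowProbs.size, p=rowProbs)
        sample.append(i)
        # Zero out row i by subtracting a multiple of a column with V[i, j] != 0.
        j = np.argmax(np.abs(V[i, :]))
        V = V - np.outer(V[:, j] / V[i, j], V[i, :])
        V = np.delete(V, j, axis=1)
        if V.shape[1] > 0:
            V, _ = np.linalg.qr(V)  # re-orthonormalize the remaining columns
    return sorted(sample)

# Illustrative usage on a random positive semi-definite L matrix
rng = np.random.default_rng(4)
A = rng.normal(size=(10, 10))
subsetIndices = sample_dpp_L(A @ A.T, rng)
print(subsetIndices)
```

Once a row is zeroed out it stays zero under the QR step, so each point can be selected at most once and the loop terminates after exactly as many selections as Bernoulli successes.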
- Wireless network modeling: Base station layouts exhibiting repulsion
- Sleep/power schemes: Coordinated transmitter on-off patterns
- Transmission scheduling: Medium access control with geometric dependencies
- B. Blaszczyszyn and H.P. Keeler, "Determinantal thinning of point processes with network learning applications," 2018. https://arxiv.org/abs/1810.08672
- J.B. Hough, M. Krishnapur, Y. Peres, and B. Virag, "Determinantal processes and independence," Probability Surveys, 2006.
- A. Kulesza and B. Taskar, "Determinantal point processes for machine learning," Foundations and Trends in Machine Learning, 2012.
This project is licensed under the Apache License 2.0. See LICENSE for details.