# [Meta Multivariate Samplers](@id meta)
To use univariate slice sampling strategies on targets with more than one dimension, one has to embed them into a "meta" multivariate sampling scheme that relies on univariate sampling elements.
The two most popular approaches for this are Gibbs sampling[^GG1984] and hit-and-run[^BRS1993].

## Random Permutation Gibbs
Gibbs sampling[^GG1984] is a strategy where we sample from the posterior one coordinate at a time, conditioned on the values of all other coordinates.
In practice, one can pick the coordinates in any order, as long as the order does not depend on the state of the chain.
It is generally hard to know a priori which "scan order" is best, but picking coordinates at random tends to work well.
Currently, we only provide the random-permutation scan, which guarantees that all coordinates are updated at least once after $$d$$ transitions.
At the same time, reversibility is maintained by randomly permuting the order in which we go through the coordinates:
```@docs
RandPermGibbs
```
Each call to `AbstractMCMC.step` internally performs $$d$$ Gibbs transitions so that all coordinates are updated.

For example:
```julia
RandPermGibbs(SliceSteppingOut(2.))
```
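
A complete sampling run might then look like the following sketch. This assumes the target implements the `LogDensityProblems` interface and that the sampler is driven through the generic `AbstractMCMC.sample` entry point; the `IsoGauss` target, the sample size, and the initial point are made up for illustration, so refer to the package's general usage documentation for the authoritative interface.

```julia
# Minimal usage sketch (illustrative; see the package's general usage guide).
# The target is a 2-dimensional standard Gaussian defined through LogDensityProblems.
using AbstractMCMC, LogDensityProblems, Random, SliceSampling

struct IsoGauss end  # hypothetical example target

LogDensityProblems.logdensity(::IsoGauss, x) = -sum(abs2, x) / 2
LogDensityProblems.dimension(::IsoGauss)     = 2
LogDensityProblems.capabilities(::Type{IsoGauss}) =
    LogDensityProblems.LogDensityOrder{0}()

samples = sample(
    Random.default_rng(),
    AbstractMCMC.LogDensityModel(IsoGauss()),
    RandPermGibbs(SliceSteppingOut(2.)),
    10_000;
    initial_params = randn(2),
)
```
The same pattern applies to the other meta samplers, including `HitAndRun` below.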

If one wants to use a different slice sampler configuration for each coordinate, one can mix and match by passing a `Vector` of slice samplers, one for each coordinate.
For instance, for a 2-dimensional target:
```julia
RandPermGibbs([SliceSteppingOut(2.; max_proposals=32), SliceDoublingOut(2.)])
```
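
Conceptually, each `AbstractMCMC.step` of `RandPermGibbs` amounts to something like the sketch below: draw a fresh random permutation of the coordinate indices and update each full conditional with a univariate slice move. The shrinkage kernel here is only a stand-in for the univariate samplers above; this is an illustration of the scan order, not the package's actual implementation.

```julia
# Illustrative sketch of a random-permutation Gibbs sweep; not the package's
# internal implementation. `logpi` is the log of the unnormalized target density.
using Random

function slice_update_coordinate!(rng, logpi, x, i; window = 2.0)
    xi0  = x[i]
    logy = logpi(x) + log(rand(rng))        # slice level for the i-th full conditional
    lo   = xi0 - window * rand(rng)         # random initial bracket of length `window`
    hi   = lo + window                      #   containing the current value xi0
    while true
        x[i] = lo + (hi - lo) * rand(rng)   # propose uniformly within the bracket
        logpi(x) > logy && return x         # accept if the proposal lands in the slice
        x[i] < xi0 ? (lo = x[i]) : (hi = x[i])  # otherwise shrink the bracket
    end
end

function rand_perm_gibbs_sweep!(rng, logpi, x)
    for i in randperm(rng, length(x))       # fresh random scan order every sweep
        slice_update_coordinate!(rng, logpi, x, i)
    end
    return x
end

# One sweep on a 2-dimensional standard Gaussian:
state = randn(2)
rand_perm_gibbs_sweep!(Random.default_rng(), x -> -sum(abs2, x) / 2, state)
```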

## Hit-and-Run
Hit-and-run is a simple meta algorithm where we sample over a random 1-dimensional projection of the space.
That is, at each iteration, we sample a random direction
```math
    \theta_n \sim \operatorname{Uniform}(\mathbb{S}^{d-1}),
```
and perform a Markov transition along the 1-dimensional subspace
```math
\begin{aligned}
    \lambda_n &\sim p\left(\lambda \mid x_{n-1}, \theta_n \right) \propto \pi\left( x_{n-1} + \lambda \theta_n \right) \\
    x_{n} &= x_{n-1} + \lambda_n \theta_n,
\end{aligned}
```
where $$\pi$$ is the target unnormalized density.
Applying slice sampling to the 1-dimensional subproblem was popularized by David MacKay[^M2003], and is, technically, also a Gibbs sampler.
(Or is that Gibbs samplers are hit-and-run samplers?)
Unlike `RandPermGibbs`, which only makes axis-aligned moves, `HitAndRun` can choose arbitrary directions, which could be helpful in some cases.
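
To make the transition above concrete, the following is a minimal, self-contained sketch of a single hit-and-run transition, using a basic shrinkage slice move along the sampled direction. It is purely illustrative and is not how `HitAndRun` is implemented internally.

```julia
# Illustrative sketch of one hit-and-run transition; not the package's
# internal implementation. `logpi` is the log of the unnormalized target density.
using LinearAlgebra, Random

function hit_and_run_step(rng, logpi, x; window = 2.0)
    θ = randn(rng, length(x))
    θ ./= norm(θ)                            # θ ~ Uniform(S^{d-1})
    logy = logpi(x) + log(rand(rng))         # slice level: log u + log π(x)
    lo   = -window * rand(rng)               # random initial bracket of length `window`
    hi   = lo + window                       #   containing λ = 0 (the current point)
    while true
        λ = lo + (hi - lo) * rand(rng)       # propose λ uniformly within the bracket
        logpi(x + λ * θ) > logy && return x + λ * θ  # accept if inside the slice
        λ < 0 ? (lo = λ) : (hi = λ)          # otherwise shrink the bracket toward 0
    end
end

# One transition on a 5-dimensional standard Gaussian:
x_next = hit_and_run_step(Random.default_rng(), x -> -sum(abs2, x) / 2, randn(5))
```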

```@docs
HitAndRun
```

This can be used, for example, as follows:

```julia
HitAndRun(SliceSteppingOut(2.))
```
Unlike `RandPermGibbs`, `HitAndRun` does not provide the option of using a separate `unislice` object for each coordinate.
This is a natural limitation of the hit-and-run sampler: it does not operate on individual coordinates.

[^GG1984]: Geman, S., & Geman, D. (1984). Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images. IEEE Transactions on Pattern Analysis and Machine Intelligence, (6).
[^BRS1993]: Bélisle, C. J., Romeijn, H. E., & Smith, R. L. (1993). Hit-and-run algorithms for generating multivariate distributions. Mathematics of Operations Research, 18(2), 255-266.
[^M2003]: MacKay, D. J. (2003). Information theory, inference and learning algorithms. Cambridge University Press.