
Potentially interesting ESS improvement based on mixtures of Gaussians #12

@yebai

Description

The paper below proposes an interesting extension of ESS that handles models with non-Gaussian priors, as well as models with Gaussian priors but informative likelihoods. The authors also design the algorithm with a parallelisable implementation in mind. If it works well in practice, it could be an interesting gradient-free alternative to the HMC/NUTS samplers for low- to mid-dimensional problems.

Nishihara, R., Murray, I., & Adams, R. P. (2014). Parallel MCMC with Generalized Elliptical Slice Sampling. Journal of Machine Learning Research: JMLR, 15(61), 2087–2112.
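For reference, here is a minimal sketch of the standard ESS update (Murray, Adams & MacKay, 2010) that the paper builds on. As I understand it, the generalised version fits a multivariate-t approximation to the target and then runs essentially this update on the Gaussian conditional implied by the scale-mixture representation, with the likelihood replaced by the ratio of the target to the approximation. The names below (`ess_step`, `log_lik`, etc.) are purely illustrative and not an existing package API:

```python
import numpy as np

def ess_step(f, log_lik, chol_sigma, rng):
    """One elliptical slice sampling update under a zero-mean Gaussian prior.

    f          -- current state (1-D array)
    log_lik    -- function returning the log-likelihood of a state
    chol_sigma -- lower Cholesky factor of the prior covariance
    rng        -- numpy random Generator
    """
    # Auxiliary draw from the prior; together with f it defines the ellipse.
    nu = chol_sigma @ rng.standard_normal(f.shape[0])
    # Log-likelihood threshold defining the slice.
    log_y = log_lik(f) + np.log(rng.uniform())
    # Initial angle and bracket on the ellipse.
    theta = rng.uniform(0.0, 2.0 * np.pi)
    theta_min, theta_max = theta - 2.0 * np.pi, theta
    while True:
        # Point on the ellipse through f and nu.
        f_new = f * np.cos(theta) + nu * np.sin(theta)
        if log_lik(f_new) > log_y:
            return f_new
        # Reject: shrink the bracket towards theta = 0 and retry.
        if theta < 0.0:
            theta_min = theta
        else:
            theta_max = theta
        theta = rng.uniform(theta_min, theta_max)
```

If I recall the paper correctly, the parallelism comes from running many such chains at once and periodically re-fitting the approximation's parameters from the states of the other chains, so no gradients are needed anywhere.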

@imurray @robertnishihara
