
Commit b647662

Authored by yebai, github-actions[bot], and ErikQQY
Simpler worked example in README.md (#421)
* Update README.md
* Update README.md
* Update README.md
* Apply suggestions from code review
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
* Update README.md
* Update README.md
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
* Update README.md
* Update README.md
Co-authored-by: Qingyu Qu <[email protected]>
* Update README.md
* Update README.md
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
---------
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: Qingyu Qu <[email protected]>
1 parent 511d6ec commit b647662

File tree: 1 file changed, +71 −36 lines changed


README.md

Lines changed: 71 additions & 36 deletions
@@ -2,61 +2,97 @@

[![CI](https://github.com/TuringLang/AdvancedHMC.jl/actions/workflows/CI.yml/badge.svg)](https://github.com/TuringLang/AdvancedHMC.jl/actions/workflows/CI.yml)
[![DOI](https://zenodo.org/badge/72657907.svg)](https://zenodo.org/badge/latestdoi/72657907)
-[![Coverage Status](https://coveralls.io/repos/github/TuringLang/AdvancedHMC.jl/badge.svg?branch=kx%2Fbug-fix)](https://coveralls.io/github/TuringLang/AdvancedHMC.jl?branch=kx%2Fbug-fix)
+[![Coverage Status](https://coveralls.io/repos/github/TuringLang/AdvancedHMC.jl/badge.svg?branch=main)](https://coveralls.io/github/TuringLang/AdvancedHMC.jl?branch=main)
[![Stable](https://img.shields.io/badge/docs-stable-blue.svg)](https://turinglang.github.io/AdvancedHMC.jl/stable/)
[![Dev](https://img.shields.io/badge/docs-dev-blue.svg)](https://turinglang.github.io/AdvancedHMC.jl/dev/)
[![Aqua QA](https://raw.githubusercontent.com/JuliaTesting/Aqua.jl/master/badge.svg)](https://github.com/JuliaTesting/Aqua.jl)

-AdvancedHMC.jl provides a robust, modular, and efficient implementation of advanced HMC algorithms. An illustrative example of AdvancedHMC's usage is given below. AdvancedHMC.jl is part of [Turing.jl](https://github.com/TuringLang/Turing.jl), a probabilistic programming library in Julia.
-If you are interested in using AdvancedHMC.jl through a probabilistic programming language, please check it out!
+**AdvancedHMC.jl** provides a robust, modular, and efficient implementation of advanced Hamiltonian Monte Carlo (HMC) algorithms in Julia. It is a backend for probabilistic programming languages like [Turing.jl](https://github.com/TuringLang/Turing.jl), but can also be used directly for flexible MCMC sampling when fine-grained control is desired.

-## Hands on AdvancedHMC.jl
+**Key Features**

-Let's see how to sample a Hamiltonian using AdvanedHMC.jl
+- Implementation of state-of-the-art [HMC variants](https://turinglang.org/AdvancedHMC.jl/dev/api/) (e.g., NUTS).
+- The modular design allows for the customization of metrics, Hamiltonian trajectory simulation, and adaptation.
+- Integration with the [LogDensityProblems.jl](https://github.com/tpapp/LogDensityProblems.jl) interface for defining target distributions, and [LogDensityProblemsAD.jl](https://github.com/TuringLang/LogDensityProblemsAD.jl) for supporting automatic differentiation backends.
+- Built upon the [AbstractMCMC.jl](https://github.com/TuringLang/AbstractMCMC.jl) interface for MCMC sampling.
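The modularity described in the bullets can be made concrete with the package's lower-level building blocks, each of which can be swapped independently. A minimal sketch (the component names here, such as `DiagEuclideanMetric`, `Leapfrog`, `HMCKernel`, and `StanHMCAdaptor`, are taken from AdvancedHMC's own worked example elsewhere on this page; treat this as one possible composition, not the only one):

```julia
using AdvancedHMC, LogDensityProblems, ForwardDiff

# A hypothetical 5-dimensional standard-normal target for this sketch
struct SketchTarget end
LogDensityProblems.logdensity(::SketchTarget, θ) = -sum(abs2, θ) / 2
LogDensityProblems.dimension(::SketchTarget) = 5
LogDensityProblems.capabilities(::Type{SketchTarget}) = LogDensityProblems.LogDensityOrder{0}()

# Each component below is a separate, replaceable choice:
metric      = DiagEuclideanMetric(5)                     # mass-matrix structure
hamiltonian = Hamiltonian(metric, SketchTarget(), ForwardDiff)  # target + AD backend
ϵ0          = find_good_stepsize(hamiltonian, randn(5))  # heuristic initial step size
integrator  = Leapfrog(ϵ0)                               # trajectory simulator
kernel      = HMCKernel(Trajectory{MultinomialTS}(integrator, GeneralisedNoUTurn()))
adaptor     = StanHMCAdaptor(MassMatrixAdaptor(metric), StepSizeAdaptor(0.8, integrator))

samples, stats = sample(hamiltonian, kernel, randn(5), 1_000, adaptor, 500)
```

Swapping, say, `Leapfrog` for another integrator or `DiagEuclideanMetric` for a dense metric leaves the rest of the pipeline unchanged, which is the point of the modular design.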
+
+## Installation
+
+AdvancedHMC.jl is a registered Julia package. You can install it using the Julia package manager:

```julia
-using AdvancedHMC, LogDensityProblems, ForwardDiff
-# Define the target distribution using the `LogDensityProblem` interface
+using Pkg
+Pkg.add("AdvancedHMC")
+```
+
+## Quick Start: Sampling a Multivariate Normal
+
+Here's a basic example demonstrating how to sample from a target distribution (a standard multivariate normal) using the No-U-Turn Sampler (NUTS).
+
+```julia
+using AdvancedHMC, AbstractMCMC
+using LogDensityProblems, LogDensityProblemsAD, ADTypes # For defining the target distribution & its gradient
+using ForwardDiff # An example AD backend
+using Random # For initial parameters
+
+# 1. Define the target distribution using the LogDensityProblems interface
struct LogTargetDensity
    dim::Int
end
-# standard multivariate normal distribution
+# Log density of a standard multivariate normal distribution
LogDensityProblems.logdensity(p::LogTargetDensity, θ) = -sum(abs2, θ) / 2
LogDensityProblems.dimension(p::LogTargetDensity) = p.dim
+# Declare that the log density function is defined
function LogDensityProblems.capabilities(::Type{LogTargetDensity})
    return LogDensityProblems.LogDensityOrder{0}()
end

-D = 10; # parameter dimensionality
-initial_θ = rand(D); # initial parameter value
-ℓπ = LogTargetDensity(D)
+# Set parameter dimensionality
+D = 10

-# Set the number of samples to draw and warmup iterations
-n_samples, n_adapts = 2_000, 1_000
-
-# Define a Hamiltonian system
-metric = DiagEuclideanMetric(D)
-hamiltonian = Hamiltonian(metric, ℓπ, ForwardDiff)
-
-# Define a leapfrog solver, with the initial step size chosen heuristically
-initial_ϵ = find_good_stepsize(hamiltonian, initial_θ)
-integrator = Leapfrog(initial_ϵ)
-
-# Define an HMC sampler with the following components
-# - multinomial sampling scheme,
-# - generalised No-U-Turn criteria, and
-# - windowed adaption for step-size and diagonal mass matrix
-kernel = HMCKernel(Trajectory{MultinomialTS}(integrator, GeneralisedNoUTurn()))
-adaptor = StanHMCAdaptor(MassMatrixAdaptor(metric), StepSizeAdaptor(0.8, integrator))
-
-# Run the sampler to draw samples from the specified Gaussian, where
-# - `samples` will store the samples
-# - `stats` will store diagnostic statistics for each sample
-samples, stats = sample(
-    hamiltonian, kernel, initial_θ, n_samples, adaptor, n_adapts; progress=true
+# 2. Wrap the log density function and specify the AD backend.
+# This creates a callable struct that computes the log density and its gradient.
+ℓπ = LogTargetDensity(D)
+model = AdvancedHMC.LogDensityModel(LogDensityProblemsAD.ADgradient(AutoForwardDiff(), ℓπ))
+
+# 3. Set up the HMC sampler
+# - Use the No-U-Turn Sampler (NUTS)
+# - Specify the target acceptance probability (δ) for step size adaptation
+sampler = NUTS(0.8) # Target acceptance probability δ=0.8
+
+# Define the number of adaptation steps and sampling steps
+n_adapts, n_samples = 2_000, 1_000
+
+# 4. Run the sampler!
+# We use the AbstractMCMC.jl interface.
+# Provide the model, sampler, total number of steps, and adaptation steps.
+# An initial parameter vector `initial_θ` is also required.
+initial_θ = randn(D)
+
+samples = AbstractMCMC.sample(
+    Random.default_rng(),
+    model,
+    sampler,
+    n_adapts + n_samples;
+    n_adapts=n_adapts,
+    initial_params=initial_θ,
+    progress=true, # Optional: Show a progress bar
)
+
+# `samples` now contains the MCMC chain. You can analyze it using packages
+# like StatsPlots.jl, ArviZ.jl, or MCMCChains.jl.
```
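As the closing comment notes, the draws can also be summarized directly without a chain-analysis package. A minimal sketch, with two caveats: it assumes all `n_adapts + n_samples` transitions are returned (whether warm-up draws are kept can depend on sampler options), and it reads the position via `t.z.θ`, an internal field of AdvancedHMC's transition type that may change between versions:

```julia
using Statistics

# Drop the adaptation draws, then collect the positions.
# NOTE: `t.z.θ` is an internal detail; prefer MCMCChains.jl for real analyses.
θs = [t.z.θ for t in samples[(n_adapts + 1):end]]

# For a standard multivariate normal target, the elementwise posterior mean
# should be close to the zero vector.
post_mean = mean(θs)
println(post_mean)
```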

+For more advanced usage, please refer to the [docs](https://turinglang.org/AdvancedHMC.jl/dev/get_started/).
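One such refinement concerns the `capabilities` declaration in the quick-start example, which advertises order 0 (log density only), so the gradient must come from an AD wrapper. When an analytic gradient is available, a target can instead declare order 1 via the LogDensityProblems.jl interface and skip AD entirely. A hedged sketch (the type name `AnalyticNormal` is hypothetical; `logdensity_and_gradient` and `LogDensityOrder` are part of LogDensityProblems.jl):

```julia
using LogDensityProblems

struct AnalyticNormal # hypothetical target for this sketch
    dim::Int
end
LogDensityProblems.logdensity(p::AnalyticNormal, θ) = -sum(abs2, θ) / 2
LogDensityProblems.dimension(p::AnalyticNormal) = p.dim
# Order 1: both the log density and its gradient are provided directly,
# so no AD backend wrapper is needed.
function LogDensityProblems.capabilities(::Type{AnalyticNormal})
    return LogDensityProblems.LogDensityOrder{1}()
end
function LogDensityProblems.logdensity_and_gradient(p::AnalyticNormal, θ)
    return -sum(abs2, θ) / 2, -θ # ∇ log p(θ) = -θ for a standard normal
end
```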
+
+## Contributing
+
+Contributions are highly welcome! If you find a bug, have a suggestion, or want to contribute code, please open an issue or pull request.
+
+## License
+
+AdvancedHMC.jl is licensed under the MIT License. See the LICENSE file for details.
+
## Citing AdvancedHMC.jl

If you use AdvancedHMC.jl for your own research, please consider citing the following publication:
@@ -76,7 +112,7 @@ with the following BibTeX entry:
}
```

-If you using AdvancedHMC.jl directly through Turing.jl, please consider citing the following publication:
+If you are using AdvancedHMC.jl directly through Turing.jl, please consider citing the following publication:

Hong Ge, Kai Xu, and Zoubin Ghahramani: "Turing: a language for flexible probabilistic inference.", *International Conference on Artificial Intelligence and Statistics*, 2018. ([abs](http://proceedings.mlr.press/v84/ge18b.html), [pdf](http://proceedings.mlr.press/v84/ge18b/ge18b.pdf))

@@ -96,7 +132,6 @@ with the following BibTeX entry:
## References

1. Neal, R. M. (2011). MCMC using Hamiltonian dynamics. Handbook of Markov chain Monte Carlo, 2(11), 2. ([arXiv](https://arxiv.org/pdf/1206.1901))
-
2. Betancourt, M. (2017). A Conceptual Introduction to Hamiltonian Monte Carlo. [arXiv preprint arXiv:1701.02434](https://arxiv.org/abs/1701.02434).
3. Girolami, M., & Calderhead, B. (2011). Riemann manifold Langevin and Hamiltonian Monte Carlo methods. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 73(2), 123-214. ([arXiv](https://rss.onlinelibrary.wiley.com/doi/full/10.1111/j.1467-9868.2010.00765.x))
4. Betancourt, M. J., Byrne, S., & Girolami, M. (2014). Optimizing the integrator step size for Hamiltonian Monte Carlo. [arXiv preprint arXiv:1411.6669](https://arxiv.org/pdf/1411.6669).
