**AdvancedHMC.jl** provides a robust, modular, and efficient implementation of advanced Hamiltonian Monte Carlo (HMC) algorithms in Julia. It is a backend for probabilistic programming languages like [Turing.jl](https://github.com/TuringLang/Turing.jl), but can also be used directly for flexible MCMC sampling when fine-grained control is desired.

**Key Features**

- Implementation of state-of-the-art [HMC variants](https://turinglang.org/AdvancedHMC.jl/dev/api/) (e.g., NUTS).
- Modular design that allows customization of the metric, Hamiltonian trajectory simulation, and adaptation.
- Integration with the [LogDensityProblems.jl](https://github.com/tpapp/LogDensityProblems.jl) interface for defining target distributions, and [LogDensityProblemsAD.jl](https://github.com/TuringLang/LogDensityProblemsAD.jl) for supporting automatic differentiation backends.
- Built upon the [AbstractMCMC.jl](https://github.com/TuringLang/AbstractMCMC.jl) interface for MCMC sampling.

## Installation

AdvancedHMC.jl is a registered Julia package. You can install it using the Julia package manager:

```julia
using Pkg
Pkg.add("AdvancedHMC")
```

## Quick Start: Sampling a Multivariate Normal

Here's a basic example demonstrating how to sample from a target distribution (a standard multivariate normal) using the No-U-Turn Sampler (NUTS).

```julia
using AdvancedHMC, AbstractMCMC
using LogDensityProblems, LogDensityProblemsAD, ADTypes # For defining the target distribution & its gradient
using ForwardDiff # An example AD backend
using Random # For initial parameters

# 1. Define the target distribution using the LogDensityProblems interface
struct LogTargetDensity
    dim::Int
end

# Log density of a standard multivariate normal distribution
LogDensityProblems.logdensity(p::LogTargetDensity, θ) = -sum(abs2, θ) / 2
LogDensityProblems.dimension(p::LogTargetDensity) = p.dim
LogDensityProblems.capabilities(::Type{LogTargetDensity}) =
    LogDensityProblems.LogDensityOrder{0}()

# Dimensionality of the target distribution
D = 10

# 2. Wrap the log density function and specify the AD backend.
# This creates a callable struct that computes the log density and its gradient.
ℓπ = LogTargetDensity(D)
model = AdvancedHMC.LogDensityModel(LogDensityProblemsAD.ADgradient(AutoForwardDiff(), ℓπ))

# 3. Set up the HMC sampler
# - Use the No-U-Turn Sampler (NUTS)
# - Specify the target acceptance probability (δ) for step size adaptation
sampler = NUTS(0.8) # Target acceptance probability δ = 0.8

# Define the number of adaptation steps and sampling steps
n_adapts, n_samples = 2_000, 1_000

# 4. Run the sampler!
# We use the AbstractMCMC.jl interface.
# Provide the model, sampler, total number of steps, and adaptation steps.
# An initial parameter vector `initial_θ` is also required.
initial_θ = randn(D)

samples = AbstractMCMC.sample(
    Random.default_rng(),
    model,
    sampler,
    n_adapts + n_samples;
    n_adapts = n_adapts,
    initial_params = initial_θ,
    progress = true, # Optional: Show a progress bar
)

# `samples` now contains the MCMC chain. You can analyze it using packages
# like StatsPlots.jl, ArviZ.jl, or MCMCChains.jl.
```
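
The modular pieces highlighted in the feature list (metric, integrator, trajectory, adaptation) can also be assembled by hand through AdvancedHMC's lower-level interface. The sketch below targets the same standard multivariate normal; component names follow recent AdvancedHMC versions and may differ slightly in older releases, so treat it as an illustration rather than a definitive recipe:

```julia
using AdvancedHMC, LogDensityProblems, ForwardDiff

# Same target as above: log density of a standard multivariate normal
struct LogTargetDensity
    dim::Int
end
LogDensityProblems.logdensity(p::LogTargetDensity, θ) = -sum(abs2, θ) / 2
LogDensityProblems.dimension(p::LogTargetDensity) = p.dim
LogDensityProblems.capabilities(::Type{LogTargetDensity}) =
    LogDensityProblems.LogDensityOrder{0}()

D = 10
ℓπ = LogTargetDensity(D)
initial_θ = rand(D)
n_samples, n_adapts = 2_000, 1_000

# Choose a metric and build the Hamiltonian with a ForwardDiff-based gradient
metric = DiagEuclideanMetric(D)
hamiltonian = Hamiltonian(metric, ℓπ, ForwardDiff)

# Find a reasonable initial step size and define a leapfrog integrator
initial_ϵ = find_good_stepsize(hamiltonian, initial_θ)
integrator = Leapfrog(initial_ϵ)

# Multinomial trajectory sampling with generalised no-U-turn termination (NUTS)
kernel = HMCKernel(Trajectory{MultinomialTS}(integrator, GeneralisedNoUTurn()))

# Jointly adapt the mass matrix and step size, Stan-style
adaptor = StanHMCAdaptor(MassMatrixAdaptor(metric), StepSizeAdaptor(0.8, integrator))

# Draw samples, adapting during the first n_adapts steps
samples, stats = sample(hamiltonian, kernel, initial_θ, n_samples, adaptor, n_adapts;
                        progress=true)
```

Swapping any single component, e.g. `DenseEuclideanMetric(D)` for the metric or a different termination criterion in the trajectory, leaves the rest of the pipeline unchanged, which is the point of the modular design.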

For more advanced usage, please refer to the [docs](https://turinglang.org/AdvancedHMC.jl/dev/get_started/).

## Contributing

Contributions are highly welcome! If you find a bug, have a suggestion, or want to contribute code, please open an issue or pull request.

## License

AdvancedHMC.jl is licensed under the MIT License. See the LICENSE file for details.

## Citing AdvancedHMC.jl

If you use AdvancedHMC.jl for your own research, please consider citing the following publication:

Kai Xu, Hong Ge, Will Tebbutt, Mohamed Tarek, Martin Trapp, and Zoubin Ghahramani: "AdvancedHMC.jl: A robust, modular and efficient implementation of advanced HMC algorithms.", *Symposium on Advances in Approximate Bayesian Inference*, 2020.

If you are using AdvancedHMC.jl directly through Turing.jl, please consider citing the following publication:
Hong Ge, Kai Xu, and Zoubin Ghahramani: "Turing: a language for flexible probabilistic inference.", *International Conference on Artificial Intelligence and Statistics*, 2018. ([abs](http://proceedings.mlr.press/v84/ge18b.html), [pdf](http://proceedings.mlr.press/v84/ge18b/ge18b.pdf))
## References
1. Neal, R. M. (2011). MCMC using Hamiltonian dynamics. Handbook of Markov chain Monte Carlo, 2(11), 2. ([arXiv](https://arxiv.org/pdf/1206.1901))
2. Betancourt, M. (2017). A Conceptual Introduction to Hamiltonian Monte Carlo. [arXiv preprint arXiv:1701.02434](https://arxiv.org/abs/1701.02434).
3. Girolami, M., & Calderhead, B. (2011). Riemann manifold Langevin and Hamiltonian Monte Carlo methods. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 73(2), 123-214. ([link](https://rss.onlinelibrary.wiley.com/doi/full/10.1111/j.1467-9868.2010.00765.x))
4. Betancourt, M. J., Byrne, S., & Girolami, M. (2014). Optimizing the integrator step size for Hamiltonian Monte Carlo. [arXiv preprint arXiv:1411.6669](https://arxiv.org/pdf/1411.6669).