We can write the log joint probability function as follows. For simplicity in the following steps, we assume that the `mu` and `tau2` parameters are one-element vectors, and that `x` is the data.
```@example gibbs_example
using AbstractMCMC: AbstractMCMC, LogDensityProblems # hide
using Distributions # hide
using Random # hide
using AbstractPPL: AbstractPPL # hide
using BangBang: constructorof # hide
```

```@example gibbs_example
function log_joint(; mu::Vector{Float64}, tau2::Vector{Float64}, x::Vector{Float64})
    # mu is the mean
    # tau2 is the variance
    # ... (remainder of the function elided here)
end
```
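The body of `log_joint` is elided in this excerpt. As a rough sketch of what such a log joint could look like, assuming a `Normal(0, 1)` prior on the mean, an `InverseGamma(1, 1)` prior on the variance, and Normal observations (all distributional choices here are illustrative assumptions, not necessarily the tutorial's model):

```julia
using Distributions

# Hypothetical log joint for a Normal model with unknown mean and variance.
# The priors below are assumptions for illustration only.
function example_log_joint(; mu::Vector{Float64}, tau2::Vector{Float64}, x::Vector{Float64})
    m, t2 = only(mu), only(tau2)
    logp = logpdf(Normal(0, 1), m)                 # prior on the mean
    logp += logpdf(InverseGamma(1, 1), t2)         # prior on the variance
    logp += sum(logpdf.(Normal(m, sqrt(t2)), x))   # likelihood of the data
    return logp
end
```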
To use the `LogDensityProblems` interface, we create a simple type for this model.
```@example gibbs_example
# Interface 3: constructorof and MHState(state::MHState, logp::Float64)
# This function allows the state to be updated with a new log probability.
# `constructorof` tells BangBang how to rebuild an `MHState` from its fields.
BangBang.constructorof(::Type{<:MHState}) = MHState
function MHState(state::MHState, logp::Float64)
    return MHState(state.params, logp)
end
```
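As a quick illustration of this interface (the parameter values and the `logp` field name below are assumptions for the example), constructing a new state from an old one replaces only the log probability while keeping the parameters:

```julia
state = MHState([0.0, 1.0], -3.2)   # hypothetical params and log probability
new_state = MHState(state, -2.7)    # same params, updated log probability
@assert new_state.params == state.params
@assert new_state.logp == -2.7
```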
It is straightforward to implement samplers according to the `AbstractMCMC` interface, where we can use `LogDensityProblems.logdensity` to read the log probability of the current state.

```@example gibbs_example
"""
    RandomWalkMH{T} <: AbstractMCMC.AbstractSampler
"""
# ... (sampler implementation elided here)
```
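The `RandomWalkMH` implementation is elided in this excerpt. As a hedged sketch (not the tutorial's exact code, and assuming the sampler stores a proposal scale in a `sigma` field and the state exposes `params` and `logp`), an `AbstractMCMC.step` for a random-walk Metropolis-Hastings sampler might look like:

```julia
using Random
using AbstractMCMC, LogDensityProblems

# Sketch only: field names (`sigma`, `params`, `logp`) are assumptions.
function AbstractMCMC.step(
    rng::Random.AbstractRNG,
    logdensity_model::AbstractMCMC.LogDensityModel,
    sampler::RandomWalkMH,
    state::MHState;
    kwargs...,
)
    # Propose a Gaussian random-walk move around the current parameters.
    proposal = state.params .+ sampler.sigma .* randn(rng, length(state.params))
    logp_proposal = LogDensityProblems.logdensity(logdensity_model.logdensity, proposal)

    # Metropolis acceptance: accept with probability min(1, exp(logp' - logp)).
    if log(rand(rng)) < logp_proposal - state.logp
        state = MHState(proposal, logp_proposal)
    end
    return state.params, state
end
```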
Finally, we can implement a very simple Gibbs sampler.