README.md: 7 additions & 7 deletions
@@ -70,10 +70,9 @@ Quantiles
 ### Usage with [`LogDensityProblems.jl`](https://github.com/tpapp/LogDensityProblems.jl)
 
-It can also be used with models defining the [`LogDensityProblems.jl`](https://github.com/tpapp/LogDensityProblems.jl) interface by wrapping it in `AbstractMCMC.LogDensityModel` before passing it to `sample`:
+Alternatively, you can define your model with the [`LogDensityProblems.jl`](https://github.com/tpapp/LogDensityProblems.jl) interface:
 
 ```julia
-using AbstractMCMC: LogDensityModel
 using LogDensityProblems
 
 # Use a struct instead of `typeof(density)` for sake of readability.
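For context, a minimal, self-contained sketch of what the `LogDensityProblems.jl` interface implementation referenced in this hunk might look like; the `density` function and the dimension of 2 are illustrative stand-ins for the model defined earlier in the README, not the README's exact code:

```julia
using LogDensityProblems

# Stand-in for the log density defined earlier in the README (assumed form).
density(θ) = -sum(abs2, θ) / 2

# Use a struct instead of `typeof(density)` for sake of readability.
struct LogTargetDensity end

LogDensityProblems.logdensity(p::LogTargetDensity, θ) = density(θ)
LogDensityProblems.dimension(p::LogTargetDensity) = 2  # assumed dimension
LogDensityProblems.capabilities(::Type{LogTargetDensity}) = LogDensityProblems.LogDensityOrder{0}()
```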
@@ -83,7 +82,7 @@ LogDensityProblems.logdensity(p::LogTargetDensity, θ) = density(θ) # standard
-### Usage with [`LogDensityProblemsAD.jl`](https://github.com/tpapp/LogDensityProblemsAD.jl)
+### Usage with [`LogDensityProblems.jl`](https://github.com/tpapp/LogDensityProblems.jl)
 
-Using our implementation of the `LogDensityProblems.jl` interface from earlier, we can use [`LogDensityProblemsAD.jl`](https://github.com/tpapp/LogDensityProblemsAD.jl) to provide us with the gradient computation used in MALA:
+As above, we can define the model with the LogDensityProblems.jl interface.
+We can implement the gradient of the log density function manually, or use [`LogDensityProblemsAD.jl`](https://github.com/tpapp/LogDensityProblemsAD.jl) to provide us with the gradient computation used in MALA.
+Using our implementation of the `LogDensityProblems.jl` interface above:
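A hedged sketch of the gradient step the new text describes, assuming the `LogTargetDensity` model from the sketch above; the ForwardDiff backend, step size, and starting point are illustrative choices rather than the README's own settings:

```julia
using AbstractMCMC: sample
using AdvancedMH, Distributions, LinearAlgebra
using LogDensityProblemsAD, ForwardDiff  # ForwardDiff supplies the AD backend

# Wrap the model so LogDensityProblemsAD provides the gradient of the log density.
model_with_ad = ADgradient(Val(:ForwardDiff), LogTargetDensity())

# MALA builds its proposal from the gradient at the current state (illustrative settings).
σ² = 0.01
spl = MALA(g -> MvNormal((σ² / 2) .* g, σ² * I))

# Older AdvancedMH/AbstractMCMC versions may require wrapping the model in
# `AbstractMCMC.LogDensityModel` before passing it to `sample` (assumed starting point).
chain = sample(model_with_ad, spl, 10_000; initial_params=[0.0, 1.0])
```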