Turing v0.36 and removal of Pathfinder code (#579)
* Bump Turing.jl to 0.36 and update all mentions of Gibbs to match
* Replace Pathfinder example code with a link to Pathfinder docs
* Update Manifest.toml
* Bump Turing compat to 0.36.2
* Regenerate Manifest.toml with TuringBenchmarking.jl v0.5.8
* Update dependencies
* Host golf.dat file ourselves
---------
Co-authored-by: Penelope Yong <[email protected]>
The `Gibbs` sampler can be used to specify unique automatic differentiation backends for different variable spaces. Please see the [Automatic Differentiation]({{< meta using-turing-autodiff >}}) article for more details.
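As a rough sketch of what that looks like (the model, variable names, step sizes, and backend choices below are illustrative assumptions, not code from this PR), each component sampler can be given its own `adtype`:

```julia
using Turing
using ReverseDiff  # loaded so that AutoReverseDiff() has a backend to dispatch to

# Toy model purely for illustration.
@model function demo(x)
    s ~ InverseGamma(2, 3)
    m ~ Normal(0, sqrt(s))
    x ~ Normal(m, sqrt(s))
end

# One AD backend per component sampler, selected via the `adtype` keyword.
sampler = Gibbs(
    :s => HMC(0.1, 5; adtype=AutoForwardDiff()),
    :m => HMC(0.1, 5; adtype=AutoReverseDiff()),
)

chain = sample(demo(1.5), sampler, 1_000)
```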
`developers/compiler/design-overview/index.qmd` (1 addition, 2 deletions)
@@ -291,8 +291,7 @@ not. Let `md` be an instance of `Metadata`:
 `md.vns`, `md.ranges`, `md.dists`, `md.orders` and `md.flags`.
 -`md.vns[md.idcs[vn]] == vn`.
 -`md.dists[md.idcs[vn]]` is the distribution of `vn`.
--`md.gids[md.idcs[vn]]` is the set of algorithms used to sample `vn`. This is used in
-the Gibbs sampling process.
+-`md.gids[md.idcs[vn]]` is the set of algorithms used to sample `vn`. This was used by the Gibbs sampler. Since Turing v0.36 it is unused and will eventually be deleted.
 -`md.orders[md.idcs[vn]]` is the number of `observe` statements before `vn` is sampled.
 -`md.ranges[md.idcs[vn]]` is the index range of `vn` in `md.vals`.
 -`md.vals[md.ranges[md.idcs[vn]]]` is the linearized vector of values corresponding to `vn`.
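For a rough illustration of those invariants (a sketch only; the toy model is an assumption, and these are DynamicPPL internals whose layout may differ across versions):

```julia
using Turing
using DynamicPPL

# Toy model purely for illustration.
@model function demo()
    s ~ InverseGamma(2, 3)
    m ~ Normal(0, sqrt(s))
end

vi = VarInfo(demo())      # typed VarInfo: one Metadata instance per variable symbol
md = vi.metadata.s        # the Metadata instance holding `s`
vn = md.vns[1]            # the VarName of `s`

md.vns[md.idcs[vn]] == vn          # true, by the invariant above
md.dists[md.idcs[vn]]              # the distribution `s` was drawn from
md.vals[md.ranges[md.idcs[vn]]]    # the (possibly transformed) linearized values of `s`
```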
`tutorials/gaussian-mixture-models/index.qmd` (2 additions, 2 deletions)
@@ -112,7 +112,7 @@ model = gaussian_mixture_model(x);
 ```
 
 We run an MCMC simulation to obtain an approximation of the posterior distribution of the parameters $\mu$ and $w$ and assignments $k$.
-We use a `Gibbs` sampler that combines a [particle Gibbs](https://www.stats.ox.ac.uk/%7Edoucet/andrieu_doucet_holenstein_PMCMC.pdf) sampler for the discrete parameters (assignments $k$) and a Hamiltonion Monte Carlo sampler for the continuous parameters ($\mu$ and $w$).
+We use a `Gibbs` sampler that combines a [particle Gibbs](https://www.stats.ox.ac.uk/%7Edoucet/andrieu_doucet_holenstein_PMCMC.pdf) sampler for the discrete parameters (assignments $k$) and a Hamiltonian Monte Carlo sampler for the continuous parameters ($\mu$ and $w$).
 We generate multiple chains in parallel using multi-threading.
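Concretely, the composition described above might be written as follows under the Turing v0.36 `Gibbs` syntax (a sketch only; the particle count, step size, leapfrog steps, and draw counts are assumptions rather than the tutorial's exact values):

```julia
using Turing

# Particle Gibbs for the discrete assignments k; HMC for the continuous μ and w.
sampler = Gibbs(:k => PG(100), (:μ, :w) => HMC(0.05, 10))

# `model` is the gaussian_mixture_model(x) instance constructed earlier in the tutorial.
# MCMCThreads() runs the chains in parallel using multi-threading.
chains = sample(model, sampler, MCMCThreads(), 150, 4)
```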