Turing is created by <a href="http://mlg.eng.cam.ac.uk/hong/" target="_blank">Hong Ge</a>, and maintained by the <a href="/team" target="_blank">core team of developers</a>. <br>
<img src="https://www.cam.ac.uk/sites/default/files/university-cambridge-logo-black-example-640x132.png" alt="University of Cambridge Logo" class="brands-light-mode-logo">
+<img src="https://www.cam.ac.uk/sites/default/files/university-cambridge-logo-white-example-640x133.png" alt="University of Cambridge Logo Dark" class="brands-dark-mode-logo">
developers/compiler/minituring-contexts/index.qmd (1 addition, 1 deletion)
@@ -294,7 +294,7 @@ Of course, using an MCMC algorithm to sample from the prior is unnecessary and s
The use of contexts also goes far beyond just evaluating log probabilities and sampling. Some examples from Turing are

* `FixedContext`, which fixes some variables to given values and removes them completely from the evaluation of any log probabilities. It powers the `Turing.fix` and `Turing.unfix` functions.
-* `ConditionContext` conditions the model on fixed values for some parameters. It is used by `Turing.condition` and `Turing.uncondition`, i.e. the `model | (parameter=value,)` syntax. The difference between `fix` and `condition` is whether the log probability for the corresponding variable is included in the overall log density.
+* `ConditionContext` conditions the model on fixed values for some parameters. It is used by `Turing.condition` and `Turing.decondition`, i.e. the `model | (parameter=value,)` syntax. The difference between `fix` and `condition` is whether the log probability for the corresponding variable is included in the overall log density.
* `PriorExtractorContext` collects information about what the prior distribution of each variable is.
* `PrefixContext` adds prefixes to variable names, allowing models to be used within other models without variable name collisions.
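To make the `fix`/`condition` distinction in the corrected bullet concrete, here is a minimal sketch; the toy model and values are illustrative, not part of the diffed tutorial:

```julia
using Turing

@model function demo()
    x ~ Normal()
    y ~ Normal(x, 1)
end

# `condition`: y is treated as an observation, so its log probability
# is included in the model's overall log density.
conditioned = demo() | (y = 0.5,)   # the `model | (parameter=value,)` syntax

# `fix`: y is pinned to 0.5 but contributes nothing to the log density.
fixed = Turing.fix(demo(); y = 0.5)

# Both restrictions can be undone again.
Turing.decondition(conditioned)
Turing.unfix(fixed)
```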
developers/inference/implementing-samplers/index.qmd (2 additions, 2 deletions)
@@ -403,11 +403,11 @@ As we promised, all of this hassle of implementing our `MALA` sampler in a way t
It also enables use with Turing.jl through the `externalsampler`, but we need to do one final thing first: we need to tell Turing.jl how to extract a vector of parameters from the "sample" returned in our implementation of `AbstractMCMC.step`. In our case, the "sample" is a `MALASample`, so we just need the following line:

```{julia}
-# Load Turing.jl.
using Turing
+using DynamicPPL

# Overload the `getparams` method for our "sample" type, which is just a vector.
```
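The captured hunk ends before the overload itself; from the surrounding text it is presumably a one-line `getparams` definition. A hedged sketch of what that line looks like follows; the `MALASample` struct is the tutorial's own wrapper type, reproduced here only so the snippet is self-contained, and the exact method path may differ between Turing.jl versions:

```julia
using Turing
using DynamicPPL

# The tutorial's sample type: a thin wrapper around a parameter vector.
struct MALASample{V<:AbstractVector}
    x::V
end

# Overload `getparams` so Turing.jl can extract the parameter vector from
# a MALASample when the sampler is wrapped in `externalsampler`.
Turing.Inference.getparams(::DynamicPPL.Model, sample::MALASample) = sample.x
```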
getting-started/index.qmd (1 addition, 1 deletion)
@@ -92,5 +92,5 @@ The underlying theory of Bayesian machine learning is not explained in detail in
A thorough introduction to the field is [*Pattern Recognition and Machine Learning*](https://www.springer.com/us/book/9780387310732) (Bishop, 2006); an online version is available [here (PDF, 18.1 MB)](https://www.microsoft.com/en-us/research/uploads/prod/2006/01/Bishop-Pattern-Recognition-and-Machine-Learning-2006.pdf).
:::

-The next page on [Turing's core functionality]({{< meta using-turing >}}) explains the basic features of the Turing language.
+The next page on [Turing's core functionality]({{< meta core-functionality >}}) explains the basic features of the Turing language.
From there, you can either look at [worked examples of how different models are implemented in Turing]({{< meta tutorials-intro >}}), or [specific tips and tricks that can help you get the most out of Turing]({{< meta usage-performance-tips >}}).