Commit 84f7953

Tutorials added
1 parent 0668854 commit 84f7953

48 files changed: +36443 additions, -465 deletions

tutorials/00-introduction/Manifest.toml

Lines changed: 2202 additions & 0 deletions
Large diffs are not rendered by default.

Lines changed: 6 additions & 0 deletions

@@ -0,0 +1,6 @@
+[deps]
+Distributions = "31c24e10-a181-5473-b8eb-7969acd0382f"
+MCMCChains = "c7f686f2-ff18-58e9-bc7b-31028e88f75d"
+Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
+StatsPlots = "f3b207a7-027a-5e70-b257-86293d7955fd"
+Turing = "fce5fe82-541a-59a6-adf8-730c64b5f9a0"
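These entries are the package UUIDs pinned by the accompanying Manifest.toml. A minimal sketch of reproducing the tutorial environment locally, assuming the repository is checked out and the path matches the file tree above:

```julia
using Pkg

# Activate the tutorial's project (path taken from the file tree above)
# and install the exact versions recorded in its Manifest.toml.
Pkg.activate("tutorials/00-introduction")
Pkg.instantiate()
```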

tutorials/00-introduction/index.qmd

Lines changed: 33 additions & 55 deletions
@@ -1,35 +1,27 @@
 ---
 title: Introduction to Turing
-permalink: /tutorials/:name/
-redirect_from: tutorials/0-introduction/
-weave_options:
-  error : false
+engine: julia
 ---
 
-## Introduction
+### Introduction
 
 This is the first of a series of tutorials on the universal probabilistic programming language **Turing**.
 
-Turing is a probabilistic programming system written entirely in Julia.
-It has an intuitive modelling syntax and supports a wide range of sampling-based inference algorithms.
+Turing is a probabilistic programming system written entirely in Julia. It has an intuitive modelling syntax and supports a wide range of sampling-based inference algorithms.
 
-Familiarity with Julia is assumed throughout this tutorial.
-If you are new to Julia, [Learning Julia](https://julialang.org/learning/) is a good starting point.
+Familiarity with Julia is assumed throughout this tutorial. If you are new to Julia, [Learning Julia](https://julialang.org/learning/) is a good starting point.
 
-For users new to Bayesian machine learning, please consider more thorough introductions to the field such as [Pattern Recognition and Machine Learning](https://www.springer.com/us/book/9780387310732).
-This tutorial tries to provide an intuition for Bayesian inference and gives a simple example on how to use Turing.
-Note that this is not a comprehensive introduction to Bayesian machine learning.
+For users new to Bayesian machine learning, please consider more thorough introductions to the field such as [Pattern Recognition and Machine Learning](https://www.springer.com/us/book/9780387310732). This tutorial tries to provide an intuition for Bayesian inference and gives a simple example on how to use Turing. Note that this is not a comprehensive introduction to Bayesian machine learning.
 
 ### Coin Flipping Without Turing
 
 The following example illustrates the effect of updating our beliefs with every piece of new evidence we observe.
 
-Assume that we are unsure about the probability of heads in a coin flip.
-To get an intuitive understanding of what "updating our beliefs" is, we will visualize the probability of heads in a coin flip after each observed evidence.
+Assume that we are unsure about the probability of heads in a coin flip. To get an intuitive understanding of what "updating our beliefs" is, we will visualize the probability of heads in a coin flip after each observed evidence.
 
 First, let us load some packages that we need to simulate a coin flip
 
-```julia
+```{julia}
 using Distributions
 
 using Random
@@ -38,57 +30,47 @@ Random.seed!(12); # Set seed for reproducibility
 
 and to visualize our results.
 
-```julia
+```{julia}
 using StatsPlots
 ```
 
-Note that Turing is not loaded here — we do not use it in this example.
-If you are already familiar with posterior updates, you can proceed to the next step.
+Note that Turing is not loaded here — we do not use it in this example. If you are already familiar with posterior updates, you can proceed to the next step.
 
-Next, we configure the data generating model.
-Let us set the true probability that a coin flip turns up heads
+Next, we configure the data generating model. Let us set the true probability that a coin flip turns up heads
 
-```julia
+```{julia}
 p_true = 0.5;
 ```
 
 and set the number of coin flips we will show our model.
 
-```julia
+```{julia}
 N = 100;
 ```
 
-We simulate `N` coin flips by drawing `N` random samples from the Bernoulli distribution with success probability `p_true`.
-The draws are collected in a variable called `data`:
+We simulate `N` coin flips by drawing N random samples from the Bernoulli distribution with success probability `p_true`. The draws are collected in a variable called `data`:
 
-```julia
+```{julia}
 data = rand(Bernoulli(p_true), N);
 ```
 
 Here is what the first five coin flips look like:
 
-```julia
+```{julia}
 data[1:5]
 ```
 
-Next, we specify a prior belief about the distribution of heads and tails in a coin toss.
-Here we choose a [Beta](https://en.wikipedia.org/wiki/Beta_distribution) distribution as prior distribution for the probability of heads.
-Before any coin flip is observed, we assume a uniform distribution $\operatorname{U}(0, 1) = \operatorname{Beta}(1, 1)$ of the probability of heads.
-I.e., every probability is equally likely initially.
+Next, we specify a prior belief about the distribution of heads and tails in a coin toss. Here we choose a [Beta](https://en.wikipedia.org/wiki/Beta_distribution) distribution as prior distribution for the probability of heads. Before any coin flip is observed, we assume a uniform distribution $\operatorname{U}(0, 1) = \operatorname{Beta}(1, 1)$ of the probability of heads. I.e., every probability is equally likely initially.
 
-```julia
+```{julia}
 prior_belief = Beta(1, 1);
 ```
 
 With our priors set and our data at hand, we can perform Bayesian inference.
 
-This is a fairly simple process.
-We expose one additional coin flip to our model every iteration, such that the first run only sees the first coin flip, while the last iteration sees all the coin flips.
-In each iteration we update our belief to an updated version of the original Beta distribution that accounts for the new proportion of heads and tails.
-The update is particularly simple since our prior distribution is a [conjugate prior](https://en.wikipedia.org/wiki/Conjugate_prior).
-Note that a closed-form expression for the posterior (implemented in the `updated_belief` expression below) is not accessible in general and usually does not exist for more interesting models.
+This is a fairly simple process. We expose one additional coin flip to our model every iteration, such that the first run only sees the first coin flip, while the last iteration sees all the coin flips. In each iteration we update our belief to an updated version of the original Beta distribution that accounts for the new proportion of heads and tails. The update is particularly simple since our prior distribution is a [conjugate prior](https://en.wikipedia.org/wiki/Conjugate_prior). Note that a closed-form expression for the posterior (implemented in the `updated_belief` expression below) is not accessible in general and usually does not exist for more interesting models.
 
-```julia
+```{julia}
 function updated_belief(prior_belief::Beta, data::AbstractArray{Bool})
     # Count the number of heads and tails.
     heads = sum(data)
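The hunk above truncates the body of `updated_belief`. For reference, the conjugate update it relies on takes a $\operatorname{Beta}(\alpha, \beta)$ prior to a $\operatorname{Beta}(\alpha + \text{heads}, \beta + \text{tails})$ posterior. A minimal sketch of that computation (the helper name is illustrative, not necessarily the file's exact code):

```julia
using Distributions

# Sketch of the conjugate Beta-Bernoulli update; the tutorial's own
# `updated_belief` implements this same closed-form posterior.
function updated_belief_sketch(prior_belief::Beta, data::AbstractArray{Bool})
    heads = sum(data)
    tails = length(data) - heads
    α, β = params(prior_belief)
    return Beta(α + heads, β + tails)
end
```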
@@ -143,19 +125,19 @@ We now move away from the closed-form expression above.
 We use **Turing** to specify the same model and to approximate the posterior distribution with samples.
 To do so, we first need to load `Turing`.
 
-```julia
+```{julia}
 using Turing
 ```
 
 Additionally, we load `MCMCChains`, a library for analyzing and visualizing the samples with which we approximate the posterior distribution.
 
-```julia
+```{julia}
 using MCMCChains
 ```
 
 First, we define the coin-flip model using Turing.
 
-```julia
+```{julia}
 # Unconditioned coinflip model with `N` observations.
 @model function coinflip(; N::Int)
     # Our prior belief about the probability of heads in a coin toss.
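The hunk cuts off inside the model definition. One plausible completion of the unconditioned model described in the surrounding text is sketched below; the $\operatorname{Beta}(1, 1)$ prior and the `N` i.i.d. Bernoulli observations follow the prose, but the exact body in the file may differ:

```julia
using Turing

# Unconditioned coinflip model with `N` observations (sketch).
@model function coinflip(; N::Int)
    # Our prior belief about the probability of heads in a coin toss.
    p ~ Beta(1, 1)

    # `N` independent coin flips, each Bernoulli with success probability `p`.
    y ~ filldist(Bernoulli(p), N)

    return y
end
```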
@@ -174,15 +156,15 @@ The `@model` macro modifies the body of the Julia function `coinflip` and, e.g.,
 
 Here we defined a model that is not conditioned on any specific observations as this allows us to easily obtain samples of both `p` and `y` with
 
-```julia
+```{julia}
 rand(coinflip(; N))
 ```
 
 The model can be conditioned on some observations with `|`.
 See the [documentation of the `condition` syntax](https://turinglang.github.io/DynamicPPL.jl/stable/api/#Condition-and-decondition) in `DynamicPPL.jl` for more details.
 In the conditioned `model` the observations `y` are fixed to `data`.
 
-```julia
+```{julia}
 coinflip(y::AbstractVector{<:Real}) = coinflip(; N=length(y)) | (; y)
 
 model = coinflip(data);
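As a usage note, the two forms below are equivalent ways of fixing the observations, using only the syntax shown in the hunk (`data` as simulated earlier):

```julia
# Condition via `|` with a NamedTuple of observations...
model_a = coinflip(; N=length(data)) | (; y=data)

# ...or via the convenience method defined in the hunk above.
model_b = coinflip(data)
```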
@@ -192,31 +174,33 @@ After defining the model, we can approximate the posterior distribution by drawi
 In this example, we use a [Hamiltonian Monte Carlo](https://en.wikipedia.org/wiki/Hamiltonian_Monte_Carlo) sampler to draw these samples.
 Other tutorials give more information on the samplers available in Turing and discuss their use for different models.
 
-```julia
+```{julia}
 sampler = NUTS();
 ```
 
 We approximate the posterior distribution with 1000 samples:
 
-```julia
+```{julia}
 chain = sample(model, sampler, 1_000; progress=false);
 ```
 
 The `sample` function and common keyword arguments are explained more extensively in the documentation of [AbstractMCMC.jl](https://turinglang.github.io/AbstractMCMC.jl/dev/api/).
 
 After finishing the sampling process, we can visually compare the closed-form posterior distribution with the approximation obtained with Turing.
 
-```julia
+```{julia}
 histogram(chain)
 ```
 
 Now we can build our plot:
 
-```julia; echo=false
+<!-- ```{julia}
+#| echo=false
+#| output: true
 @assert isapprox(mean(chain, :p), 0.5; atol=0.1) "Estimated mean of parameter p: $(mean(chain, :p)) - not in [0.4, 0.6]!"
-```
+``` -->
 
-```julia
+```{julia}
 # Visualize a blue density plot of the approximate posterior distribution using HMC (see Chain 1 in the legend).
 density(chain; xlim=(0, 1), legend=:best, w=2, c=:blue)
 
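Once `chain` has been drawn, a couple of quick posterior checks (a sketch; `mean(chain, :p)` is the same call used in the commented-out assertion above):

```julia
# Posterior mean of `p`; should land near `p_true = 0.5` for this data.
mean(chain, :p)

# Summary statistics and quantiles from MCMCChains.
describe(chain)
```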
@@ -241,10 +225,4 @@ vline!([p_true]; label="True probability", c=:red)
 
 As we can see, the samples obtained with Turing closely approximate the true posterior distribution.
 Hopefully this tutorial has provided an easy-to-follow, yet informative introduction to Turing's simpler applications.
-More advanced usage is demonstrated in other tutorials.
-
-```julia, echo=false, skip="notebook", tangle=false
-if isdefined(Main, :TuringTutorials)
-    Main.TuringTutorials.tutorial_footer(WEAVE_ARGS[:folder], WEAVE_ARGS[:file])
-end
-```
+More advanced usage is demonstrated in other tutorials.
