Commit 18f8ac4

Changes as per @willtebbutt's suggestions
Co-authored-by: Will Tebbutt <[email protected]>
1 parent 75e0c5b commit 18f8ac4

2 files changed: +9, -7 lines changed

developers/transforms/bijectors/index.qmd

Lines changed: 1 addition & 1 deletion
@@ -46,7 +46,7 @@ However, we specified that the bijection $y = \exp(x)$ maps values of $x \in (-\
 :::
 
 Specifically, one of the primary purposes of Bijectors.jl is used to construct _bijections which map constrained distributions to unconstrained ones_.
-For example, the log-normal distribution which we saw above is constrained: its _support_, i.e. the range over which $p(x) \geq 0$, is $(0, \infty)$.
+For example, the log-normal distribution which we saw above is constrained: its _support_, i.e. the range over which $p(x) > 0$, is $(0, \infty)$.
 However, we can transform that to an unconstrained distribution (the normal distribution) using the transformation $y = \log(x)$.
 
 ::: {.callout-note}

developers/transforms/distributions/index.qmd

Lines changed: 8 additions & 6 deletions
@@ -36,7 +36,7 @@ histogram(samples, bins=50)
 ```
 
 (Calling `Normal()` without any arguments, as we do here, gives us a normal distribution with mean 0 and standard deviation 1.)
-If you want to know the probability of observing any of the samples, you can use `logpdf`:
+If you want to know the log probability density of observing any of the samples, you can use `logpdf`:
 
 ```{julia}
 println("sample: $(samples[1])")
@@ -53,6 +53,8 @@ so we could also have calculated this manually using:
 log(1 / sqrt(2π) * exp(-samples[1]^2 / 2))
 ```
 
+(or more efficiently, `-(samples[1]^2 + log2π) / 2`, where `log2π` is from the [IrrationalConstants.jl package](https://github.com/JuliaMath/IrrationalConstants.jl)).
+
 ## Sampling from a transformed distribution
 
 Say that $x$ is distributed according to `Normal()`, and we want to draw samples of $y = \exp(x)$.
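The simplification this hunk adds can be verified directly. A minimal sketch, using Base's `log(2π)` in place of IrrationalConstants' `log2π` (the values agree to floating-point rounding; the sample value is arbitrary):

```julia
# Compare the naive transcription of the standard normal log-density
# with the simplified form introduced in the diff.
x = 0.7                                    # arbitrary sample value
naive = log(1 / sqrt(2π) * exp(-x^2 / 2))  # direct transcription of the pdf
fast = -(x^2 + log(2π)) / 2                # simplified, more stable form
@assert isapprox(naive, fast; atol=1e-12)
```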
@@ -115,7 +117,7 @@ If we think about the normal distribution as a continuous curve, what the probab
 
 $$\int_a^b p(x) \, \mathrm{d}x.$$
 
-For example, if $(a, b) = (-\infty, \infty)$, then the probability of drawing a sample from the entire distribution is 1.
+For example, if $(a, b) = (-\infty, \infty)$, then the probability of drawing a sample between $a$ and $b$ is 1.
 
 Let's say that the probability density function of the log-normal distribution is $q(y)$.
 Then, the area under the curve between the two points $\exp(a)$ and $\exp(b)$ is:
@@ -163,7 +165,7 @@ println("Expected : $(logpdf(LogNormal(), samples_lognormal[1]))")
 println("Actual : $(logpdf(MyLogNormal(), samples_lognormal[1]))")
 ```
 
-The same process can be applied to _any_ kind of transformation.
+The same process can be applied to any kind of (invertible) transformation.
 If we have some transformation from $x$ to $y$, and the probability density functions of $x$ and $y$ are $p(x)$ and $q(y)$ respectively, then we have a general formula that:
 
 $$q(y) = p(x) \left| \frac{\mathrm{d}x}{\mathrm{d}y} \right|.$$
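The general formula at the end of this hunk can be checked numerically for the log-normal case: with $y = \exp(x)$ we have $\mathrm{d}x/\mathrm{d}y = 1/y$, so $\log q(y) = \log p(\log y) - \log y$. A small sketch, assuming Distributions.jl is available (it is used throughout these documents; the test value is arbitrary):

```julia
using Distributions

# Change of variables for y = exp(x): q(y) = p(log y) * |d(log y)/dy|,
# i.e. log q(y) = logpdf(Normal(), log(y)) - log(y).
y = 2.3   # arbitrary positive value
manual = logpdf(Normal(), log(y)) - log(y)
@assert isapprox(manual, logpdf(LogNormal(), y); atol=1e-12)
```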
@@ -277,7 +279,7 @@ $$\frac{\partial x_2}{\partial y_1} = \frac{1}{2\pi} \left(\frac{1}{1 + (y_2/y_1
 
 Putting together the Jacobian matrix, we have:
 
-$$\mathcal{J} = \begin{pmatrix}
+$$\mathbf{J} = \begin{pmatrix}
 -y_1 x_1 & -y_2 x_1 \\
 -cy_2/y_1^2 & c/y_1 \\
 \end{pmatrix},$$
@@ -286,7 +288,7 @@ where $c = [2\pi(1 + (y_2/y_1)^2)]^{-1}$.
 The determinant of this matrix is
 
 $$\begin{align}
-\det(\mathcal{J}) &= -cx_1 - cx_1(y_2/y_1)^2 \\
+\det(\mathbf{J}) &= -cx_1 - cx_1(y_2/y_1)^2 \\
 &= -cx_1\left[1 + \left(\frac{y_2}{y_1}\right)^2\right] \\
 &= -\frac{1}{2\pi} x_1 \\
 &= -\frac{1}{2\pi}\exp{\left(-\frac{y_1^2}{2}\right)}\exp{\left(-\frac{y_2^2}{2}\right)},
@@ -295,7 +297,7 @@ $$\begin{align}
 Coming right back to our probability density, we have that
 
 $$\begin{align}
-q(y_1, y_2) &= p(x_1, x_2) \cdot |\det(\mathcal{J})| \\
+q(y_1, y_2) &= p(x_1, x_2) \cdot |\det(\mathbf{J})| \\
 &= \frac{1}{2\pi}\exp{\left(-\frac{y_1^2}{2}\right)}\exp{\left(-\frac{y_2^2}{2}\right)},
 \end{align}$$
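The determinant the last few hunks arrive at can also be sanity-checked numerically: assemble $\mathbf{J}$ from its entries as written in the diff and compare against the closed form $-\frac{1}{2\pi}\exp(-y_1^2/2)\exp(-y_2^2/2)$. A sketch at an arbitrary test point (LinearAlgebra is a Julia standard library):

```julia
using LinearAlgebra

# Entries of the Jacobian as given in the diff, at an arbitrary point.
y1, y2 = 0.8, -0.4
x1 = exp(-(y1^2 + y2^2) / 2)           # inverse Box–Muller, first coordinate
c = 1 / (2π * (1 + (y2 / y1)^2))
J = [-y1*x1 -y2*x1; -c*y2/y1^2 c/y1]

# det(J) should equal -(1/2π) exp(-y1²/2) exp(-y2²/2) = -x1/(2π).
expected = -exp(-y1^2 / 2) * exp(-y2^2 / 2) / (2π)
@assert isapprox(det(J), expected; atol=1e-12)
```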

0 commit comments