Commit 2bb57de

Merge branch 'master' into wct/improve-ad-docs
2 parents ff11aed + 0667406 commit 2bb57de

8 files changed: +49 −43 lines

.github/workflows/publish.yml

Lines changed: 24 additions & 13 deletions
@@ -11,6 +11,9 @@ concurrency:
   group: docs
   cancel-in-progress: true
 
+permissions:
+  contents: write
+
 jobs:
   build-and-deploy:
     runs-on: ubuntu-latest
@@ -105,18 +108,26 @@ jobs:
         run: |
           jq -s '.[0] + .[1]' _site/search_original.json fixed_main_search.json > _site/search.json
 
-      - name: Deploy versioned docs
-        uses: JamesIves/github-pages-deploy-action@v4
+      - name: Checkout gh-pages branch
+        uses: actions/checkout@v4
         with:
-          branch: gh-pages
-          folder: _site
-          target-folder: versions/${{ env.version }}
-          clean: false
+          ref: gh-pages
+          path: gh-pages
 
-      - name: Deploy latest docs to root
-        if: env.version == env.LATEST
-        uses: JamesIves/github-pages-deploy-action@v4
-        with:
-          branch: gh-pages
-          folder: _site
-          clean: false
+      - name: Update gh-pages branch
+        run: |
+          # Copy to versions/ subdirectory
+          mkdir -p gh-pages/versions/${{ env.version }}
+          cp -r _site/* gh-pages/versions/${{ env.version }}
+
+          # Find the latest version of the docs and copy that to the root
+          cd gh-pages/versions
+          LATEST_DOCS=$(ls -d * | sort -V | tail -n 1)
+          cp -r $LATEST_DOCS/* ../
+
+          # Commit and push
+          git config --global user.name "github-actions[bot]"
+          git config --global user.email "github-actions[bot]@users.noreply.github.com"
+          git add -A
+          git commit -m "Publish docs @ ${GITHUB_REPOSITORY}@${GITHUB_SHA}"
+          git push
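
The `Update gh-pages branch` step added above picks the newest docs directory with `ls -d * | sort -V | tail -n 1`. The following is an illustrative sketch of just that selection logic, using hypothetical version directory names (nothing below is part of the workflow itself):

```shell
# Sketch of the version-picking logic from the "Update gh-pages branch" step.
# The directory names here are hypothetical examples.
tmp=$(mktemp -d)
cd "$tmp"
mkdir v0.9.2 v0.10.0 v0.10.1

# A plain lexicographic sort would rank v0.9.2 after v0.10.1, since "1" < "9"
# character-wise; `sort -V` compares numeric components instead.
LATEST_DOCS=$(ls -d * | sort -V | tail -n 1)
echo "$LATEST_DOCS"   # prints v0.10.1
```

Because the workflow then copies the winner into the site root, the root of the published site always mirrors the highest-numbered directory under `versions/`.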

README.md

Lines changed: 3 additions & 1 deletion
@@ -8,8 +8,10 @@ This repository is part of [Turing.jl's](https://turinglang.org/) website (i.e.
 
 To get started with the docs website locally, you'll need to have [Quarto](https://quarto.org/docs/download/) installed.
 Make sure you have at least version 1.5 of Quarto installed, as this is required to correctly run [the native Julia engine](https://quarto.org/docs/computations/julia.html#using-the-julia-engine).
+Ideally, you should use Quarto 1.6.31 or later as this version fixes [a bug which causes random number generation between different cells to not be deterministic](https://github.com/TuringLang/docs/issues/533).
+Note that as of October 2024, Quarto 1.6 is a pre-release version, so you may need to install it from source rather than via a package manager like Homebrew.
 
-Once you have the prerequisite installed, you can follow these steps:
+Once you have Quarto installed, you can follow these steps:
 
 1. Clone this repository:
 
_quarto.yml

Lines changed: 5 additions & 5 deletions
@@ -181,13 +181,13 @@ probabilistic-pca: tutorials/11-probabilistic-pca
 gplvm: tutorials/12-gplvm
 seasonal-time-series: tutorials/13-seasonal-time-series
 contexts: tutorials/16-contexts
-minituring: tutorial/14-minituring
+minituring: tutorials/14-minituring
 contributing-guide: tutorials/docs-01-contributing-guide
 using-turing-abstractmcmc: tutorials/docs-04-for-developers-abstractmc-turing
 using-turing-compiler: tutorials/docs-05-for-developers-compiler
 using-turing-interface: tutorials/docs-06-for-developers-interface
 using-turing-variational-inference: tutorials/docs-07-for-developers-variational-inference
-using-turing-advanced: tutorials/tutorials/docs-09-using-turing-advanced
+using-turing-advanced: tutorials/docs-09-using-turing-advanced
 using-turing-autodiff: tutorials/docs-10-using-turing-autodiff
 using-turing-dynamichmc: tutorials/docs-11-using-turing-dynamichmc
 using-turing: tutorials/docs-12-using-turing-guide
@@ -197,7 +197,7 @@ using-turing-external-samplers: tutorials/docs-16-using-turing-external-samplers
 using-turing-implementing-samplers: tutorials/docs-17-implementing-samplers
 using-turing-mode-estimation: tutorials/docs-17-mode-estimation
 usage-probability-interface: tutorials/usage-probability-interface
-usage-custom-distribution: tutorials/tutorials/usage-custom-distribution
-usage-generated-quantities: tutorials/tutorials/usage-generated-quantities
-usage-modifying-logprob: tutorials/tutorials/usage-modifying-logprob
+usage-custom-distribution: tutorials/usage-custom-distribution
+usage-generated-quantities: tutorials/usage-generated-quantities
+usage-modifying-logprob: tutorials/usage-modifying-logprob
 dev-model-manual: tutorials/dev-model-manual

tutorials/01-gaussian-mixture-model/index.qmd

Lines changed: 3 additions & 6 deletions
@@ -142,8 +142,7 @@ let
         # μ[1] and μ[2] can switch places, so we sort the values first.
         chain = Array(chains[:, ["μ[1]", "μ[2]"], i])
         μ_mean = vec(mean(chain; dims=1))
-        # TODO: https://github.com/TuringLang/docs/issues/533
-        # @assert isapprox(sort(μ_mean), μ; rtol=0.1) "Difference between estimated mean of μ ($(sort(μ_mean))) and data-generating μ ($μ) unexpectedly large!"
+        @assert isapprox(sort(μ_mean), μ; rtol=0.1) "Difference between estimated mean of μ ($(sort(μ_mean))) and data-generating μ ($μ) unexpectedly large!"
     end
 end
 ```
@@ -208,8 +207,7 @@ let
         # μ[1] and μ[2] can no longer switch places. Check that they've found the mean
         chain = Array(chains[:, ["μ[1]", "μ[2]"], i])
         μ_mean = vec(mean(chain; dims=1))
-        # TODO: https://github.com/TuringLang/docs/issues/533
-        # @assert isapprox(sort(μ_mean), μ; rtol=0.4) "Difference between estimated mean of μ ($(sort(μ_mean))) and data-generating μ ($μ) unexpectedly large!"
+        @assert isapprox(sort(μ_mean), μ; rtol=0.4) "Difference between estimated mean of μ ($(sort(μ_mean))) and data-generating μ ($μ) unexpectedly large!"
     end
 end
 ```
@@ -349,8 +347,7 @@ let
         # μ[1] and μ[2] can no longer switch places. Check that they've found the mean
         chain = Array(chains[:, ["μ[1]", "μ[2]"], i])
         μ_mean = vec(mean(chain; dims=1))
-        # TODO: https://github.com/TuringLang/docs/issues/533
-        # @assert isapprox(sort(μ_mean), μ; rtol=0.4) "Difference between estimated mean of μ ($(sort(μ_mean))) and data-generating μ ($μ) unexpectedly large!"
+        @assert isapprox(sort(μ_mean), μ; rtol=0.4) "Difference between estimated mean of μ ($(sort(μ_mean))) and data-generating μ ($μ) unexpectedly large!"
     end
 end
 ```

tutorials/09-variational-inference/index.qmd

Lines changed: 2 additions & 3 deletions
@@ -155,9 +155,8 @@ var(x), mean(x)
 #| echo: false
 let
     v, m = (mean(rand(q, 2000); dims=2)...,)
-    # TODO: Fix these as they randomly fail https://github.com/TuringLang/docs/issues/533
-    # @assert isapprox(v, 1.022; atol=0.1) "Mean of s (VI posterior, 1000 samples): $v"
-    # @assert isapprox(m, -0.027; atol=0.03) "Mean of m (VI posterior, 1000 samples): $m"
+    @assert isapprox(v, 1.022; atol=0.1) "Mean of s (VI posterior, 1000 samples): $v"
+    @assert isapprox(m, -0.027; atol=0.03) "Mean of m (VI posterior, 1000 samples): $m"
 end
 ```
 
tutorials/11-probabilistic-pca/index.qmd

Lines changed: 8 additions & 11 deletions
@@ -246,13 +246,10 @@ heatmap(
 We can quantitatively check the absolute magnitudes of the column average of the gap between `mat_exp` and `mat_rec`:
 
 ```{julia}
-#| echo: false
-# let
-#     diff_matrix = mat_exp .- mat_rec
-#     @assert abs(mean(diff_matrix[:, 4])) <= 0.5 #0.327
-#     @assert abs(mean(diff_matrix[:, 5])) <= 0.5 #0.390
-#     @assert abs(mean(diff_matrix[:, 6])) <= 0.5 #0.326
-# end
+diff_matrix = mat_exp .- mat_rec
+for col in 4:6
+    @assert abs(mean(diff_matrix[:, col])) <= 0.5
+end
 ```
 
 We observe that, using posterior mean, the recovered data matrix `mat_rec` has values align with the original data matrix - particularly the same pattern in the first and last 3 gene features are captured, which implies the inference and p-PCA decomposition are successful.
@@ -281,12 +278,12 @@ Another way to put it: 2 dimensions is enough to capture the main structure of t
 A direct question arises from above practice is: how many principal components do we want to keep, in order to sufficiently represent the latent structure in the data?
 This is a very central question for all latent factor models, i.e. how many dimensions are needed to represent that data in the latent space.
 In the case of PCA, there exist a lot of heuristics to make that choice.
-For example, We can tune the number of principal components using empirical methods such as cross-validation based some criteria such as MSE between the posterior predicted (e.g. mean predictions) data matrix and the original data matrix or the percentage of variation explained [3].
+For example, We can tune the number of principal components using empirical methods such as cross-validation based some criteria such as MSE between the posterior predicted (e.g. mean predictions) data matrix and the original data matrix or the percentage of variation explained [^3].
 
 For p-PCA, this can be done in an elegant and principled way, using a technique called *Automatic Relevance Determination* (ARD).
-ARD can help pick the correct number of principal directions by regularizing the solution space using a parameterized, data-dependent prior distribution that effectively prunes away redundant or superfluous features [4].
+ARD can help pick the correct number of principal directions by regularizing the solution space using a parameterized, data-dependent prior distribution that effectively prunes away redundant or superfluous features [^4].
 Essentially, we are using a specific prior over the factor loadings $\mathbf{W}$ that allows us to prune away dimensions in the latent space. The prior is determined by a precision hyperparameter $\alpha$. Here, smaller values of $\alpha$ correspond to more important components.
-You can find more details about this in e.g. [5].
+You can find more details about this in, for example, Bishop (2006) [^5].
 
 ```{julia}
 @model function pPCA_ARD(X)
@@ -383,4 +380,4 @@ It can also thought as a matrix factorisation method, in which $\mathbf{X}=(\mat
 [^2]: Probabilistic PCA by TensorFlow, "https://www.tensorflow.org/probability/examples/Probabilistic_PCA".
 [^3]: Gareth M. James, Daniela Witten, Trevor Hastie, Robert Tibshirani, *An Introduction to Statistical Learning*, Springer, 2013.
 [^4]: David Wipf, Srikantan Nagarajan, *A New View of Automatic Relevance Determination*, NIPS 2007.
-[^5]: Christopher Bishop, *Pattern Recognition and Machine Learning*, Springer, 2006.
+[^5]: Christopher Bishop, *Pattern Recognition and Machine Learning*, Springer, 2006.

tutorials/16-contexts/index.qmd

Lines changed: 1 addition & 1 deletion
@@ -14,7 +14,7 @@ In the [Mini Turing]({{< meta minituring >}}) tutorial we developed a miniature
 
 # Mini Turing expanded, now with more contexts
 
-If you haven't read [Mini Turing]({{< meta minituring >}}t) yet, you should do that first. We start by repeating verbatim much of the code from there. Define the type for holding values for variables:
+If you haven't read [Mini Turing]({{< meta minituring >}}) yet, you should do that first. We start by repeating verbatim much of the code from there. Define the type for holding values for variables:
 
 ```{julia}
 import MacroTools, Random, AbstractMCMC

tutorials/docs-01-contributing-guide/index.qmd

Lines changed: 3 additions & 3 deletions
@@ -27,16 +27,16 @@ Other sections of the website (anything that isn't a package, or a tutorial) –
 
 ### Tests
 
-Turing, like most software libraries, has a test suite. You can run the whole suite the usual Julia way with
+Turing, like most software libraries, has a test suite. You can run the whole suite by running `julia --project=.` from the root of the Turing repository, and then running
 
 ```julia
-Pkg.test("Turing")
+import Pkg; Pkg.test("Turing")
 ```
 
 The test suite subdivides into files in the `test` folder, and you can run only some of them using commands like
 
 ```julia
-Pkg.test("Turing"; test_args=["optim", "hmc", "--skip", "ext"])
+import Pkg; Pkg.test("Turing"; test_args=["optim", "hmc", "--skip", "ext"])
 ```
 
 This one would run all files with "optim" or "hmc" in their path, such as `test/optimisation/Optimisation.jl`, but not files with "ext" in their path. Alternatively, you can set these arguments as command line arguments when you run Julia