
Commit 6c590fe

cleaning
1 parent ac4ae60 commit 6c590fe

9 files changed: +27 −137 lines changed


Project.toml

Lines changed: 0 additions & 22 deletions
This file was deleted.

README.md

Lines changed: 0 additions & 68 deletions
This file was deleted.

tutorials/01-gaussian-mixture-model/Manifest.toml

Lines changed: 4 additions & 4 deletions
```diff
@@ -1172,9 +1172,9 @@ version = "0.10.2"
 
 [[deps.NNlib]]
 deps = ["Adapt", "Atomix", "ChainRulesCore", "GPUArraysCore", "KernelAbstractions", "LinearAlgebra", "Pkg", "Random", "Requires", "Statistics"]
-git-tree-sha1 = "e0cea7ec219ada9ac80ec2e82e374ab2f154ae05"
+git-tree-sha1 = "3d4617f943afe6410206a5294a95948c8d1b35bd"
 uuid = "872c559c-99b0-510c-b3b7-b6c96a88d5cd"
-version = "0.9.16"
+version = "0.9.17"
 
 [deps.NNlib.extensions]
 NNlibAMDGPUExt = "AMDGPU"
@@ -1196,9 +1196,9 @@ version = "1.0.2"
 
 [[deps.NamedArrays]]
 deps = ["Combinatorics", "DataStructures", "DelimitedFiles", "InvertedIndices", "LinearAlgebra", "Random", "Requires", "SparseArrays", "Statistics"]
-git-tree-sha1 = "0ae91efac93c3859f5c812a24c9468bb9e50b028"
+git-tree-sha1 = "c7aab3836df3f31591a2b4167fcd87b741dacfc9"
 uuid = "86f7a689-2022-50b4-a561-43c23ac3c673"
-version = "0.10.1"
+version = "0.10.2"
 
 [[deps.NaturalSort]]
 git-tree-sha1 = "eda490d06b9f7c00752ee81cfa451efe55521e21"
```

tutorials/01-gaussian-mixture-model/index.qmd

Lines changed: 7 additions & 0 deletions
````diff
@@ -112,12 +112,19 @@ setprogress!(false)
 ```
 
 ```{julia}
+#| output: false
 sampler = Gibbs(PG(100, :k), HMC(0.05, 10, :μ, :w))
 nsamples = 100
 nchains = 3
 chains = sample(model, sampler, MCMCThreads(), nsamples, nchains);
 ```
 
+::: {.callout-warning collapse="true"}
+## Sampling With Multiple Threads
+The `sample()` call above assumes that you have at least `nchains` threads available in your Julia instance. If you do not, the multiple chains
+will run sequentially, and you may notice a warning. For more information, see [the Turing documentation on sampling multiple chains.](https://turinglang.org/dev/docs/using-turing/guide/#sampling-multiple-chains)
+:::
+
 ```{julia}
 #| echo: false
 let
````
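The callout added above warns that `MCMCThreads()` needs at least `nchains` threads, and that sampling otherwise falls back to running the chains sequentially. A minimal sketch of that precondition check in plain Base Julia (`can_parallelize` is an illustrative helper, not a Turing API; `nchains` matches the tutorial's value):

```julia
# Hypothetical helper: can the requested chains each get their own thread?
can_parallelize(nthreads, nchains) = nthreads >= nchains

nchains = 3

if can_parallelize(Threads.nthreads(), nchains)
    println("Chains will run in parallel, one per thread.")
else
    println("Fewer than $nchains threads; chains will run sequentially.")
end
```

Julia's thread count is fixed at startup, so the remedy is to relaunch with e.g. `julia --threads 4` rather than changing anything at runtime.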

tutorials/05-linear-regression/Manifest.toml

Lines changed: 10 additions & 10 deletions
```diff
@@ -954,9 +954,9 @@ version = "3.0.0+1"
 
 [[deps.LLVM]]
 deps = ["CEnum", "LLVMExtra_jll", "Libdl", "Preferences", "Printf", "Requires", "Unicode"]
-git-tree-sha1 = "839c82932db86740ae729779e610f07a1640be9a"
+git-tree-sha1 = "065c36f95709dd4a676dc6839a35d6fa6f192f24"
 uuid = "929cbde3-209d-540e-8aea-75f648917ca0"
-version = "6.6.3"
+version = "7.1.0"
 
 [deps.LLVM.extensions]
 BFloat16sExt = "BFloat16s"
@@ -1257,9 +1257,9 @@ version = "0.10.2"
 
 [[deps.NNlib]]
 deps = ["Adapt", "Atomix", "ChainRulesCore", "GPUArraysCore", "KernelAbstractions", "LinearAlgebra", "Pkg", "Random", "Requires", "Statistics"]
-git-tree-sha1 = "e0cea7ec219ada9ac80ec2e82e374ab2f154ae05"
+git-tree-sha1 = "3d4617f943afe6410206a5294a95948c8d1b35bd"
 uuid = "872c559c-99b0-510c-b3b7-b6c96a88d5cd"
-version = "0.9.16"
+version = "0.9.17"
 
 [deps.NNlib.extensions]
 NNlibAMDGPUExt = "AMDGPU"
@@ -1287,9 +1287,9 @@ version = "0.1.5"
 
 [[deps.NamedArrays]]
 deps = ["Combinatorics", "DataStructures", "DelimitedFiles", "InvertedIndices", "LinearAlgebra", "Random", "Requires", "SparseArrays", "Statistics"]
-git-tree-sha1 = "0ae91efac93c3859f5c812a24c9468bb9e50b028"
+git-tree-sha1 = "c7aab3836df3f31591a2b4167fcd87b741dacfc9"
 uuid = "86f7a689-2022-50b4-a561-43c23ac3c673"
-version = "0.10.1"
+version = "0.10.2"
 
 [[deps.NaturalSort]]
 git-tree-sha1 = "eda490d06b9f7c00752ee81cfa451efe55521e21"
@@ -1693,9 +1693,9 @@ version = "1.2.1"
 
 [[deps.SentinelArrays]]
 deps = ["Dates", "Random"]
-git-tree-sha1 = "363c4e82b66be7b9f7c7c7da7478fdae07de44b9"
+git-tree-sha1 = "90b4f68892337554d31cdcdbe19e48989f26c7e6"
 uuid = "91c51154-3ec4-41a3-a24f-3f23e20d615c"
-version = "1.4.2"
+version = "1.4.3"
 
 [[deps.Serialization]]
 uuid = "9e88b42a-f829-5b0c-bbe9-9e923198166b"
@@ -2029,9 +2029,9 @@ version = "0.2.1"
 
 [[deps.UnsafeAtomicsLLVM]]
 deps = ["LLVM", "UnsafeAtomics"]
-git-tree-sha1 = "323e3d0acf5e78a56dfae7bd8928c989b4f3083e"
+git-tree-sha1 = "d9f5962fecd5ccece07db1ff006fb0b5271bdfdd"
 uuid = "d80eeb9a-aca5-4d75-85e5-170c8b632249"
-version = "0.1.3"
+version = "0.1.4"
 
 [[deps.Unzip]]
 git-tree-sha1 = "ca0969166a028236229f63514992fc073799bb78"
```

tutorials/05-linear-regression/index.qmd

Lines changed: 0 additions & 9 deletions
````diff
@@ -63,7 +63,6 @@ size(data)
 The next step is to get our data ready for testing. We'll split the `mtcars` dataset into two subsets, one for training our model and one for evaluating our model. Then, we separate the targets we want to learn (`MPG`, in this case) and standardize the datasets by subtracting each column's means and dividing by the standard deviation of that column. The resulting data is not very familiar looking, but this standardization process helps the sampler converge far easier.
 
 ```{julia}
-#| eval: false
 # Remove the model column.
 select!(data, Not(:Model))
 
@@ -134,22 +133,19 @@ end
 With our model specified, we can call the sampler. We will use the No U-Turn Sampler ([NUTS](https://turinglang.org/stable/docs/library/#Turing.Inference.NUTS)) here.
 
 ```{julia}
-#| eval: false
 model = linear_regression(train, train_target)
 chain = sample(model, NUTS(), 5_000)
 ```
 
 We can also check the densities and traces of the parameters visually using the `plot` functionality.
 
 ```{julia}
-#| eval: false
 plot(chain)
 ```
 
 It looks like all parameters have converged.
 
 ```{julia}
-#| eval: false
 #| echo: false
 let
 ess_df = ess(chain)
@@ -164,7 +160,6 @@ end
 A satisfactory test of our model is to evaluate how well it predicts. Importantly, we want to compare our model to existing tools like OLS. The code below uses the [GLM.jl]() package to generate a traditional OLS multiple regression model on the same data as our probabilistic model.
 
 ```{julia}
-#| eval: false
 # Import the GLM package.
 using GLM
 
@@ -185,7 +180,6 @@ StatsBase.reconstruct!(dt_targets, test_prediction_ols);
 The function below accepts a chain and an input matrix and calculates predictions. We use the samples of the model parameters in the chain starting with sample 200.
 
 ```{julia}
-#| eval: false
 # Make a prediction given an input vector.
 function prediction(chain, x)
 p = get_params(chain[200:end, :, :])
@@ -197,7 +191,6 @@ end
 When we make predictions, we unstandardize them so they are more understandable.
 
 ```{julia}
-#| eval: false
 # Calculate the predictions for the training and testing sets and unstandardize them.
 train_prediction_bayes = prediction(chain, train)
 StatsBase.reconstruct!(dt_targets, train_prediction_bayes)
@@ -215,7 +208,6 @@ $$
 where $y_i$ is the actual value (true MPG) and $\hat{y_i}$ is the predicted value using either OLS or Bayesian linear regression. A lower SSE indicates a closer fit to the data.
 
 ```{julia}
-#| eval: false
 println(
 "Training set:",
 "\n\tBayes loss: ",
@@ -234,7 +226,6 @@ println(
 ```
 
 ```{julia}
-#| eval: false
 #| echo: false
 let
 bayes_train_loss = msd(train_prediction_bayes, trainset[!, target])
````
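The hunks above remove `#| eval: false` so the Bayes-vs-OLS comparison actually runs when the tutorial is rendered. As a reminder of the quantity being compared, here is a minimal sketch of the squared-error loss from the surrounding text (the `sse` helper and the sample vectors are illustrative, not tutorial data; `msd` from StatsBase is the same quantity averaged over observations):

```julia
# SSE: sum over observations of (true - predicted)^2.
sse(y, ŷ) = sum(abs2, y .- ŷ)

y = [21.0, 22.8, 21.4]   # hypothetical true MPG values
ŷ = [20.5, 23.0, 21.0]   # hypothetical predictions

println(sse(y, ŷ))       # ≈ 0.45
```

Because both models are scored on the same fixed dataset, ranking them by SSE or by mean squared deviation gives the same answer.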

tutorials/docs-16-using-turing-external-samplers/Manifest.toml

Lines changed: 6 additions & 6 deletions
```diff
@@ -924,9 +924,9 @@ version = "2.28.2+1"
 
 [[deps.MicroCanonicalHMC]]
 deps = ["AbstractMCMC", "Adapt", "Distributions", "ForwardDiff", "HDF5", "LinearAlgebra", "LogDensityProblems", "LogDensityProblemsAD", "MCMCChains", "MCMCDiagnosticTools", "Markdown", "ProgressMeter", "Random", "Statistics"]
-git-tree-sha1 = "36c2a6c87b7fbb68b3be89d2ea07051aeb4f3690"
+git-tree-sha1 = "e05f95a8256fdf83632f4ea3742f7fb43038a100"
 uuid = "234d2aa0-2291-45f7-9047-6fa6f316b0a8"
-version = "0.1.3"
+version = "0.1.4"
 
 [[deps.MicroCollections]]
 deps = ["Accessors", "BangBang", "InitialValues"]
@@ -961,9 +961,9 @@ version = "7.8.3"
 
 [[deps.NNlib]]
 deps = ["Adapt", "Atomix", "ChainRulesCore", "GPUArraysCore", "KernelAbstractions", "LinearAlgebra", "Pkg", "Random", "Requires", "Statistics"]
-git-tree-sha1 = "e0cea7ec219ada9ac80ec2e82e374ab2f154ae05"
+git-tree-sha1 = "3d4617f943afe6410206a5294a95948c8d1b35bd"
 uuid = "872c559c-99b0-510c-b3b7-b6c96a88d5cd"
-version = "0.9.16"
+version = "0.9.17"
 
 [deps.NNlib.extensions]
 NNlibAMDGPUExt = "AMDGPU"
@@ -985,9 +985,9 @@ version = "1.0.2"
 
 [[deps.NamedArrays]]
 deps = ["Combinatorics", "DataStructures", "DelimitedFiles", "InvertedIndices", "LinearAlgebra", "Random", "Requires", "SparseArrays", "Statistics"]
-git-tree-sha1 = "0ae91efac93c3859f5c812a24c9468bb9e50b028"
+git-tree-sha1 = "c7aab3836df3f31591a2b4167fcd87b741dacfc9"
 uuid = "86f7a689-2022-50b4-a561-43c23ac3c673"
-version = "0.10.1"
+version = "0.10.2"
 
 [[deps.NaturalSort]]
 git-tree-sha1 = "eda490d06b9f7c00752ee81cfa451efe55521e21"
```

tutorials/docs-16-using-turing-external-samplers/index.qmd

Lines changed: 0 additions & 1 deletion
````diff
@@ -102,7 +102,6 @@ This is achieved by simulating the dynamics of a microcanonical Hamiltonian with
 Using this as well as other inference methods outside the Turing ecosystem is as simple as executing the code shown below:
 
 ```{julia}
-#| eval: false
 using MicroCanonicalHMC
 # Create MCHMC sampler
 n_adapts = 1_000 # adaptation steps
````

weave_tutorials.jl

Lines changed: 0 additions & 17 deletions
This file was deleted.

0 commit comments
