
Commit 5ee48ad

fix merge conflicts
2 parents: ca9706c + 6df0913

21 files changed, with 596 additions and 906 deletions.

LICENSE

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 MIT License
 
-Copyright (c) 2018-2024, Hong Ge, the Turing language team
+Copyright (c) 2018-2025, Hong Ge, the Turing language team
 
 Permission is hereby granted, free of charge, to any person obtaining a copy
 of this software and associated documentation files (the "Software"), to deal

Manifest.toml

Lines changed: 175 additions & 173 deletions
Large diffs are not rendered by default.

Project.toml

Lines changed: 4 additions & 2 deletions
@@ -5,6 +5,7 @@ AbstractMCMC = "80f14c24-f653-4e6a-9b94-39d6b0f70001"
 AbstractPPL = "7a57a42e-76ec-4ea3-a279-07e840d6d9cf"
 AdvancedHMC = "0bf59076-c3b1-5ca4-86bd-e02cd72cde3d"
 AdvancedMH = "5b7e9947-ddc0-4b3f-9b55-0d8042f74170"
+AdvancedVI = "b5ca4192-6429-45e5-a2d9-87aec30a685c"
 Bijectors = "76274a88-744f-5084-9051-94815aaf08c4"
 CSV = "336ed68f-0bac-5ca0-87d4-7b16caf5d00b"
 ComponentArrays = "b0b7db55-cfe3-40fc-9ded-d10e2dbeff66"
@@ -13,6 +14,7 @@ DataStructures = "864edb3b-99cc-5e75-8d2d-829cb0a9cfe8"
 DifferentialEquations = "0c46a032-eb83-5123-abaf-570d42b7fbaa"
 Distributed = "8ba89e20-285c-5b6f-9357-94700520ee1b"
 Distributions = "31c24e10-a181-5473-b8eb-7969acd0382f"
+DistributionsAD = "ced4e74d-a319-5a8a-b0ac-84af2272839c"
 DynamicHMC = "bbc10e6e-7c05-544b-b16e-64fede858acb"
 DynamicPPL = "366bfd00-2699-11ea-058f-f148b4cae6d8"
 FillArrays = "1a297f60-69ca-5386-bcde-b61e274b549b"
@@ -36,6 +38,7 @@ Memoization = "6fafb56a-5788-4b4e-91ca-c0cea6611c73"
 MicroCanonicalHMC = "234d2aa0-2291-45f7-9047-6fa6f316b0a8"
 Mooncake = "da2b9cff-9c12-43a0-ae48-6db2b0edb7d6"
 NNlib = "872c559c-99b0-510c-b3b7-b6c96a88d5cd"
+Optimisers = "3bd65402-5787-11e9-1adc-39752487f4e2"
 Optimization = "7f7a1694-90dd-40f0-9382-eb1efda571ba"
 OptimizationNLopt = "4e6fcdb7-1186-4e1f-a706-475e75c168bb"
 OptimizationOptimJL = "36348300-93cb-4f02-beb5-3c3902f8871e"
@@ -51,7 +54,6 @@ StatsFuns = "4c63d2b9-4356-54db-8cca-17b64c39e42c"
 StatsPlots = "f3b207a7-027a-5e70-b257-86293d7955fd"
 Turing = "fce5fe82-541a-59a6-adf8-730c64b5f9a0"
 UnPack = "3a884ed6-31ef-47d7-9d2a-63182c4928ed"
-Zygote = "e88e6eb3-aa80-5325-afca-941959d7151f"
 
 [compat]
-Turing = "0.38"
+Turing = "0.39"
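For anyone mirroring these dependency changes in a local docs environment, a minimal Pkg sketch along the lines of the diff above (package names are taken from the diff; exact versions are left to the resolver):

using Pkg
Pkg.activate(".")                                          # the docs project
Pkg.add(["AdvancedVI", "DistributionsAD", "Optimisers"])   # dependencies added in this commit
Pkg.rm("Zygote")                                           # dependency removed in this commit
Pkg.add(name="Turing", version="0.39")                     # matches the new [compat] bound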

_quarto.yml

Lines changed: 78 additions & 37 deletions
@@ -32,19 +32,19 @@ website:
 text: Team
 right:
 # Current version
-- text: "v0.38"
+- text: "v0.39"
 menu:
 - text: Changelog
 href: https://turinglang.org/docs/changelog.html
 - text: All Versions
 href: https://turinglang.org/docs/versions.html
 tools:
 - icon: twitter
-href: https://x.com/TuringLang
 text: Turing Twitter
+href: https://x.com/TuringLang
 - icon: github
-href: https://github.com/TuringLang/Turing.jl
 text: Turing GitHub
+href: https://github.com/TuringLang
 
 sidebar:
 - text: documentation
@@ -116,12 +116,6 @@ website:
 - developers/inference/variational-inference/index.qmd
 - developers/inference/implementing-samplers/index.qmd
 
-page-footer:
-#background: "#073c44"
-left: |
-Turing is created by <a href="http://mlg.eng.cam.ac.uk/hong/" target="_blank">Hong Ge</a>, and maintained by the <a href="/team" target="_blank">core team of developers</a>. <br>
-© 2025 under the terms of the <a href="https://github.com/TuringLang/Turing.jl/blob/master/LICENCE" target="_blank">MIT License</a>.
-
 right:
 - icon: twitter
 href: https://x.com/TuringLang
@@ -149,43 +143,90 @@ format:
 toc-title: "Table of Contents"
 code-fold: false
 code-overflow: scroll
+include-in-header:
+- text: |
+<style>
+a {
+text-decoration: none;
+}
+a:hover {
+text-decoration: underline;
+}
+</style>
+include-after-body:
+- text: |
+<footer class="custom-footer">
+<div class="footer-container">
+<div class="footer-grid">
+<div class="footer-links-wrapper">
+<div class="footer-column footer-explore">
+<h5>Explore</h5>
+<a href="https://turinglang.org/docs/getting-started/">Get Started</a>
+<a href="https://turinglang.org/docs/tutorials/">Tutorials</a>
+<a href="/library">Libraries</a>
+<a href="/news">News</a>
+<a href="/team">Team</a>
+</div>
+
+<div class="footer-column footer-connect">
+<h5>Connect</h5>
+<a href="https://github.com/TuringLang" target="_blank" rel="noopener"><i class="bi bi-github"></i> GitHub</a>
+<a href="https://x.com/TuringLang" target="_blank" rel="noopener"><i class="bi bi-twitter"></i> Twitter</a>
+<a href="https://julialang.slack.com/archives/CCYDC34A0" target="_blank" rel="noopener"><i class="bi bi-slack"></i> Slack</a>
+<a href="https://discourse.julialang.org/c/domain/probprog/48" target="_blank" rel="noopener"><i class="bi bi-chat-dots"></i> Discourse</a>
+</div>
+</div>
+
+<div class="footer-column footer-brands">
+<h5>Supported by leading researchers</h5>
+<p>Turing.jl is developed by researchers and engineers at the following research institutions.</p>
+<div class="logo-grid">
+<a href="https://mlg.eng.cam.ac.uk/" class="partner-logo" target="_blank" rel="noopener">
+<img src="https://www.cam.ac.uk/sites/default/files/university-cambridge-logo-black-example-640x132.png" alt="University of Cambridge Logo" class="brands-light-mode-logo">
+<img src="https://www.cam.ac.uk/sites/default/files/university-cambridge-logo-white-example-640x133.png" alt="University of Cambridge Logo Dark" class="brands-dark-mode-logo">
+</a>
+<a href="https://www.turing.ac.uk/" class="partner-logo" target="_blank" rel="noopener">
+<img src="/assets/images/brands/Turing_Logo_1000x400px_Black.webp" alt="The Alan Turing Institute Logo" class="brands-light-mode-logo">
+<img src="/assets/images/brands/Turing_Logo_1000x400px_White.webp" alt="The Alan Turing Institute Logo Dark" class="brands-dark-mode-logo">
+</a>
+</div>
+</div>
+
+</div>
+<div class="footer-bottom">
+<p>Turing is created by <a href="https://mlg.eng.cam.ac.uk/hong/" target="_blank" rel="noopener">Hong Ge</a>, and maintained by the core <a href="/team" target="_blank" rel="noopener">team</a> of developers and <a href="https://github.com/TuringLang/Turing.jl/graphs/contributors" target="_blank" rel="noopener">contributors</a>!<br>© 2025 The Turing Project Contributors. <a href="https://github.com/TuringLang/Turing.jl/blob/master/LICENCE" target="_blank" rel="noopener">MIT License</a>.</p>
+<a href="https://github.com/TuringLang/turinglang.github.io/" target="_blank" rel="noopener" class="footer-source-link"><i class="bi bi-github"></i> Website Source</a>
+</div>
+</div>
+</footer>
+
 execute:
 echo: true
 output: true
 freeze: auto
-include-in-header:
-- text: |
-<style>
-a {
-text-decoration: none;
-}
-a:hover {
-text-decoration: underline;
-}
-</style>
 
 # These variables can be used in any qmd files, e.g. for links:
 # the [Getting Started page]({{< meta get-started >}})
 # Note that you don't need to prepend `../../` to the link, Quarto will figure
 # it out automatically.
 
-get-started: tutorials/docs-00-getting-started
-tutorials-intro: tutorials/00-introduction
-gaussian-mixture-model: tutorials/01-gaussian-mixture-model
-logistic-regression: tutorials/02-logistic-regression
-bayesian-neural-network: tutorials/03-bayesian-neural-network
-hidden-markov-model: tutorials/04-hidden-markov-model
-linear-regression: tutorials/05-linear-regression
-infinite-mixture-model: tutorials/06-infinite-mixture-model
-poisson-regression: tutorials/07-poisson-regression
-multinomial-logistic-regression: tutorials/08-multinomial-logistic-regression
-variational-inference: tutorials/09-variational-inference
-bayesian-differential-equations: tutorials/10-bayesian-differential-equations
-probabilistic-pca: tutorials/11-probabilistic-pca
-gplvm: tutorials/12-gplvm
-seasonal-time-series: tutorials/13-seasonal-time-series
-using-turing-advanced: tutorials/docs-09-using-turing-advanced
-using-turing: tutorials/docs-12-using-turing-guide
+core-functionality: core-functionality
+get-started: getting-started
+
+tutorials-intro: tutorials/coin-flipping
+gaussian-mixture-model: tutorials/gaussian-mixture-models
+logistic-regression: tutorials/bayesian-logistic-regression
+bayesian-neural-network: tutorials/bayesian-neural-networks
+hidden-markov-model: tutorials/hidden-markov-models
+linear-regression: tutorials/bayesian-linear-regression
+infinite-mixture-model: tutorials/infinite-mixture-models
+poisson-regression: tutorials/bayesian-poisson-regression
+multinomial-logistic-regression: tutorials/multinomial-logistic-regression
+variational-inference: tutorials/variational-inference
+bayesian-differential-equations: tutorials/bayesian-differential-equations
+probabilistic-pca: tutorials/probabilistic-pca
+gplvm: tutorials/gaussian-process-latent-variable-models
+seasonal-time-series: tutorials/bayesian-time-series-analysis
 
 usage-automatic-differentiation: usage/automatic-differentiation
 usage-custom-distribution: usage/custom-distribution
@@ -204,7 +245,7 @@ dev-model-manual: developers/compiler/model-manual
 contexts: developers/compiler/minituring-contexts
 minituring: developers/compiler/minituring-compiler
 using-turing-compiler: developers/compiler/design-overview
-using-turing-variational-inference: developers/inference/variational-inference
+dev-variational-inference: developers/inference/variational-inference
 using-turing-implementing-samplers: developers/inference/implementing-samplers
 dev-transforms-distributions: developers/transforms/distributions
 dev-transforms-bijectors: developers/transforms/bijectors
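As the file's own comment notes, these keys are referenced from .qmd pages via Quarto's meta shortcode; after this rename a page would link to the renamed entries roughly as `[Turing's core functionality]({{< meta core-functionality >}})` or `[the variational inference developer docs]({{< meta dev-variational-inference >}})`.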

developers/compiler/minituring-contexts/index.qmd

Lines changed: 1 addition & 1 deletion
@@ -294,7 +294,7 @@ Of course, using an MCMC algorithm to sample from the prior is unnecessary and s
 The use of contexts also goes far beyond just evaluating log probabilities and sampling. Some examples from Turing are
 
 * `FixedContext`, which fixes some variables to given values and removes them completely from the evaluation of any log probabilities. They power the `Turing.fix` and `Turing.unfix` functions.
-* `ConditionContext` conditions the model on fixed values for some parameters. They are used by `Turing.condition` and `Turing.uncondition`, i.e. the `model | (parameter=value,)` syntax. The difference between `fix` and `condition` is whether the log probability for the corresponding variable is included in the overall log density.
+* `ConditionContext` conditions the model on fixed values for some parameters. They are used by `Turing.condition` and `Turing.decondition`, i.e. the `model | (parameter=value,)` syntax. The difference between `fix` and `condition` is whether the log probability for the corresponding variable is included in the overall log density.
 
 * `PriorExtractorContext` collects information about what the prior distribution of each variable is.
 * `PrefixContext` adds prefixes to variable names, allowing models to be used within other models without variable name collisions.
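To make the `fix`/`condition` distinction concrete, here is a minimal sketch using the functions named in the corrected bullet above (the toy model is illustrative, and the exact keyword forms may vary slightly across Turing versions):

using Turing

@model function demo()
    s ~ InverseGamma(2, 3)
    m ~ Normal(0, sqrt(s))
end

model = demo()

# Conditioning: s is treated as observed, so its log-probability still
# contributes to the overall log density.
conditioned = model | (s = 1.0,)              # sugar for Turing.condition(model; s = 1.0)
restored    = Turing.decondition(conditioned) # undo the conditioning

# Fixing: s is pinned to a value and removed from the log density entirely.
fixed   = Turing.fix(model; s = 1.0)
unfixed = Turing.unfix(fixed)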

developers/compiler/model-manual/index.qmd

Lines changed: 4 additions & 4 deletions
@@ -36,26 +36,26 @@ using DynamicPPL
 function gdemo2(model, varinfo, context, x)
     # Assume s² has an InverseGamma distribution.
     s², varinfo = DynamicPPL.tilde_assume!!(
-        context, InverseGamma(2, 3), Turing.@varname(s²), varinfo
+        context, InverseGamma(2, 3), @varname(s²), varinfo
     )
 
     # Assume m has a Normal distribution.
     m, varinfo = DynamicPPL.tilde_assume!!(
-        context, Normal(0, sqrt(s²)), Turing.@varname(m), varinfo
+        context, Normal(0, sqrt(s²)), @varname(m), varinfo
     )
 
     # Observe each value of x[i] according to a Normal distribution.
     for i in eachindex(x)
         _retval, varinfo = DynamicPPL.tilde_observe!!(
-            context, Normal(m, sqrt(s²)), x[i], Turing.@varname(x[i]), varinfo
+            context, Normal(m, sqrt(s²)), x[i], @varname(x[i]), varinfo
         )
     end
 
     # The final return statement should comprise both the original return
     # value and the updated varinfo.
     return nothing, varinfo
 end
-gdemo2(x) = Turing.Model(gdemo2, (; x))
+gdemo2(x) = DynamicPPL.Model(gdemo2, (; x))
 
 # Instantiate a Model object with our data variables.
 model2 = gdemo2([1.5, 2.0])
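Once constructed this way, `model2` can be used like any other model; a brief sketch (the sampler choice and number of draws are illustrative, not part of the diff):

using Turing

# Sample from the manually constructed model.
chain = sample(model2, NUTS(), 1_000; progress=false)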

developers/inference/implementing-samplers/index.qmd

Lines changed: 2 additions & 2 deletions
@@ -403,11 +403,11 @@ As we promised, all of this hassle of implementing our `MALA` sampler in a way t
 It also enables use with Turing.jl through the `externalsampler`, but we need to do one final thing first: we need to tell Turing.jl how to extract a vector of parameters from the "sample" returned in our implementation of `AbstractMCMC.step`. In our case, the "sample" is a `MALASample`, so we just need the following line:
 
 ```{julia}
-# Load Turing.jl.
 using Turing
+using DynamicPPL
 
 # Overload the `getparams` method for our "sample" type, which is just a vector.
-Turing.Inference.getparams(::Turing.Model, sample::MALASample) = sample.x
+Turing.Inference.getparams(::DynamicPPL.Model, sample::MALASample) = sample.x
 ```
 
 And with that, we're good to go!
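For completeness, a sketch of what this enables: passing the tutorial's sampler to Turing via `externalsampler`. The model below and the `MALA` constructor argument (a step size) are assumptions for illustration only.

using Turing

@model function gauss_demo()
    x ~ Normal(0, 1)
end

# `MALA` here refers to the sampler type implemented earlier in this tutorial;
# the step size passed to its constructor is purely illustrative.
mala = MALA(1e-2)

chain = sample(gauss_demo(), externalsampler(mala), 1_000)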

getting-started/index.qmd

Lines changed: 1 addition & 1 deletion
@@ -92,5 +92,5 @@ The underlying theory of Bayesian machine learning is not explained in detail in
 A thorough introduction to the field is [*Pattern Recognition and Machine Learning*](https://www.springer.com/us/book/9780387310732) (Bishop, 2006); an online version is available [here (PDF, 18.1 MB)](https://www.microsoft.com/en-us/research/uploads/prod/2006/01/Bishop-Pattern-Recognition-and-Machine-Learning-2006.pdf).
 :::
 
-The next page on [Turing's core functionality]({{<meta using-turing>}}) explains the basic features of the Turing language.
+The next page on [Turing's core functionality]({{<meta core-functionality>}}) explains the basic features of the Turing language.
 From there, you can either look at [worked examples of how different models are implemented in Turing]({{<meta tutorials-intro>}}), or [specific tips and tricks that can help you get the most out of Turing]({{<meta usage-performance-tips>}}).

tutorials/bayesian-differential-equations/index.qmd

Lines changed: 1 addition & 1 deletion
@@ -344,7 +344,7 @@ import Mooncake
 import SciMLSensitivity
 
 # Define the AD backend to use
-adtype = AutoMooncake(; config=nothing)
+adtype = AutoMooncake()
 
 # Sample a single chain with 1000 samples using Mooncake
 sample(model, NUTS(; adtype=adtype), 1000; progress=false)

tutorials/bayesian-neural-networks/index.qmd

Lines changed: 1 addition & 1 deletion
@@ -210,7 +210,7 @@ setprogress!(false)
 ```{julia}
 # Perform inference.
 n_iters = 2_000
-ch = sample(bayes_nn(reduce(hcat, xs), ts), NUTS(; adtype=AutoMooncake(; config=nothing)), n_iters);
+ch = sample(bayes_nn(reduce(hcat, xs), ts), NUTS(; adtype=AutoMooncake()), n_iters);
 ```
 
 Now we extract the parameter samples from the sampled chain as `θ` (this is of size `5000 x 20` where `5000` is the number of iterations and `20` is the number of parameters).
