# 0.42.0

## **AdvancedVI 0.6**

Turing.jl v0.42 updates `AdvancedVI.jl` compatibility to 0.6 (we skipped the breaking 0.5 update, as it does not introduce new features).
`AdvancedVI@0.6` introduces major structural changes, including breaking changes to the interface, as well as multiple new features.
The changes summarised below are those that affect end users of Turing.
For a more comprehensive list of changes, please refer to the [changelog](https://github.com/TuringLang/AdvancedVI.jl/blob/main/HISTORY.md) of `AdvancedVI`.
### Breaking Changes
A new level of interface for defining variational algorithms was introduced in `AdvancedVI` v0.5. As a result, the function `Turing.vi` now receives a keyword argument `algorithm`. The object `algorithm <: AdvancedVI.AbstractVariationalAlgorithm` should now contain all the algorithm-specific configuration. Accordingly, keyword arguments of `vi` that were algorithm-specific, such as `objective`, `operator`, and `averager`, have been moved to fields of the relevant `<: AdvancedVI.AbstractVariationalAlgorithm` structs.

For example,

[...]

Additionally,

- The default hyperparameters of `DoG` and `DoWG` have been altered.
- `estimate_objective` now returns the value to be minimized by the optimization algorithm. For example, for ELBO maximization algorithms, `estimate_objective` will return the *negative ELBO*. This is a breaking change from the previous behavior, where the ELBO itself was returned.
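
As a purely illustrative sketch of this restructuring (the pre-0.42 keyword names in the comment are schematic, and `FisherMinBatchMatch`, introduced below, is used only as an example algorithm):

```julia
# Pre-0.42 (schematic): algorithm-specific options were keyword arguments of `vi`
# vi(model, q, n_iters; objective=..., operator=..., averager=...)

# 0.42 onwards: all algorithm-specific configuration lives in the algorithm
# struct, which is passed through the single `algorithm` keyword
vi(model, q, n_iters; algorithm=FisherMinBatchMatch())
```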

### New Features

`AdvancedVI@0.6` adds numerous new features, including the following new VI algorithms:

- `KLMinWassFwdBwd`: Also known as "Wasserstein variational inference," this algorithm minimizes the KL divergence under the Wasserstein-2 metric.
- `KLMinNaturalGradDescent`: Also known as "online variational Newton," this is the canonical "black-box" natural gradient variational inference algorithm, which minimizes the KL divergence via mirror descent, using the KL divergence as the Bregman divergence.
- `KLMinSqrtNaturalGradDescent`: A recent variant of `KLMinNaturalGradDescent` that operates in the Cholesky-factor parameterization of Gaussians instead of precision matrices.
- `FisherMinBatchMatch`: Also known as "batch-and-match," this algorithm minimizes a variational formulation of the 2nd-order Fisher divergence via a proximal point-type algorithm.

Any of the new algorithms above can readily be used by simply swapping the `algorithm` keyword argument of `vi`.
For example, to use batch-and-match:

```julia
vi(model, q, n_iters; algorithm=FisherMinBatchMatch())
```

# 0.41.1

The `ModeResult` struct returned by `maximum_a_posteriori` and `maximum_likelihood` can now be wrapped in `InitFromParams()`.
This makes it easier to use the parameters in downstream code, e.g. when specifying initial parameters for MCMC sampling.
If you need to access the dictionary of parameters, it is stored in `opt_result.params`; note, however, that this field may change in future breaking releases, as Turing's optimisation interface is slated for an overhaul in the near future.
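
A minimal sketch of how this can be used (the model, the conditioning values, and the `initial_params` keyword usage below are illustrative assumptions, not an excerpt from Turing's documentation):

```julia
using Turing

@model function demo()
    x ~ Normal(0, 1)
    y ~ Normal(x, 1)
end

model = demo() | (; y = 1.0)
opt_result = maximum_a_posteriori(model)

# The mode estimate can now be wrapped directly to seed MCMC:
chain = sample(model, NUTS(), 100;
               initial_params=InitFromParams(opt_result))
```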
<a href="https://github.com/SciML/ColPrac"><img src="https://img.shields.io/badge/ColPrac-Contributor%27s%20Guide-blueviolet" alt="ColPrac: Contributor's Guide on Collaborative Practices for Community Packages" /></a>
</p>

## Get started

Install Julia (see [the official Julia website](https://julialang.org/install/); you will need at least Julia 1.10 for the latest version of Turing.jl).
Then, launch a Julia REPL and run:
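
(The installation snippet itself is elided in this diff; as a placeholder, the standard way to add a package from the Julia REPL is:)

```julia
julia> using Pkg

julia> Pkg.add("Turing")
```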

[...]

You can define models using the `@model` macro, and then perform Markov chain Monte Carlo sampling:

```julia
julia> using Turing

julia> @model function linear_regression(x)
           # Priors
           α ~ Normal(0, 1)
           β ~ Normal(0, 1)
           σ² ~ truncated(Cauchy(0, 3); lower=0)

           # Likelihood
           μ = α .+ β .* x
           y ~ MvNormal(μ, σ² * I)
       end

julia> x, y = rand(10), rand(10)

julia> posterior = linear_regression(x) | (; y = y)

julia> chain = sample(posterior, NUTS(), 1000)
```
You can find the main TuringLang documentation at [**https://turinglang.org**](https://turinglang.org), which contains general information about Turing.jl's features, as well as a variety of tutorials with examples of Turing.jl models.
API documentation for Turing.jl is specifically available at [**https://turinglang.org/Turing.jl/stable**](https://turinglang.org/Turing.jl/stable/).

## Contributing

### Issues

[...]

Breaking releases (minor version) should target the `breaking` branch.

If you have not received any feedback on an issue or PR for a while, please feel free to ping `@TuringLang/maintainers` in a comment.

## Other channels

The Turing.jl userbase tends to be most active on the [`#turing` channel of Julia Slack](https://julialang.slack.com/archives/CCYDC34A0).
If you do not have an invitation to Julia's Slack, you can get one from [the official Julia website](https://julialang.org/slack/).
There are also often threads on [Julia Discourse](https://discourse.julialang.org) (you can search using, e.g., [the `turing` tag](https://discourse.julialang.org/tag/turing)).

## What's changed recently?

We publish a fortnightly newsletter summarising recent updates in the TuringLang ecosystem, which you can view on [our website](https://turinglang.org/news/), [GitHub](https://github.com/TuringLang/Turing.jl/issues/2498), or [Julia Slack](https://julialang.slack.com/archives/CCYDC34A0).
For Turing.jl specifically, you can see a full changelog in [`HISTORY.md`](https://github.com/TuringLang/Turing.jl/blob/main/HISTORY.md) or [our GitHub releases](https://github.com/TuringLang/Turing.jl/releases).

## Where does Turing.jl sit in the TuringLang ecosystem?

Turing.jl is the main entry point for users, and seeks to provide a unified, convenient interface to all of the functionality in the TuringLang (and broader Julia) ecosystem.

[...]

</details>

You can see the full list of publications that have cited Turing.jl on [Google Scholar](https://scholar.google.com/scholar?cites=11803241473159708991).