Merged
Changes from 17 commits
Commits
58 commits
1d75f60
Bump minor version
penelopeysm Dec 5, 2025
97059d3
Merge remote-tracking branch 'origin/main' into breaking
penelopeysm Jan 7, 2026
63d8281
Merge branch 'main' into breaking
penelopeysm Jan 8, 2026
0477b65
move stats.jl to separate file
penelopeysm Jan 8, 2026
46b5373
[skip ci] Merge remote-tracking branch 'origin/main' into breaking
penelopeysm Jan 9, 2026
f7d878b
Optimisation interface rework (#2747)
penelopeysm Jan 26, 2026
0059311
Export get_vector_params since it's useful
penelopeysm Jan 26, 2026
4b3a969
Document
penelopeysm Jan 26, 2026
f6fcd2d
Merge branch 'main' into breaking
penelopeysm Jan 26, 2026
8660a19
Merge branch 'main' into breaking
penelopeysm Jan 26, 2026
19730fe
Seed a test properly
penelopeysm Jan 26, 2026
2eb47a0
Merge branch 'main' into breaking
penelopeysm Jan 28, 2026
22bcc2a
Run CI against new DPPL / VNT (#2756)
penelopeysm Feb 7, 2026
c2208e4
Merge branch 'main' into breaking
penelopeysm Feb 7, 2026
0aad2b9
Merge remote-tracking branch 'origin/main' into breaking
penelopeysm Feb 7, 2026
fd3ab94
Changelog
penelopeysm Feb 10, 2026
68ccbf3
fixes for densify
penelopeysm Feb 10, 2026
f9ee09f
fix gibbs tests
penelopeysm Feb 12, 2026
d79e8b7
Silence Aqua on `Libtask.might_produce`
penelopeysm Feb 23, 2026
992efff
Remove size from LinkedVectorValue
penelopeysm Feb 23, 2026
ee4c4f2
Fix typo
penelopeysm Feb 23, 2026
8442ce2
Add template to tilde_observe!!
penelopeysm Feb 23, 2026
c162b0a
Add isnan check to MH initialisation
penelopeysm Feb 23, 2026
4ea8d8d
Add template to accumulate_observe!!
penelopeysm Feb 23, 2026
ad19711
Tell users about the proposals that are being used in MH (#2774)
penelopeysm Feb 24, 2026
7e1f9bf
Add Logging test dep
penelopeysm Feb 24, 2026
66e28d9
Fix test import
penelopeysm Feb 24, 2026
948f09c
More imports...
penelopeysm Feb 24, 2026
dde2fc1
DPPL is released
penelopeysm Feb 24, 2026
4c3d67c
Fix tests
penelopeysm Feb 25, 2026
92b9c63
fix Enzyme project env
penelopeysm Feb 25, 2026
6001b23
fix LinkedRW printing
penelopeysm Feb 28, 2026
0cdbfe5
Implement checks for discrete variables with HMC
penelopeysm Mar 2, 2026
d743b4b
Merge branch 'main' into breaking
penelopeysm Mar 2, 2026
ac4c9df
Fix tests
penelopeysm Mar 2, 2026
9fdd23f
Bump min DPPL
penelopeysm Mar 2, 2026
3a834ea
Improve changelog
penelopeysm Mar 2, 2026
772330a
Rename get_vector_params -> vector_names_and_params
penelopeysm Mar 2, 2026
f87f1fa
Rewrite optimisation docs
penelopeysm Mar 2, 2026
b8f7a2e
use VectorBijectors in optimisation
penelopeysm Mar 2, 2026
c5b1f72
use get_priors in ESS
penelopeysm Mar 2, 2026
6f4a17c
point to DPPL branch
penelopeysm Mar 3, 2026
ff574e9
Squish more bugs
penelopeysm Mar 3, 2026
2c24742
fix more things
penelopeysm Mar 3, 2026
de8655f
fix tests
penelopeysm Mar 3, 2026
78b5aa3
Add a test for the LKJCholesky optimisation
penelopeysm Mar 3, 2026
f391102
some pMCMC docstrings
penelopeysm Mar 3, 2026
51f5558
Clarify abstract type parameter
penelopeysm Mar 3, 2026
8b8bc8d
Merge remote-tracking branch 'origin/main' into breaking
penelopeysm Mar 3, 2026
1845c9a
tidy
penelopeysm Mar 3, 2026
c10ae92
fix vec
penelopeysm Mar 3, 2026
8006bb7
Remove DistributionsAD
penelopeysm Mar 3, 2026
e0aae9f
fix import
penelopeysm Mar 3, 2026
8b4170f
Fix imports
penelopeysm Mar 4, 2026
01d5a4a
changelog
penelopeysm Mar 4, 2026
8e0d448
Fix CI hang due to depwarn
penelopeysm Mar 4, 2026
1369bf5
increase atol on PG tests
penelopeysm Mar 4, 2026
427fab9
Update patch notes
penelopeysm Mar 4, 2026
91 changes: 91 additions & 0 deletions HISTORY.md
@@ -1,3 +1,94 @@
# 0.43.0

## DynamicPPL 0.40 and `VarNamedTuple`

DynamicPPL v0.40 is a major overhaul of the internal data structure.
Most notably, cases where we might once have used `Dict{VarName}` or `NamedTuple` have all been replaced with a single data structure, called `VarNamedTuple`.

This provides substantial benefits in terms of robustness and performance.

However, it does place some constraints on Turing models.
Specifically:

- Arrays whose elements are random variables should have a constant size and type. For example, the following will no longer work:

```julia
x = Float64[]
for i in 1:10
    push!(x, 0.0)
    x[i] ~ Normal()
end
```

This only applies to arrays on the left-hand side of tilde-statements. In general, there are no restrictions on the code that you can use outside of tilde-statements.

- Likewise, arrays of random variables should ideally have a constant size from iteration to iteration. That means a model like this will fail sometimes (*but* see below):
> **Member:** Would it be worth pointing out either here or above at the beginning of these bullet points that this only applies when using indexing, and doing the multivariate distribution version of the below is entirely fine?
>
> **Member Author:** Tweaked wording

```julia
n ~ Poisson(2.0)
x = Vector{Float64}(undef, n)
for i in 1:n
    x[i] ~ Normal()
end
```

*Technically speaking*: Inference (e.g. MCMC sampling) on this model will still work, but if you want to use `returned` or `predict`, both of the following conditions must hold true: (1) you must use FlexiChains.jl; (2) all elements of `x` must be random variables, i.e., you cannot have a mixture of `x[i]`'s being random variables and observed.
- TODO
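
Two sketches of patterns that stay within these constraints — preallocating so the array's size and element type never change, or replacing per-index tilde-statements with a single multivariate one (as noted in the review discussion above). The fixed length of 10 and the model names are illustrative:

```julia
using Turing

# Sketch 1: preallocate, so `x` has a constant size and element type.
@model function fixed_size()
    x = Vector{Float64}(undef, 10)
    for i in 1:10
        x[i] ~ Normal()
    end
end

# Sketch 2: a single multivariate tilde-statement avoids per-index assignment.
@model function multivariate()
    x ~ product_distribution(fill(Normal(), 10))
end
```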

## Optimisation interface

Turing.jl's optimisation interface has been completely overhauled in this release.
The aim is to give users a more consistent and principled way of specifying initial parameters and constraints.

The crux of the issue is that Optimization.jl expects vectorised inputs, whereas Turing models are more high-level: they have named variables which may be scalars, vectors, or in general anything.
Prior to this version, Turing's interface required the user to provide the vectorised inputs 'raw', which is both unintuitive and error-prone (especially when considering that optimisation may run in linked or unlinked space).

Going forward, initial parameters for optimisation are specified using `AbstractInitStrategy`.
If specific parameters are provided (via `InitFromParams`), these must be in model space (i.e. untransformed).
This directly mimics the interface for MCMC sampling that has been in place since v0.41.

Furthermore, lower and upper bounds (if desired) must be specified as either a `NamedTuple` or an `AbstractDict{<:VarName}`.
Bounds are always provided in model space; Turing will handle the transformation of these bounds to linked space if necessary.
Constraints are respected when creating initial parameters for optimisation: if the `AbstractInitStrategy` provided is incompatible with the constraints (for example `InitFromParams((; x = 2.0))` but `x` is constrained to be between `[0, 1]`), an error will be raised.

Here is a (very simplified) example of the new interface:

```julia
using Turing
@model f() = x ~ Beta(2, 2)
maximum_a_posteriori(
    f();
    # All of the following are in unlinked space.
    initial_params=InitFromParams((; x=0.3)),
    lb=(; x=0.1),
    ub=(; x=0.4),
)
```

For more information, please see the docstring of `estimate_mode`.

Note that in some cases, the translation of bounds to linked space may not be well-defined.
This is especially true for distributions where the samples have elements that are not independent (for example, Dirichlet, or LKJCholesky).
In these cases, Turing will raise an error if bounds are provided.
Users who wish to perform optimisation with such constraints should directly use `LogDensityFunction` and Optimization.jl.
Documentation on this matter will be forthcoming.
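
Until then, the manual route looks roughly like the following sketch. It is hedged: the `LogDensityFunction` constructor arguments, and whether the density is evaluated in linked space, depend on the DynamicPPL version, so treat this as an outline rather than a recipe.

```julia
using Turing, Optimization, OptimizationOptimJL
using DynamicPPL, LogDensityProblems

@model g() = x ~ Beta(2, 2)

# Wrap the model as a log density over a flat parameter vector.
ldf = DynamicPPL.LogDensityFunction(g())
dim = LogDensityProblems.dimension(ldf)

# Hand the negated log density to Optimization.jl; any vectorised bounds
# or constraints are then yours to encode directly on `u`.
negld(u, _) = -LogDensityProblems.logdensity(ldf, u)
f = OptimizationFunction(negld, Optimization.AutoForwardDiff())
prob = OptimizationProblem(f, fill(0.5, dim))
sol = solve(prob, NelderMead())
```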

## `IS` sampler

The `IS` sampler has been removed (its behaviour was in fact exactly the same as `Prior`).
To see an example of importance sampling (via `Prior()` and then subsequent reweighting), see e.g. [this issue](https://github.com/TuringLang/Turing.jl/issues/2767).
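
The reweighting itself is standard self-normalised importance sampling. The sketch below does it by hand with plain Distributions.jl draws (bypassing the model macro for brevity); the linked issue shows the equivalent workflow starting from a `Prior()` chain.

```julia
using Distributions

# Model: x ~ Normal(0, 1), y ~ Normal(x, 1); we observe y = 1.5.
# Draw from the prior, weight each draw by its likelihood, self-normalise.
n = 10_000
xs = rand(Normal(), n)                   # prior draws of x
logw = logpdf.(Normal.(xs, 1.0), 1.5)    # log-likelihood of each draw
w = exp.(logw .- maximum(logw))
w ./= sum(w)
posterior_mean = sum(w .* xs)            # analytically 0.75 in this conjugate example
```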

## `MH` sampler

The interface of the `MH` sampler has changed slightly.
It no longer accepts AdvancedMH.jl proposals, but is more flexible in other respects: you can specify proposals for individual `VarName`s (not just top-level symbols), and any unspecified `VarName`s are drawn from the prior instead of being silently ignored.
It is also around 30% faster than before on simple models.

## `GibbsConditional`

When defining a conditional posterior, the function is now passed a `VarNamedTuple` of values rather than a `Dict`.
Indexing into a `VarNamedTuple` works much like indexing into a `Dict`, but is more flexible: for example, you can write `x[1:2]` even if `x[1]` and `x[2]` are separate variables in the model.
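
A hypothetical sketch of what this looks like in practice — the model, the variable names, and the `Gibbs`/`GibbsConditional` wiring below are illustrative assumptions, not the exact API:

```julia
using Turing

# The conditional now receives a VarNamedTuple of the other variables'
# current values; lookup by VarName mirrors Dict-style indexing.
function cond_m(vnt)
    x = vnt[@varname(x)]    # current value of x (assumed to be a vector here)
    return Normal(sum(x) / length(x), 1.0)
end

# Illustrative only — see the GibbsConditional docstring for exact usage:
# sample(model, Gibbs(@varname(m) => GibbsConditional(cond_m)), 1000)
```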

# 0.42.8

Add support for `TensorBoardLogger.jl` via `AbstractMCMC.mcmc_callback`.
15 changes: 9 additions & 6 deletions Project.toml
@@ -1,6 +1,6 @@
name = "Turing"
uuid = "fce5fe82-541a-59a6-adf8-730c64b5f9a0"
version = "0.42.8"
version = "0.43.0"

[deps]
ADTypes = "47edcb42-4c32-4615-8424-f2b9edc5f35b"
@@ -15,6 +15,7 @@ BangBang = "198e06fe-97b7-11e9-32a5-e1d131e6ad66"
Bijectors = "76274a88-744f-5084-9051-94815aaf08c4"
Compat = "34da2185-b29b-5c13-b0c7-acf172513d20"
DataStructures = "864edb3b-99cc-5e75-8d2d-829cb0a9cfe8"
DifferentiationInterface = "a0c0ee7d-e4b9-4e03-894e-1c5f64a51d63"
Distributions = "31c24e10-a181-5473-b8eb-7969acd0382f"
DistributionsAD = "ced4e74d-a319-5a8a-b0ac-84af2272839c"
DocStringExtensions = "ffbed154-4ef7-542d-bbb7-c09d3a79fcae"
@@ -25,7 +26,6 @@ Libtask = "6f1fad26-d15e-5dc8-ae53-837a1d7b8c9f"
LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
LogDensityProblems = "6fdf6af0-433a-55f7-b3ed-c6c6e0b8df7c"
MCMCChains = "c7f686f2-ff18-58e9-bc7b-31028e88f75d"
NamedArrays = "86f7a689-2022-50b4-a561-43c23ac3c673"
Optimization = "7f7a1694-90dd-40f0-9382-eb1efda571ba"
OptimizationOptimJL = "36348300-93cb-4f02-beb5-3c3902f8871e"
OrderedCollections = "bac558e1-5e72-5ebc-8fee-abe8a469f55d"
@@ -42,13 +42,16 @@ StatsFuns = "4c63d2b9-4356-54db-8cca-17b64c39e42c"
[weakdeps]
DynamicHMC = "bbc10e6e-7c05-544b-b16e-64fede858acb"

[sources]
DynamicPPL = {rev = "breaking", url = "https://github.com/TuringLang/DynamicPPL.jl"}

[extensions]
TuringDynamicHMCExt = "DynamicHMC"

[compat]
ADTypes = "1.9"
AbstractMCMC = "5.13"
AbstractPPL = "0.11, 0.12, 0.13"
AbstractPPL = "0.14"
Accessors = "0.1"
AdvancedHMC = "0.8.3"
AdvancedMH = "0.8.9"
@@ -58,20 +61,20 @@ BangBang = "0.4.2"
Bijectors = "0.14, 0.15"
Compat = "4.15.0"
DataStructures = "0.18, 0.19"
DifferentiationInterface = "0.7"
Distributions = "0.25.77"
DistributionsAD = "0.6"
DocStringExtensions = "0.8, 0.9"
DynamicHMC = "3.4"
DynamicPPL = "0.39.1"
DynamicPPL = "0.40"
EllipticalSliceSampling = "0.5, 1, 2"
ForwardDiff = "0.10.3, 1"
Libtask = "0.9.5"
LinearAlgebra = "1"
LogDensityProblems = "2"
MCMCChains = "5, 6, 7"
NamedArrays = "0.9, 0.10"
Optimization = "3, 4, 5"
OptimizationOptimJL = "0.1, 0.2, 0.3, 0.4"
OptimizationOptimJL = "0.1 - 0.4"
OrderedCollections = "1"
Printf = "1"
Random = "1"
14 changes: 7 additions & 7 deletions docs/src/api.md
@@ -73,7 +73,6 @@ even though [`Prior()`](@ref) is actually defined in the `Turing.Inference` module
| `PolynomialStepsize` | [`Turing.Inference.PolynomialStepsize`](@ref) | Returns a function which generates polynomially decaying step sizes |
| `HMCDA` | [`Turing.Inference.HMCDA`](@ref) | Hamiltonian Monte Carlo with dual averaging |
| `NUTS` | [`Turing.Inference.NUTS`](@ref) | No-U-Turn Sampler |
| `IS` | [`Turing.Inference.IS`](@ref) | Importance sampling |
| `SMC` | [`Turing.Inference.SMC`](@ref) | Sequential Monte Carlo |
| `PG` | [`Turing.Inference.PG`](@ref) | Particle Gibbs |
| `CSMC` | [`Turing.Inference.CSMC`](@ref) | The same as PG |
@@ -169,9 +168,10 @@ LogPoisson

See the [mode estimation tutorial](https://turinglang.org/docs/tutorials/docs-17-mode-estimation/) for more information.

| Exported symbol | Documentation | Description |
|:---------------------- |:-------------------------------------------------- |:-------------------------------------------- |
| `maximum_a_posteriori` | [`Turing.Optimisation.maximum_a_posteriori`](@ref) | Find a MAP estimate for a model |
| `maximum_likelihood` | [`Turing.Optimisation.maximum_likelihood`](@ref) | Find a MLE estimate for a model |
| `MAP` | [`Turing.Optimisation.MAP`](@ref) | Type to use with Optim.jl for MAP estimation |
| `MLE` | [`Turing.Optimisation.MLE`](@ref) | Type to use with Optim.jl for MLE estimation |
| Exported symbol | Documentation | Description |
|:---------------------- |:-------------------------------------------------- |:--------------------------------------------- |
| `maximum_a_posteriori` | [`Turing.Optimisation.maximum_a_posteriori`](@ref) | Find a MAP estimate for a model |
| `maximum_likelihood` | [`Turing.Optimisation.maximum_likelihood`](@ref) | Find a MLE estimate for a model |
| `MAP` | [`Turing.Optimisation.MAP`](@ref) | Type to use with Optim.jl for MAP estimation |
| `MLE` | [`Turing.Optimisation.MLE`](@ref) | Type to use with Optim.jl for MLE estimation |
| `get_vector_params` | [`Turing.Optimisation.get_vector_params`](@ref) | Extract parameter names and values as vectors |
17 changes: 8 additions & 9 deletions ext/TuringDynamicHMCExt.jl
@@ -43,20 +43,19 @@ struct DynamicNUTSState{L,C,M,S}
stepsize::S
end

function Turing.Inference.initialstep(
function AbstractMCMC.step(
    rng::Random.AbstractRNG,
    model::DynamicPPL.Model,
    spl::DynamicNUTS,
    vi::DynamicPPL.AbstractVarInfo;
    spl::DynamicNUTS;
    initial_params,
    kwargs...,
)
    # Ensure that initial sample is in unconstrained space.
    if !DynamicPPL.is_transformed(vi)
        vi = DynamicPPL.link!!(vi, model)
        vi = last(DynamicPPL.evaluate!!(model, vi))
    end

    # Define log-density function.
    # TODO(penelopeysm) We need to check that the initial parameters are valid. Same as how
    # we do it for HMC
    _, vi = DynamicPPL.init!!(
        rng, model, DynamicPPL.VarInfo(), initial_params, DynamicPPL.LinkAll()
    )
    ℓ = DynamicPPL.LogDensityFunction(
        model, DynamicPPL.getlogjoint_internal, vi; adtype=spl.adtype
    )
13 changes: 8 additions & 5 deletions src/Turing.jl
@@ -11,14 +11,10 @@ using AdvancedVI: AdvancedVI
using DynamicPPL: DynamicPPL
import DynamicPPL: NoDist, NamedDist
using LogDensityProblems: LogDensityProblems
using NamedArrays: NamedArrays
using Accessors: Accessors
using StatsAPI: StatsAPI
using StatsBase: StatsBase
using AbstractMCMC

using Accessors: Accessors

using Printf: Printf
using Random: Random
using LinearAlgebra: I
@@ -45,6 +41,7 @@ end
# Random probability measures.
include("stdlib/distributions.jl")
include("stdlib/RandomMeasures.jl")
include("init_strategy.jl")
include("mcmc/Inference.jl") # inference algorithms
using .Inference
include("variational/Variational.jl")
@@ -73,6 +70,8 @@ using DynamicPPL:
conditioned,
to_submodel,
LogDensityFunction,
VarNamedTuple,
@vnt,
@addlogprob!,
InitFromPrior,
InitFromUniform,
@@ -102,6 +101,7 @@ export
# Samplers - Turing.Inference
Prior,
MH,
LinkedRW,
Emcee,
ESS,
Gibbs,
@@ -112,7 +112,6 @@
PolynomialStepsize,
HMCDA,
NUTS,
IS,
SMC,
PG,
CSMC,
@@ -166,12 +165,16 @@ export
InitFromPrior,
InitFromUniform,
InitFromParams,
# VNT,
VarNamedTuple,
@vnt,
# Point estimates - Turing.Optimisation
# The MAP and MLE exports are only needed for the Optim.jl interface.
maximum_a_posteriori,
maximum_likelihood,
MAP,
MLE,
get_vector_params,
# Chain save/resume
loadstate,
# kwargs in SMC
42 changes: 42 additions & 0 deletions src/init_strategy.jl
@@ -0,0 +1,42 @@
using AbstractPPL: VarName
using DynamicPPL: DynamicPPL

# These functions are shared by both MCMC and optimisation, so they have to exist outside of both.

"""
_convert_initial_params(initial_params)

Convert `initial_params` to a `DynamicPPL.AbstractInitStrategy` if it is not already one, or
throw a useful error message.
"""
_convert_initial_params(initial_params::DynamicPPL.AbstractInitStrategy) = initial_params
function _convert_initial_params(nt::NamedTuple)
@info "Using a NamedTuple for `initial_params` will be deprecated in a future release. Please use `InitFromParams(namedtuple)` instead."
return DynamicPPL.InitFromParams(nt)
end
function _convert_initial_params(d::AbstractDict{<:VarName})
@info "Using a Dict for `initial_params` will be deprecated in a future release. Please use `InitFromParams(dict)` instead."
return DynamicPPL.InitFromParams(d)
end
function _convert_initial_params(::AbstractVector{<:Real})
errmsg = "`initial_params` must be a `DynamicPPL.AbstractInitStrategy`. Using a vector of parameters for `initial_params` is no longer supported. Please see https://turinglang.org/docs/usage/sampling-options/#specifying-initial-parameters for details on how to update your code."
throw(ArgumentError(errmsg))
end
function _convert_initial_params(@nospecialize(_::Any))
errmsg = "`initial_params` must be a `DynamicPPL.AbstractInitStrategy`."
throw(ArgumentError(errmsg))
end

# TODO: Implement additional checks for certain samplers, e.g.
# HMC not supporting discrete parameters.
function _check_model(model::DynamicPPL.Model)
return DynamicPPL.check_model(model; error_on_failure=true)
end
function _check_model(model::DynamicPPL.Model, ::AbstractMCMC.AbstractSampler)
return _check_model(model)
end

# Similar to InitFromParams, this is just for convenience
_to_varnamedtuple(nt::NamedTuple) = DynamicPPL.VarNamedTuple(nt)
_to_varnamedtuple(d::AbstractDict{<:VarName}) = DynamicPPL.VarNamedTuple(pairs(d))
_to_varnamedtuple(vnt::DynamicPPL.VarNamedTuple) = vnt