
Conversation

@penelopeysm (Member) commented Oct 16, 2025

TuringLang/DynamicPPL.jl#1037 deletes DynamicPPL.Sampler and its interface. This PR makes the corresponding adjustments in Turing.

Once this PR and the DPPL counterpart are merged, all of the MCMC sampling behaviour will be self-contained in Turing: this means that we can easily refactor sampler code in Turing without being held back by interface code in DynamicPPL. This will finally pave the way for things like sampling with LogDensityFunction (#2555), and allow long-standing issues to be fixed, like the weird parameters to NUTS (#2678). It's been a few months coming, so I am very pleased with this!

Literally all of the actual code changes in this PR are perfunctory boilerplate: copy-pastes from DynamicPPL, changing x::Sampler{<:X} to x::X, and so on. However, there is one conceptual point that I think is worth explaining:

Special behaviour when sampling from Turing models with Turing samplers

As discussed in e.g. #2413, there are certain special behaviours that are enabled when somebody calls sample(model, NUTS(), 1000). For example (a concrete sketch follows this list):

  • check_model runs checks on the model,
  • initial_params is set to DynamicPPL.init_strategy(spl) (now Turing.Inference.init_strategy(spl)),
  • chain_type is set to MCMCChains.Chains, ...
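
To make this concrete, here is roughly what the special behaviour amounts to. This is a sketch only: the keyword names follow the behaviours listed above, and the expansion is illustrative rather than the literal implementation.

using Turing

@model function demo()
    x ~ Normal()
end

# This call...
chn = sample(demo(), NUTS(), 1000)

# ...behaves roughly as if one had written:
chn = sample(
    demo(), NUTS(), 1000;
    check_model=true,                                       # run DynamicPPL's model checks
    initial_params=Turing.Inference.init_strategy(NUTS()),  # sampler-specific init strategy
    chain_type=Turing.MCMCChains.Chains,                    # bundle results into Chains
)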

Previously, this was accomplished as follows:

  1. declaring NUTS <: InferenceAlgorithm,
  2. sample(rng, ::AbstractModel, ::InferenceAlgorithm, N) would then wrap NUTS in DynamicPPL.Sampler
  3. sample(rng, ::AbstractModel, ::DynamicPPL.Sampler, N) would then do all the special behaviour

This PR changes it such that the special behaviour is triggered in sample(rng, ::DynamicPPL.Model, ::AbstractSampler), in src/mcmc/abstractmcmc.jl. That is to say, NUTS is no longer 'a special class of sampler'. If you write your own sampler that knows how to handle Turing models (i.e. if you define step(rng, ::DynamicPPL.Model, ::MySampler)), then you will get all the special behaviour when you call sample(model, MySampler(), 1000).
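
For illustration, here is a minimal sketch of such a sampler. MySampler and its step internals are hypothetical; the only load-bearing part is that step is defined on DynamicPPL.Model, which is what opts it in to the special sample method.

using AbstractMCMC, DynamicPPL, Random

struct MySampler <: AbstractMCMC.AbstractSampler end

# First step: fill a fresh varinfo using the init strategy that sample() passes
# down via the initial_params keyword. (Toy kernel, not a useful MCMC sampler.)
function AbstractMCMC.step(
    rng::Random.AbstractRNG, model::DynamicPPL.Model, spl::MySampler;
    initial_params=DynamicPPL.InitFromPrior(), kwargs...,
)
    _, vi = DynamicPPL.init!!(rng, model, DynamicPPL.VarInfo(), initial_params)
    return vi, vi
end

# Subsequent steps: here we just redraw from the prior each time.
function AbstractMCMC.step(
    rng::Random.AbstractRNG, model::DynamicPPL.Model, spl::MySampler, vi; kwargs...
)
    _, vi = DynamicPPL.init!!(rng, model, vi, DynamicPPL.InitFromPrior())
    return vi, vi
end

# sample(model, MySampler(), 1000) now triggers check_model, init_strategy,
# chain_type, and the rest of the special behaviour described above.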

This approach is a lot simpler than what came before, but as pointed out in #2413 there are some concerns with method ambiguities. For example, if somebody defines

sample(::AbstractModel, ::MyNewSampler, N) = ...

then calling sample(dynamicppl_model, MyNewSampler(), 1000) will lead to a method ambiguity. However, it's been almost a year since then, and there are a number of reasons why I'm not really fussed about the ambiguities and why I no longer agree with the arguments in that thread (indeed, I don't even agree with my own comments from that thread).
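
For the record, this is the standard Julia ambiguity pattern where each method is more specific in a different argument. A self-contained toy illustration:

# Stand-ins for the real signatures:
#   f(::Integer, ::Any) plays the role of sample(::DynamicPPL.Model, ::AbstractSampler, N)
#   f(::Any, ::Integer) plays the role of sample(::AbstractModel, ::MyNewSampler, N)
f(::Integer, ::Any) = "specific in the first argument"
f(::Any, ::Integer) = "specific in the second argument"

f(1, 2)  # ERROR: MethodError: f(::Int64, ::Int64) is ambiguous

# The fix is to define the intersection explicitly:
f(::Integer, ::Integer) = "fully specific"
f(1, 2)  # works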

I've gated the rest behind a details tag to avoid overwhelming the reader. If you don't see a problem with the above, feel free to ignore it.

Method ambiguities
  1. As explained in #2413 (Remove (duplicate) samplers being defined explicitly in Turing.jl), the only meaningful case where somebody might write such a sample method is if their sampler is a 'meta-sampler', i.e. one that works with all models without knowing anything about the internal structure of said models. In practice this cannot be done, because AbstractModel has no interface: if you don't know anything about the model, you can't do anything meaningful with it. For example, in principle Gibbs shouldn't need to know anything about the model, because it just shuttles the model between component samplers; in practice, we have to specialise on the model type and include a ton of machinery like setparams_varinfo to make it work at all.

  2. Method ambiguities have a very straightforward solution, which is to declare an extra, more specific method. I think that is a perfectly fine price to pay if you are writing a (genuine) meta-sampler and want it to work with Turing models specifically. For example, this is the case with RepeatSampler: it takes just five extra lines of code to make it work (see the diff excerpt further down).

  3. Note that under the original proposal in #2413 (Remove (duplicate) samplers being defined explicitly in Turing.jl), the method ambiguities would be avoided, but anybody who wrote a sampler and wanted to hook into the special behaviour would have to write the more specific method anyway! So IMO this is no worse.

  4. I haven't exactly seen an explosion of activity in the Julia ecosystem around creating meta-samplers, so I'm quite unconvinced that this will cause any problems in practice.

Contributor

Turing.jl documentation for PR #2689 is available at:
https://TuringLang.github.io/Turing.jl/previews/PR2689/

@penelopeysm changed the title from "Remove Sampler, transfer initialstep, init_strategy, and other functions from DynamicPPL to Turing" to "Remove Sampler, remove InferenceAlgorithm, transfer initialstep, init_strategy, and other functions from DynamicPPL to Turing" on Oct 16, 2025

codecov bot commented Oct 17, 2025

Codecov Report

❌ Patch coverage is 92.38095% with 8 lines in your changes missing coverage. Please review.
✅ Project coverage is 85.15%. Comparing base (bbbde35) to head (aa49a80).

Files with missing lines     Patch %    Lines
src/mcmc/abstractmcmc.jl     90.90%     3 Missing ⚠️
src/mcmc/ess.jl              50.00%     2 Missing ⚠️
src/mcmc/gibbs.jl            84.61%     2 Missing ⚠️
src/mcmc/Inference.jl         0.00%     1 Missing ⚠️
Additional details and impacted files
@@                Coverage Diff                @@
##           py/dppl-0.38    #2689       +/-   ##
=================================================
+ Coverage         57.89%   85.15%   +27.26%     
=================================================
  Files                22       21        -1     
  Lines              1387     1415       +28     
=================================================
+ Hits                803     1205      +402     
+ Misses              584      210      -374     

Comment on lines +150 to +172
# TODO(penelopeysm): Remove initialstep and generalise MCMC sampling procedures
function initialstep end

function AbstractMCMC.step(
rng::Random.AbstractRNG,
model::DynamicPPL.Model,
spl::AbstractSampler;
initial_params,
kwargs...,
)
# Generate the default varinfo. Note that any parameters inside this varinfo
# will be immediately overwritten by the next call to `init!!`.
vi = default_varinfo(rng, model, spl)

# Fill it with initial parameters. Note that, if `InitFromParams` is used, the
# parameters provided must be in unlinked space (when inserted into the
# varinfo, they will be adjusted to match the linking status of the
# varinfo).
_, vi = DynamicPPL.init!!(rng, model, vi, initial_params)

# Call the actual function that does the first step.
return initialstep(rng, model, spl, vi; initial_params, kwargs...)
end
@penelopeysm (Member Author) commented Oct 20, 2025

This method of step is actually a little bit evil. It used to be less bad because it only applied to Sampler{<:InferenceAlgorithm}, but now it applies to all AbstractSampler, which actually does cause some method ambiguities (which I've pointed out in my other comments).

On top of that, this is just generally a bit inflexible when it comes to warmup steps, since it's only defined as a method for step and not step_warmup (a sketch of what a companion method might look like is below).

I think that in the next version of Turing this method should be removed. However, I've opted to preserve it for now because I don't want to make too many conceptual changes in this PR (the diff is already too large).
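
To spell that out (my reading of the AbstractMCMC machinery, so treat this as a sketch): a sampler that overrides step_warmup would bypass the varinfo setup above during warmup, unless a companion method like the following, which is hypothetical and not part of this PR, were also defined.

# Hypothetical companion method mirroring the step method above, so that
# warmup iterations also go through default_varinfo/init!!. Not part of this PR.
function AbstractMCMC.step_warmup(
    rng::Random.AbstractRNG,
    model::DynamicPPL.Model,
    spl::AbstractSampler;
    initial_params,
    kwargs...,
)
    vi = default_varinfo(rng, model, spl)
    _, vi = DynamicPPL.init!!(rng, model, vi, initial_params)
    return initialstep(rng, model, spl, vi; initial_params, kwargs...)
end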

Comment on lines +107 to +120
# Needed for method ambiguity resolution, even though this method is never going to be
# called in practice. This just shuts Aqua up.
# TODO(penelopeysm): Remove this when the default `step(rng, ::DynamicPPL.Model,
# ::AbstractSampler) method in `src/mcmc/abstractmcmc.jl` is removed.
function AbstractMCMC.step(
rng::AbstractRNG,
model::DynamicPPL.Model,
sampler::EllipticalSliceSampling.ESS;
kwargs...,
)
return error(
"This method is not implemented! If you want to use the ESS sampler in Turing.jl, please use `Turing.ESS()` instead. If you want the default behaviour in EllipticalSliceSampling.jl, wrap your model in a different subtype of `AbstractMCMC.AbstractModel`, and then implement the necessary EllipticalSliceSampling.jl methods on it.",
)
end

Comment on lines +44 to +51
# The following method needed for method ambiguity resolution.
# TODO(penelopeysm): Remove this method once the default `AbstractMCMC.step(rng,
# ::DynamicPPL.Model, ::AbstractSampler)` method in `src/mcmc/abstractmcmc.jl` is removed.
function AbstractMCMC.step(
rng::Random.AbstractRNG, model::DynamicPPL.Model, sampler::RepeatSampler; kwargs...
)
return AbstractMCMC.step(rng, model, sampler.sampler; kwargs...)
end
@penelopeysm (Member Author):

This is the second method ambiguity caused by step.

Member:

I'm confused as to why this is a method ambiguity. RepeatSampler is a subtype of AbstractSampler and should thus take precedence given the other positional arguments are the same. Kwargs don't do dispatch. I think I'm missing something.

@penelopeysm (Member Author):

Errr, sorry, I was very unclear. It's the method above this one, the one that used AbstractModel, that was the problem. This one is the solution.

Member:

My bad for not reading the PR properly. If you just need the two methods to disambiguate, and the plan is to remove the offending method soonish (a plan I like, initialstep is confusing), then I'm not too offended.

@penelopeysm requested a review from sunxd3 on October 20, 2025

@penelopeysm (Member Author):

@mhauru pinging you also for thoughts on the method ambiguity thing, if you have any

@sunxd3 (Member) left a comment:

I am not too bothered with the method ambiguity.

    sampler_wrapper::ExternalSampler;
    initial_state=nothing,
-   initial_params=DynamicPPL.init_strategy(sampler_wrapper.alg.sampler),
+   initial_params=DynamicPPL.init_strategy(sampler_wrapper.sampler),
Member:

Suggested change:
-   initial_params=DynamicPPL.init_strategy(sampler_wrapper.sampler),
+   initial_params=Turing.Inference.init_strategy(sampler_wrapper.sampler),

perhaps?

@penelopeysm (Member Author):

That's quite awkward; this implies that it wasn't tested lol

@penelopeysm (Member Author):

Oh, actually, it seems that the default value just wasn't needed, because sample() always passes initial_params through to step(), and sample() already has a default value. Removed now.

-   spl::DynamicPPL.Sampler{<:Gibbs};
-   initial_params::DynamicPPL.AbstractInitStrategy=DynamicPPL.init_strategy(spl),
+   spl::Gibbs;
+   initial_params=Turing.Inference.init_strategy(spl),
Member:

Slightly tricky: this would technically override the init_strategy behaviour for the component samplers, right? For instance, HMC, when used as a component, will default to sampling from the prior instead of from a uniform distribution?

@penelopeysm (Member Author) commented Oct 21, 2025

Yup, indeed. I think this was already the case before this PR, because initial_params would be generated as a single vector (using initialsampler(::Sampler{Gibbs}), which would be SampleFromPrior) and then just split up between the components. I'm actually not sure how to fix this, but I can open an issue to track it.

@penelopeysm (Member Author):

Most likely the solution would be to implement a custom InitStrategy for Gibbs, something like the sketch below.
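
A very rough sketch of the shape this could take. Everything here is hypothetical: the struct, its fields, and the assumed per-variable DynamicPPL.init hook are invented for illustration, and the real design would depend on DynamicPPL's init interface.

using DynamicPPL, Random

# Hypothetical: a Gibbs-specific init strategy carrying one strategy per block,
# so each component sampler keeps its own initialisation default.
struct GibbsInit <: DynamicPPL.AbstractInitStrategy
    # One entry per Gibbs block: the block's variables, paired with the init
    # strategy of the component sampler responsible for them.
    strategies::Vector{Pair{Vector{DynamicPPL.VarName},DynamicPPL.AbstractInitStrategy}}
end

# Delegate each variable to the strategy of the block that owns it. The
# init(rng, vn, dist, strategy) signature is assumed from DynamicPPL's
# init-strategy interface; treat this as pseudocode if that has drifted.
function DynamicPPL.init(
    rng::Random.AbstractRNG, vn::DynamicPPL.VarName, dist, strategy::GibbsInit
)
    for (vns, substrategy) in strategy.strategies
        any(isequal(vn), vns) && return DynamicPPL.init(rng, vn, dist, substrategy)
    end
    return error("no init strategy found for $vn")
end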


@penelopeysm requested a review from sunxd3 on October 21, 2025
@sunxd3 (Member) left a comment:

It looks great, thanks!
