Remove `Sampler`, remove `InferenceAlgorithm`, transfer `initialstep`, `init_strategy`, and other functions from DynamicPPL to Turing #2689
base: py/dppl-0.38
Conversation
Turing.jl documentation for PR #2689 is available at:
Force-pushed the branch from b6de6ae to 42bfc8a.
Codecov Report

❌ Patch coverage is

Additional details and impacted files:

```
@@            Coverage Diff             @@
##           py/dppl-0.38    #2689       +/-   ##
=================================================
+ Coverage        57.89%   84.34%    +26.45%
=================================================
  Files               22       21         -1
  Lines             1387     1412        +25
=================================================
+ Hits               803     1191       +388
+ Misses             584      221       -363
```
```julia
# Needed for method ambiguity resolution, even though this method is never going to be
# called in practice. This just shuts Aqua up.
# TODO(penelopeysm): Remove this when the default `step(rng, ::DynamicPPL.Model,
# ::AbstractSampler)` method in `src/mcmc/abstractmcmc.jl` is removed.
function AbstractMCMC.step(
    rng::AbstractRNG,
    model::DynamicPPL.Model,
    sampler::EllipticalSliceSampling.ESS;
    kwargs...,
)
    return error(
        "This method is not implemented! If you want to use the ESS sampler in Turing.jl, please use `Turing.ESS()` instead. If you want the default behaviour in EllipticalSliceSampling.jl, wrap your model in a different subtype of `AbstractMCMC.AbstractModel`, and then implement the necessary EllipticalSliceSampling.jl methods on it.",
    )
end
```
For full transparency, I will confess that I had to add two methods to remove method ambiguities caused by this PR. This is the first one. Both arise from the default definition of `AbstractMCMC.step` in `src/mcmc/abstractmcmc.jl` (Turing.jl, lines 141 to 163 at b5e0760):
```julia
# TODO(penelopeysm): Remove initialstep and generalise MCMC sampling procedures
function initialstep end

function AbstractMCMC.step(
    rng::Random.AbstractRNG,
    model::DynamicPPL.Model,
    spl::AbstractSampler;
    initial_params,
    kwargs...,
)
    # Generate the default varinfo. Note that any parameters inside this varinfo
    # will be immediately overwritten by the next call to `init!!`.
    vi = default_varinfo(rng, model, spl)
    # Fill it with initial parameters. Note that, if `InitFromParams` is used, the
    # parameters provided must be in unlinked space (when inserted into the
    # varinfo, they will be adjusted to match the linking status of the
    # varinfo).
    _, vi = DynamicPPL.init!!(rng, model, vi, initial_params)
    # Call the actual function that does the first step.
    return initialstep(rng, model, spl, vi; initial_params, kwargs...)
end
```
which I would like to remove (because it doesn't allow for warmup), but I don't want to do it in this PR / version.
```julia
# The following method is needed for method ambiguity resolution.
# TODO(penelopeysm): Remove this method once the default `AbstractMCMC.step(rng,
# ::DynamicPPL.Model, ::AbstractSampler)` method in `src/mcmc/abstractmcmc.jl` is removed.
function AbstractMCMC.step(
    rng::Random.AbstractRNG, model::DynamicPPL.Model, sampler::RepeatSampler; kwargs...
)
    return AbstractMCMC.step(rng, model, sampler.sampler; kwargs...)
end
```
This is the other one.
TuringLang/DynamicPPL.jl#1037 deletes `DynamicPPL.Sampler` and its interface; this PR makes the corresponding adjustments in Turing.

Once this PR and the DPPL counterpart are merged, all of the MCMC sampling behaviour will be self-contained in Turing. This means that we can easily refactor sampler code in Turing without being held back by interface code in DynamicPPL. It will finally pave the way for things like sampling with `LogDensityFunction` (#2555) and allow long-standing issues to be fixed, like the weird parameters to NUTS (#2678). It's been a few months coming, so I am very pleased with this!

Literally all of the actual code changes in this PR are perfunctory boilerplate, like copy-pastes from DynamicPPL and changing `x::Sampler{<:X}` to `x::X` (see the sketch below). However, there is one conceptual bit that I think is worth explaining.
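As a hypothetical illustration of that pattern (`MyAlg` and `get_stepsize` are made-up names, not actual Turing functions):

```julia
# Before: algorithms were wrapped in DynamicPPL.Sampler, so methods
# dispatched on the wrapper and reached the algorithm via its `alg` field...
get_stepsize(spl::DynamicPPL.Sampler{<:MyAlg}) = spl.alg.stepsize

# ...after: the wrapper is gone, so methods dispatch on the algorithm itself.
get_stepsize(alg::MyAlg) = alg.stepsize
```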
Special behaviour when sampling from Turing models with Turing samplers

As discussed in, e.g., #2413, there are certain special behaviours that are enabled when somebody calls `sample(model, NUTS(), 1000)`. For example:

- `check_model` runs checks on the model;
- `initial_params` is set to `DynamicPPL.init_strategy(spl)` (now `Turing.Inference.init_strategy(spl)`);
- `chain_type` is set to `MCMCChains.Chains`;
- ...
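To make this concrete, here is a rough sketch of how such defaults can be attached; this is not the actual code in this PR. `Turing.Inference.init_strategy` is the name introduced here, `DynamicPPL.check_model` and `AbstractMCMC.mcmcsample` are existing functions, and the keyword handling is illustrative only.

```julia
# Hedged sketch of a Turing-side `sample` overload for DynamicPPL models.
function AbstractMCMC.sample(
    rng::Random.AbstractRNG,
    model::DynamicPPL.Model,
    spl::AbstractMCMC.AbstractSampler,
    N::Integer;
    check_model::Bool=true,
    initial_params=Turing.Inference.init_strategy(spl),
    chain_type=MCMCChains.Chains,
    kwargs...,
)
    # Run DynamicPPL's model checks before doing any sampling work.
    check_model && DynamicPPL.check_model(model; error_on_failure=true)
    # Hand over to the generic AbstractMCMC machinery with the
    # Turing-specific defaults filled in.
    return AbstractMCMC.mcmcsample(
        rng, model, spl, N; initial_params, chain_type, kwargs...
    )
end
```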
Previously, the way this was accomplished was:

1. `NUTS <: InferenceAlgorithm`;
2. `sample(rng, ::AbstractModel, ::InferenceAlgorithm, N)` would then wrap NUTS in `DynamicPPL.Sampler`;
3. `sample(rng, ::AbstractModel, ::DynamicPPL.Sampler, N)` would then do all the special behaviour.

This PR changes it such that the special behaviour is triggered in `sample(rng, ::DynamicPPL.Model, ::AbstractSampler)`, in `src/mcmc/abstractmcmc.jl`. That is to say, NUTS is no longer 'a special class of sampler'. If you write your own sampler that knows how to handle Turing models (i.e. if you define `step(rng, ::DynamicPPL.Model, ::MySampler)`), then you will get all the special behaviour when you call `sample(model, MySampler(), 1000)`, as in the sketch below.
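Here is a minimal sketch of such a sampler. `MySampler` and its trivial draw-from-the-prior logic are hypothetical, and a real sampler would return a proper transition type rather than a raw `VarInfo`; the point is only that `step` is defined on `DynamicPPL.Model`.

```julia
using AbstractMCMC, DynamicPPL, Random

# Hypothetical sampler; each step below just re-draws from the prior.
struct MySampler <: AbstractMCMC.AbstractSampler end

# First step. Because this is defined on `DynamicPPL.Model`, calling
# `sample(model, MySampler(), 1000)` triggers the special behaviour.
function AbstractMCMC.step(
    rng::Random.AbstractRNG, model::DynamicPPL.Model, ::MySampler; kwargs...
)
    vi = DynamicPPL.VarInfo(rng, model)  # evaluate the model, sampling from the prior
    return vi, vi                        # (sample, state)
end

# Subsequent steps: draw a fresh set of parameters each time.
function AbstractMCMC.step(
    rng::Random.AbstractRNG, model::DynamicPPL.Model, ::MySampler, state; kwargs...
)
    vi = DynamicPPL.VarInfo(rng, model)
    return vi, vi
end
```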
This approach is a lot simpler than what came before, but as pointed out in #2413 there are some concerns with method ambiguities. For example, if somebody defines a generic `step` (or `sample`) method for their sampler on any `AbstractMCMC.AbstractModel`, then calling `sample(dynamicppl_model, MyNewSampler(), 1000)` will lead to method ambiguities; see the sketch below. However, it's been almost a year since then, and there are a number of reasons why I'm not really fussed about the ambiguities and why I no longer agree with the arguments in that thread (indeed, I don't agree with my own comments from that thread).

I've gated the rest behind a details tag to avoid overwhelming the reader. If you don't see a problem with the above, feel free to ignore it.
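For illustration, here is a hedged sketch of that clash and of the standard fix; `MyNewSampler` and `generic_step` are hypothetical.

```julia
using AbstractMCMC, DynamicPPL, Random

# Hypothetical meta-sampler that claims to work with *any* model.
struct MyNewSampler <: AbstractMCMC.AbstractSampler end

# Placeholder for whatever the sampler actually does.
generic_step(rng, model, spl; kwargs...) = error("sampling logic goes here")

# Generic method: applies to every AbstractModel. For a `DynamicPPL.Model`
# this is ambiguous with Turing's default
# `step(rng, ::DynamicPPL.Model, ::AbstractSampler)`, since neither
# method is more specific than the other.
function AbstractMCMC.step(
    rng::Random.AbstractRNG, model::AbstractMCMC.AbstractModel, spl::MyNewSampler; kwargs...
)
    return generic_step(rng, model, spl; kwargs...)
end

# The straightforward fix: one extra, strictly more specific method.
function AbstractMCMC.step(
    rng::Random.AbstractRNG, model::DynamicPPL.Model, spl::MyNewSampler; kwargs...
)
    return generic_step(rng, model, spl; kwargs...)
end
```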
Method ambiguities
- As explained in "Remove (duplicate) samplers being defined explicitly in Turing.jl" (#2413), the only meaningful case where somebody might write such a `sample` method is if their sampler is a 'meta-sampler', i.e. one that works with all models without knowing anything about the internal structure of those models. In practice this cannot be done, because `AbstractModel` has no interface: if you don't know anything about the model, you can't do anything meaningful with it. Look at Gibbs, for example: in principle it shouldn't need to know anything about the model, because it just shuttles it between component samplers. In practice you have to specialise on the model type and include a ton of stuff like `setparams_varinfo` just to make it work at all.

- Method ambiguities have a very straightforward solution, which is declaring an extra, more specific method, as sketched above. I think that is a perfectly fine price to pay if you are writing a (genuine) meta-sampler and want it to work with Turing models specifically. For example, this is the case with `RepeatSampler`: it's just five extra lines of code to make it work.

- Note that in the original proposal in #2413 the method ambiguities would be avoided, but if somebody wrote a sampler and wanted to hook into the special behaviour, they would have to write the more specific method anyway! So IMO this is no worse.
- I haven't exactly seen an explosion of activity in the Julia ecosystem around creating meta-samplers, so I'm quite unconvinced that this will cause any problems in practice.
- I think the way the code is now written (i.e. in this PR) is far closer to what is intended. Special behaviour is tied to the fact that you are sampling from a `DynamicPPL.Model`, which brings with it a rich structure that lets you check its validity, group its outputs into chains, and so on. Previously, special behaviour was tied to the fact that 'it was a sampler defined in Turing.jl', which was not a very meaningful criterion for the association.