This method currently requires re-evaluating the model:
Turing.jl/src/mcmc/Inference.jl, lines 345 to 361 at commit 3901096:
function getparams(model::DynamicPPL.Model, vi::DynamicPPL.VarInfo)
    # NOTE: In the past, `invlink(vi, model)` + `values_as(vi, OrderedDict)` was used.
    # Unfortunately, using `invlink` can cause issues in scenarios where the constraints
    # of the parameters change depending on the realizations. Hence we have to use
    # `values_as_in_model`, which re-runs the model and extracts the parameters
    # as they are seen in the model, i.e. in the constrained space. Moreover,
    # this means that the code below will work for both linked and invlinked `vi`.
    # Ref: https://github.com/TuringLang/Turing.jl/issues/2195
    # NOTE: We need to `deepcopy` here to avoid modifying the original `vi`.
    vals = DynamicPPL.values_as_in_model(model, true, deepcopy(vi))
    # Obtain an iterator over the flattened parameter names and values.
    iters = map(DynamicPPL.varname_and_value_leaves, keys(vals), values(vals))
    # Materialize the iterators and concatenate.
    return mapreduce(collect, vcat, iters)
end
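For context, here is a minimal sketch of the situation the comment above refers to, where the constraints depend on the realizations (cf. #2195): the support of `y` depends on the sampled value of `x`, so a single fixed invlink transformation is not valid across draws, which is why `values_as_in_model` re-runs the model. The model name below is made up for illustration.

using Turing

@model function dynamic_constraints()
    x ~ Exponential(1)
    # The upper bound of `y`'s support depends on the realization of `x`,
    # so the bijector for `y` changes from draw to draw.
    y ~ Uniform(0, x)
end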
Once TuringLang/DynamicPPL.jl#908 is merged and released, we should change this so that the accumulator introduced there is always used during model evaluation, and change this method to simply extract its contents.
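For illustration only, a rough sketch of what this method could reduce to once the accumulator is always populated during evaluation. The accessor name `get_values_accumulator` is a placeholder, not the actual DynamicPPL API, which will depend on how #908 lands.

function getparams(model::DynamicPPL.Model, vi::DynamicPPL.VarInfo)
    # Assumes `vi` already carries the values-as-in-model accumulator from the
    # most recent model evaluation, so no re-evaluation or deepcopy is needed here.
    vals = get_values_accumulator(vi)  # placeholder accessor, not a real DynamicPPL function
    # Flatten to (varname, value) leaves and concatenate, as before.
    iters = map(DynamicPPL.varname_and_value_leaves, keys(vals), values(vals))
    return mapreduce(collect, vcat, iters)
end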