DynamicPPL 0.36 #2535


Draft: wants to merge 5 commits into base `main`
20 changes: 20 additions & 0 deletions HISTORY.md
@@ -1,3 +1,23 @@
# Release 0.38.0

DynamicPPL compatibility has been bumped to 0.36.
This brings with it a number of changes: the ones most likely to affect you are submodel prefixing and conditioning.
Variables in submodels are now represented correctly with field accessors.
For example:

```julia
using Turing
@model inner() = x ~ Normal()
@model outer() = a ~ to_submodel(inner())
```

`keys(VarInfo(outer()))` now returns `[@varname(a.x)]` instead of `[@varname(var"a.x")]`.

Furthermore, you can now condition either the outer model, as in `outer() | (@varname(a.x) => 1.0)`, or the inner model, as in `inner() | (@varname(x) => 1.0)`.
If you use the conditioned inner model as a submodel, the conditioning will still apply correctly.
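As a hedged sketch of the second pattern (the name `outer2` and the literal value `1.0` are illustrative, not from this PR), conditioning the inner model before wrapping it as a submodel:

```julia
using Turing

@model inner() = x ~ Normal()

# Condition the inner model on `x`, then use it as a submodel.
# The conditioning on `x` still applies inside the outer model.
inner_conditioned = inner() | (@varname(x) => 1.0)

@model outer2() = a ~ to_submodel(inner_conditioned)
```

With `x` conditioned, `a.x` is treated as observed rather than sampled when `outer2` is evaluated.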

Please see [the DynamicPPL release notes](https://github.com/TuringLang/DynamicPPL.jl/releases/tag/v0.36.0) for full details.

# Release 0.37.1

`maximum_a_posteriori` and `maximum_likelihood` now perform sanity checks on the model before running the optimisation.
4 changes: 2 additions & 2 deletions Project.toml
@@ -1,6 +1,6 @@
name = "Turing"
uuid = "fce5fe82-541a-59a6-adf8-730c64b5f9a0"
version = "0.37.1"
version = "0.38.0"

[deps]
ADTypes = "47edcb42-4c32-4615-8424-f2b9edc5f35b"
@@ -62,7 +62,7 @@
Distributions = "0.25.77"
DistributionsAD = "0.6"
DocStringExtensions = "0.8, 0.9"
DynamicHMC = "3.4"
DynamicPPL = "0.35"
DynamicPPL = "0.36"
EllipticalSliceSampling = "0.5, 1, 2"
ForwardDiff = "0.10.3"
Libtask = "0.8.8"
2 changes: 1 addition & 1 deletion docs/src/api.md
@@ -40,7 +40,7 @@
even though [`Prior()`](@ref) is actually defined in the `Turing.Inference` module
| `@model` | [`DynamicPPL.@model`](@extref) | Define a probabilistic model |
| `@varname` | [`AbstractPPL.@varname`](@extref) | Generate a `VarName` from a Julia expression |
| `to_submodel` | [`DynamicPPL.to_submodel`](@extref) | Define a submodel |
| `prefix` | [`DynamicPPL.prefix`](@extref) | Prefix all variable names in a model with a given symbol |
| `prefix` | [`DynamicPPL.prefix`](@extref) | Prefix all variable names in a model with a given VarName |
| `LogDensityFunction` | [`DynamicPPL.LogDensityFunction`](@extref) | A struct containing all information about how to evaluate a model. Mostly for advanced users |
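To illustrate the updated `prefix` entry, which now takes a `VarName` rather than a `Symbol`, a minimal sketch (the model name `demo` is illustrative):

```julia
using Turing

@model demo() = x ~ Normal()

# Prefix every variable in the model with the VarName `a`;
# the variable `x` is then addressed as `a.x`.
m = prefix(demo(), @varname(a))
```

Under DynamicPPL 0.36 the prefixed variable should show up as the field-accessor VarName `a.x`, matching the HISTORY.md example above.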

### Inference
5 changes: 2 additions & 3 deletions src/mcmc/Inference.jl
@@ -4,9 +4,8 @@
using DynamicPPL:
Metadata,
VarInfo,
TypedVarInfo,
# TODO(mhauru) all_varnames_grouped_by_symbol isn't exported by DPPL, because it is only
# implemented for TypedVarInfo. It is used by mh.jl. Either refactor mh.jl to not use it
# implemented for NTVarInfo. It is used by mh.jl. Either refactor mh.jl to not use it
# or implement it for other VarInfo types and export it from DPPL.
all_varnames_grouped_by_symbol,
syms,
@@ -161,7 +160,7 @@
end

# TODO: make a nicer `set_namedtuple!` and move these functions to DynamicPPL.
function DynamicPPL.unflatten(vi::TypedVarInfo, θ::NamedTuple)
function DynamicPPL.unflatten(vi::DynamicPPL.NTVarInfo, θ::NamedTuple)

Codecov / codecov/patch warning: added line src/mcmc/Inference.jl#L163 was not covered by tests
set_namedtuple!(deepcopy(vi), θ)
return vi
end
15 changes: 11 additions & 4 deletions src/mcmc/gibbs.jl
@@ -21,6 +21,11 @@
isgibbscomponent(spl::ExternalSampler) = isgibbscomponent(spl.sampler)
isgibbscomponent(::AdvancedHMC.HMC) = true
isgibbscomponent(::AdvancedMH.MetropolisHastings) = true

function can_be_wrapped(ctx::DynamicPPL.AbstractContext)
return DynamicPPL.NodeTrait(ctx) isa DynamicPPL.IsLeaf
end
can_be_wrapped(ctx::DynamicPPL.PrefixContext) = can_be_wrapped(ctx.context)

# Basically like a `DynamicPPL.FixedContext` but
# 1. Hijacks the tilde pipeline to fix variables.
# 2. Computes the log-probability of the fixed variables.
@@ -68,14 +73,14 @@
struct GibbsContext{VNs,GVI<:Ref{<:AbstractVarInfo},Ctx<:DynamicPPL.AbstractCont
context::Ctx

function GibbsContext{VNs}(global_varinfo, context) where {VNs}
if !(DynamicPPL.NodeTrait(context) isa DynamicPPL.IsLeaf)
if !can_be_wrapped(context)
error("GibbsContext can only wrap a leaf context, not a $(context).")
end
return new{VNs,typeof(global_varinfo),typeof(context)}(global_varinfo, context)
end

function GibbsContext(target_varnames, global_varinfo, context)
if !(DynamicPPL.NodeTrait(context) isa DynamicPPL.IsLeaf)
if !can_be_wrapped(context)
error("GibbsContext can only wrap a leaf context, not a $(context).")
end
if any(vn -> DynamicPPL.getoptic(vn) != identity, target_varnames)
@@ -232,9 +237,11 @@
end
wrap_in_sampler(x::AbstractMCMC.AbstractSampler) = x
wrap_in_sampler(x::InferenceAlgorithm) = DynamicPPL.Sampler(x)

to_varname_list(x::Union{VarName,Symbol}) = [VarName(x)]
to_varname(x::VarName) = x
to_varname(x::Symbol) = VarName{x}()
to_varname_list(x::Union{VarName,Symbol}) = [to_varname(x)]
# Any other value is assumed to be an iterable of VarNames and Symbols.
to_varname_list(t) = collect(map(VarName, t))
to_varname_list(t) = collect(map(to_varname, t))
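A standalone sketch of what these helpers normalise (re-implemented here for illustration; the real definitions live in `src/mcmc/gibbs.jl` and are internal, not public API):

```julia
using AbstractPPL: VarName, @varname

# Normalise user-facing sampler targets (a Symbol, a VarName, or an
# iterable mixing both) into a Vector of VarNames.
to_varname(x::VarName) = x
to_varname(x::Symbol) = VarName{x}()
to_varname_list(x::Union{VarName,Symbol}) = [to_varname(x)]
# Any other value is assumed to be an iterable of VarNames and Symbols.
to_varname_list(t) = collect(map(to_varname, t))
```

For example, `to_varname_list((:x, @varname(y)))` yields a vector containing the VarNames for `x` and `y`, so `Gibbs` specifications can mix `Symbol`s and `VarName`s freely.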

"""
Gibbs
4 changes: 2 additions & 2 deletions test/Project.toml
@@ -40,7 +40,7 @@
TimerOutputs = "a759f4b9-e2f1-59dc-863e-4aeb61b1ea8f"

[compat]
AbstractMCMC = "5"
AbstractPPL = "0.9, 0.10"
AbstractPPL = "0.9, 0.10, 0.11"
AdvancedMH = "0.6, 0.7, 0.8"
AdvancedPS = "=0.6.0"
AdvancedVI = "0.2"
@@ -52,7 +52,7 @@
Combinatorics = "1"
Distributions = "0.25"
DistributionsAD = "0.6.3"
DynamicHMC = "2.1.6, 3.0"
DynamicPPL = "0.35"
DynamicPPL = "0.36"
FiniteDifferences = "0.10.8, 0.11, 0.12"
ForwardDiff = "0.10.12 - 0.10.32, 0.10"
HypothesisTests = "0.11"
10 changes: 3 additions & 7 deletions test/optimisation/Optimisation.jl
@@ -71,13 +71,9 @@
using Turing
end

@testset "With prefixes" begin
function prefix_μ(model)
return DynamicPPL.contextualize(
model, DynamicPPL.PrefixContext{:inner}(model.context)
)
end
m1 = prefix_μ(model1(x))
m2 = prefix_μ(model2() | (var"inner.x"=x,))
vn = @varname(inner)
m1 = prefix(model1(x), vn)
m2 = prefix((model2() | (x=x,)), vn)
ctx = Turing.Optimisation.OptimizationContext(DynamicPPL.LikelihoodContext())
@test Turing.Optimisation.OptimLogDensity(m1, ctx)(w) ==
Turing.Optimisation.OptimLogDensity(m2, ctx)(w)