Compatibility with DynamicPPL 0.38 + InitContext #2676
Conversation
```julia
# Get the initial values for this component sampler.
initial_params_local = if initial_params === nothing
    nothing
else
    DynamicPPL.subset(vi, varnames)[:]
end
```
I was quite pleased with this discovery. Previously, the initial params had to be subsetted to the correct length for the conditioned model. That's not only a faff, but I also get a bit scared whenever there's direct `VarInfo` manipulation like this.

Now, if you use `InitFromParams` with a NamedTuple/Dict that has extra params, the extra params are just ignored. So there's no need to subset at all; just pass it through directly!
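A minimal sketch of the behaviour described above (the model and values are hypothetical, assuming DynamicPPL 0.38's `InitFromParams` as re-exported on this branch; the sampler choice is illustrative):

```julia
using Turing

@model function demo()
    x ~ Normal()
    y ~ Normal()
end

# `z` is not a variable of `demo()`; under DPPL 0.38 it is simply ignored,
# so the full initial-params NamedTuple can be passed to a component sampler
# without subsetting it down to that sampler's variables first.
strategy = InitFromParams((x=1.0, y=2.0, z=3.0))
chain = sample(demo(), MH(), 100; initial_params=strategy)
```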
```julia
# TODO(DPPL0.38/penelopeysm): This function should no longer be needed
# once InitContext is merged.
```
Unfortunately, `set_namedtuple!` is used elsewhere in this file (though it won't appear in this diff), so we can't delete it (yet).
```diff
 function DynamicPPL.tilde_assume!!(
     context::MHContext, right::Distribution, vn::VarName, vi::AbstractVarInfo
 )
-    # Just defer to `SampleFromPrior`.
-    retval = DynamicPPL.assume(rng, SampleFromPrior(), dist, vn, vi)
-    return retval
+    # Allow MH to sample new variables from the prior if it's not already
+    # present in the VarInfo.
+    dispatch_ctx = if haskey(vi, vn)
+        DynamicPPL.DefaultContext()
+    else
+        DynamicPPL.InitContext(context.rng, DynamicPPL.InitFromPrior())
+    end
+    return DynamicPPL.tilde_assume!!(dispatch_ctx, right, vn, vi)
```
The behaviour of `SampleFromPrior` used to be: if the key was present, don't actually sample; if it was absent, sample. This if/else replicates the old behaviour.
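To illustrate when the "key absent" branch fires (a hypothetical sketch, not taken from the diff): in a model whose support changes between iterations, MH can hit a variable that isn't yet in the `VarInfo`, in which case the `InitContext` branch draws it fresh from the prior; once stored, `haskey(vi, vn)` holds and `DefaultContext` reuses the existing value.

```julia
using Turing

# `b` only exists on iterations where `a` is true, so MH may need to
# sample it from the prior partway through the chain.
@model function switching()
    a ~ Bernoulli(0.5)
    if a
        b ~ Normal()
    end
end

chain = sample(switching(), MH(), 100)
```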
```julia
sampler::S
varinfo::V
evaluator::E
resample::Bool
```
For pMCMC, this Boolean field essentially replaces the `del` flag. Instead of `set_all_del` and `unset_all_del`, we construct new `TracedModel`s with this set to `true` and `false` respectively.
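A rough sketch of the idea (field names are from the diff above; the type parameters and constructor usage are hypothetical):

```julia
struct TracedModel{S,V,E}
    sampler::S
    varinfo::V
    evaluator::E
    # Replaces the old `del` flag: `true` corresponds to the old
    # `set_all_del` (resample every variable), `false` to
    # `unset_all_del` (reuse the values stored in `varinfo`).
    resample::Bool
end

# Instead of mutating a flag on the VarInfo, build a new TracedModel
# (placeholder field values for illustration only):
resampling     = TracedModel(:sampler, :varinfo, :evaluator, true)
non_resampling = TracedModel(:sampler, :varinfo, :evaluator, false)
```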
Turing.jl documentation for PR #2676 is available at:
Codecov Report

❌ Patch coverage is

Additional details and impacted files:

```
@@           Coverage Diff            @@
##           breaking    #2676   +/-  ##
===========================================
  Coverage          ?   86.45%
===========================================
  Files             ?       21
  Lines             ?     1418
  Branches          ?        0
===========================================
  Hits              ?     1226
  Misses            ?      192
  Partials          ?        0
```
```julia
seed = if dist isa GeneralizedExtremeValue
    # GEV is prone to giving really wacky results that are quite
    # seed-dependent.
    StableRNG(469)
else
    StableRNG(468)
end
chn = sample(seed, m(), HMC(0.05, 20), n_samples)
```
Case in point:

```julia
julia> using Turing, StableRNGs

julia> dist = GeneralizedExtremeValue(0, 1, 0.5); @model m() = x ~ dist
m (generic function with 2 methods)

julia> mean(dist)
1.5449077018110322

julia> mean(sample(StableRNG(468), m(), HMC(0.05, 20), 10000; progress=false))
Mean
  parameters    mean
      Symbol  Float64
           x   3.9024

julia> mean(sample(StableRNG(469), m(), HMC(0.05, 20), 10000; progress=false))
Mean
  parameters    mean
      Symbol  Float64
           x   1.5868
```
For the record, 11 failing CI jobs is the expected number:

There is also the failing job caused by the base Julia segfault (#2655), but that's on 1.10, so it overlaps with the first category.
Okay, I'm quite happy with where CI on this PR has gotten to. There are a handful of residual failures, which are mostly to do with
I think we discussed this last time round, and IIRC you said you'd prefer to keep `breaking` 'clean' and mergeable into `main` at any given time (apologies if I am misremembering). If that's the case, then we should keep this as the base branch for DPPL 0.38 fixes until 0.38 is released.
You remember correctly, this would be my preference.
Co-authored-by: Markus Hauru <[email protected]>
A note, mostly for myself: I'm happy with the code in this PR. We should collect other changes related to 0.38 compat into it before merging, since on its own it still leaves things in a broken state. However, once other PRs have been merged into this so that tests pass, and the above reminder comments have been addressed, I'm happy to merge. No need to read through this PR again.
…, `init_strategy`, and other functions from DynamicPPL to Turing (#2689)

* Remove `Sampler` and move its interface to Turing
* Test fixes (this is admittedly quite tiring)
* Fix a couple of Gibbs tests (no doubt there are more)
* actually fix the Gibbs ones
* actually fix it this time
* fix typo
* point to breaking
* Improve loadstate implementation
* Re-add tests that were removed from DynamicPPL
* Fix qualifier in src/mcmc/external_sampler.jl (Co-authored-by: Xianda Sun <[email protected]>)
* Remove the default argument for initial_params
* [skip ci] Remove DynamicPPL sources

Co-authored-by: Xianda Sun <[email protected]>
@mhauru Following on from the meeting just now ... is there anything else you'd like to include in 0.41? If not, I'll fix the remaining todo link in the changelog and then we should be good to go. Optimisation still doesn't use `AbstractInitStrategy`, but I'm quite happy to leave that out as part of a near-future refactoring of the optimisation API.
On the contrary, if we've managed to get all tests to pass on a DPPL version with so many changes, I would advocate for releasing this to have a checkpoint and doing more incremental changes afterwards.
Ah, yes, that's definitely very wise. I'll update the changelog and do the merges. |
* Bump minor version
* Do not take an initial step before starting the chain in HMC (#2674)
  * Do not take an initial step before starting the chain in HMC
  * Fix some tests
  * update changelog
* Compatibility with DynamicPPL 0.38 + InitContext (#2676)
  * Import `varname_leaves` etc from AbstractPPL instead
  * initial updates for InitContext
  * More fixes
  * Fix pMCMC
  * Fix Gibbs
  * More fixes, reexport InitFrom
  * Fix a bunch of tests; I'll let CI tell me what's still broken...
  * Remove comment
  * Fix more tests
  * More test fixes
  * Fix more tests
  * fix GeneralizedExtremeValue numerical test
  * fix sample method
  * fix ESS reproducibility
  * Fix externalsampler test correctly
  * Fix everything (I _think_)
  * Add changelog
  * Fix remaining tests (for real this time)
  * Specify default chain type in Turing
  * fix DPPL revision
  * Fix changelog to mention unwrapped NT / Dict for initial_params
  * Remove references to islinked, set_flag, unset_flag
  * use `setleafcontext(::Model, ::AbstractContext)`
  * Fix for upstream removal of default_chain_type
  * Add clarifying comment for IS test
  * Revert ESS test (and add some numerical accuracy checks)
  * istrans -> is_transformed
  * Remove `loadstate` and `resume_from`
  * Remove a Sampler test
  * Paper over one crack
  * fix `resume_from`
  * remove a `Sampler` test
  * Update HISTORY.md (Co-authored-by: Markus Hauru <[email protected]>)
  * Remove `Sampler`, remove `InferenceAlgorithm`, transfer `initialstep`, `init_strategy`, and other functions from DynamicPPL to Turing (#2689)
    * Remove `Sampler` and move its interface to Turing
    * Test fixes (this is admittedly quite tiring)
    * Fix a couple of Gibbs tests (no doubt there are more)
    * actually fix the Gibbs ones
    * actually fix it this time
    * fix typo
    * point to breaking
    * Improve loadstate implementation
    * Re-add tests that were removed from DynamicPPL
    * Fix qualifier in src/mcmc/external_sampler.jl (Co-authored-by: Xianda Sun <[email protected]>)
    * Remove the default argument for initial_params
    * Remove DynamicPPL sources
  * Fix a word in changelog
  * Improve changelog
  * Add PNTDist to changelog

  Co-authored-by: Markus Hauru <[email protected]>
  Co-authored-by: Xianda Sun <[email protected]>
* Fix all docs warnings

Co-authored-by: Markus Hauru <[email protected]>
Co-authored-by: Markus Hauru <[email protected]>
Co-authored-by: Xianda Sun <[email protected]>
It should be noted that due to the changes in DynamicPPL's `src/sampler.jl`, the results of running MCMC sampling on this branch will pretty much always differ from those on the main branch. Thus there is no (easy) way to test full reproducibility of MCMC results (we have to rely instead on statistics for converged chains).

TODO:
Separate PRs:

- Use InitStrategy for optimisation as well. Note that the three pre-existing InitStrategies can be used directly with optimisation. However, to handle constraints properly, it seems necessary to introduce a new subtype of AbstractInitStrategy. I think this should be a separate PR because it's a fair bit of work.
- Fix docs for that argument, wherever it is (there's probably some in AbstractMCMC, but it should probably be documented on the main site). EDIT: https://turinglang.org/docs/usage/sampling-options/#specifying-initial-parameters