Open

Changes from 48 of 58 commits
8c3d30f
v0.39
penelopeysm Oct 21, 2025
262d732
Merge remote-tracking branch 'origin/main' into breaking
mhauru Oct 22, 2025
77af4eb
Merge branch 'main' into breaking
penelopeysm Oct 28, 2025
c57de02
Merge remote-tracking branch 'origin/main' into breaking
mhauru Oct 31, 2025
7300c22
Update DPPL compats for benchmarks and docs
mhauru Oct 31, 2025
79150ba
remove merge conflict markers
penelopeysm Nov 4, 2025
6dc7c02
Merge remote-tracking branch 'origin/main' into breaking
mhauru Nov 5, 2025
2ca96cc
Merge branch 'main' into breaking
penelopeysm Nov 5, 2025
a8eb2e7
Merge branch 'main' into breaking
penelopeysm Nov 11, 2025
4ca9528
Remove `NodeTrait` (#1133)
penelopeysm Nov 11, 2025
535ce4f
FastLDF / InitContext unified (#1132)
penelopeysm Nov 13, 2025
9624103
implement `LogDensityProblems.dimension`
penelopeysm Nov 14, 2025
ce80713
forgot about capabilities...
penelopeysm Nov 14, 2025
1d21728
Merge branch 'main' into breaking
penelopeysm Nov 18, 2025
c4cec0b
Merge branch 'main' into breaking
penelopeysm Nov 18, 2025
8553e40
use interpolation in run_ad
penelopeysm Nov 18, 2025
3cd8d34
Improvements to benchmark outputs (#1146)
penelopeysm Nov 18, 2025
eab7131
Add VarNamedTuple, tests, and WIP docs
mhauru Nov 19, 2025
0c7825b
Add comparisons and merge
mhauru Nov 20, 2025
15d5a8a
Start using VNT in FastLDF
mhauru Nov 20, 2025
871eb9f
Move _compose_no_identity to utils.jl
mhauru Nov 20, 2025
4a11560
Allow generation of `ParamsWithStats` from `FastLDF` plus parameters,…
penelopeysm Nov 22, 2025
766f663
Make FastLDF the default (#1139)
penelopeysm Nov 25, 2025
c1b935b
Minor refactor
mhauru Nov 25, 2025
262a6f9
Remove IndexDict
mhauru Nov 25, 2025
abea087
Remove make_leaf as a field
mhauru Nov 25, 2025
5900f69
Document, refactor, and fix PartialArray
mhauru Nov 25, 2025
8f17dcf
Make PartialArray more type stable.
mhauru Nov 25, 2025
8547e25
Implement `predict`, `returned`, `logjoint`, ... with `OnlyAccsVarInf…
penelopeysm Nov 25, 2025
04b3383
Fixes and improvements to VNT
mhauru Nov 25, 2025
59c4dcb
Proper printing and constructors
mhauru Nov 25, 2025
381b1dd
Fix PartialArray printing
mhauru Nov 25, 2025
88db66d
Update the design doc
mhauru Nov 25, 2025
a6d56a2
Improve FastLDF type stability when all parameters are linked or unli…
penelopeysm Nov 27, 2025
018d14f
Merge remote-tracking branch 'origin/breaking' into mhauru/vnt-for-fa…
mhauru Nov 27, 2025
eca65d5
Fix a test
mhauru Nov 27, 2025
5e27a05
Fix copy and show
mhauru Nov 27, 2025
050b8c5
Add test_invariants to VNT tests
mhauru Nov 27, 2025
f5616df
Improve VNT internal docs
mhauru Nov 27, 2025
ec5dc8f
Polish VNT
mhauru Nov 27, 2025
3ca36c4
Make VNT merge type stable. Simplify printing, improve tests.
mhauru Nov 27, 2025
59f67fd
Add VNT too API docs
mhauru Nov 27, 2025
9aba468
Fix doctests
mhauru Nov 27, 2025
0b4c772
Clean up tests a bit
mhauru Nov 27, 2025
38662a8
Fix API docs
mhauru Nov 27, 2025
e41afca
Fix a bug and a docstring
mhauru Nov 27, 2025
8c50bbb
Apply suggestions from code review
mhauru Nov 28, 2025
cae8864
Fix VNT docs
mhauru Nov 28, 2025
c27f5e0
Make threadsafe evaluation opt-in (#1151)
penelopeysm Dec 1, 2025
993cc5b
Merge remote-tracking branch 'origin/main' into breaking
penelopeysm Dec 1, 2025
54ae7e3
Standardise `:lp` -> `:logjoint` (#1161)
penelopeysm Dec 1, 2025
6012b11
Merge branch 'tmp2' into mhauru/vnt-for-fastldf
mhauru Dec 3, 2025
f114e40
Merge commit 'ee863d6' into mhauru/vnt-for-fastldf
mhauru Dec 3, 2025
bccfdf0
Merge branch 'breaking' into mhauru/vnt-for-fastldf
mhauru Dec 3, 2025
384e3ac
Apply suggestions from code review
mhauru Dec 3, 2025
9d61a54
Add a microoptimisation
mhauru Dec 3, 2025
8c8e39f
Improve docstrings
mhauru Dec 3, 2025
c818bf8
Simplify use of QuoteNodes
mhauru Dec 3, 2025
40 changes: 40 additions & 0 deletions HISTORY.md
Original file line number Diff line number Diff line change
@@ -1,5 +1,45 @@
# DynamicPPL Changelog

## 0.39.0

### Breaking changes

#### Fast Log Density Functions

This version reimplements `LogDensityFunction`, providing performance improvements on the order of 2–10× for both model evaluation and automatic differentiation.
Exact speedups depend on the model size: larger models see less significant speedups because the bulk of the work is done in calls to `logpdf`.

For more information about how this is accomplished, please see https://github.com/TuringLang/DynamicPPL.jl/pull/1113 as well as the `src/fasteval.jl` file, which contains extensive comments.

As a result of this change, `LogDensityFunction` no longer stores a VarInfo inside it.
In general, if `ldf` is a `LogDensityFunction`, it is now only valid to access `ldf.model` and `ldf.adtype`.
If you were previously relying on this behaviour, you will need to store a VarInfo separately.
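
The new usage pattern can be sketched as follows (a minimal illustration, assuming the standard LogDensityProblems.jl interface; the model and the parameter vector are made up for the example):

```julia
using DynamicPPL, Distributions
using LogDensityProblems

@model function demo()
    x ~ Normal()
    2.0 ~ Normal(x)
end

model = demo()
ldf = DynamicPPL.LogDensityFunction(model)

# Only these two fields are guaranteed to be accessible:
ldf.model
ldf.adtype

# If you need a VarInfo, construct and keep it yourself:
vi = VarInfo(model)

LogDensityProblems.logdensity(ldf, [0.5])
```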

#### Parent and leaf contexts

The `DynamicPPL.NodeTrait` function has been removed.
Instead of implementing this, parent contexts should subtype `DynamicPPL.AbstractParentContext`.
This is an abstract type which requires you to overload two functions, `DynamicPPL.childcontext` and `DynamicPPL.setchildcontext`.

There should generally be few reasons to define your own parent contexts (the only one we are aware of, outside of DynamicPPL itself, is `Turing.Inference.GibbsContext`), so this change should not affect most users.

Leaf contexts require no changes, apart from the removal of their `NodeTrait` method.

`ConditionContext` and `PrefixContext` are no longer exported.
You should not need to use these directly; use `AbstractPPL.condition` and `DynamicPPL.prefix` instead.
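
A minimal sketch of the new pattern (the context name is hypothetical; the two required methods are the ones named above):

```julia
using DynamicPPL, AbstractPPL

# A parent context that wraps another context without changing behaviour.
struct MyWrapperContext{C<:AbstractPPL.AbstractContext} <: DynamicPPL.AbstractParentContext
    child::C
end

# The two methods that `AbstractParentContext` requires:
DynamicPPL.childcontext(ctx::MyWrapperContext) = ctx.child
DynamicPPL.setchildcontext(ctx::MyWrapperContext, child) = MyWrapperContext(child)
```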

#### Miscellaneous

Removed the method `returned(::Model, values, keys)`; please use `returned(::Model, ::AbstractDict{<:VarName})` instead.

The unexported functions `supports_varname_indexing(chain)`, `getindex_varname(chain)`, and `varnames(chain)` have been removed.

The method `DynamicPPL.init` (for implementing `AbstractInitStrategy`) now has a different signature: it must return a tuple of the generated value, plus a transform function that maps it back to unlinked space.
This is a generalisation of the previous behaviour, where `init` would always return an unlinked value (in effect forcing the transform to be the identity function).
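
As a hedged sketch of the new convention (the strategy and the exact argument list are illustrative; check the `DynamicPPL.init` docstring for the real signature):

```julia
using DynamicPPL, Distributions

struct InitFromZeros <: DynamicPPL.AbstractInitStrategy end

# `init` must now return a (value, transform) tuple, where `transform`
# maps the value back to unlinked space. The value here is already
# unlinked, so the transform is the identity function.
function DynamicPPL.init(rng, vn, dist::Distribution, ::InitFromZeros)
    return zeros(size(dist)), identity
end
```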

The family of functions `returned(model, chain)`, along with the corresponding signatures of `pointwise_logdensities`, `logjoint`, `loglikelihood`, and `logprior`, has been changed so that an error is thrown if the chain does not contain all variables in the model.
Previously, missing variables would have been sampled.

## 0.38.9

Remove warning when using Enzyme as the AD backend.
2 changes: 1 addition & 1 deletion Project.toml
@@ -1,6 +1,6 @@
name = "DynamicPPL"
uuid = "366bfd00-2699-11ea-058f-f148b4cae6d8"
version = "0.38.9"
version = "0.39.0"

[deps]
ADTypes = "47edcb42-4c32-4615-8424-f2b9edc5f35b"
2 changes: 1 addition & 1 deletion benchmarks/Project.toml
@@ -24,7 +24,7 @@ DynamicPPL = {path = "../"}
ADTypes = "1.14.0"
Chairmarks = "1.3.1"
Distributions = "0.25.117"
DynamicPPL = "0.38"
DynamicPPL = "0.39"
Enzyme = "0.13"
ForwardDiff = "1"
JSON = "1.3.0"
38 changes: 33 additions & 5 deletions benchmarks/benchmarks.jl
@@ -98,12 +98,15 @@ function run(; to_json=false)
}[]

for (model_name, model, varinfo_choice, adbackend, islinked) in chosen_combinations
@info "Running benchmark for $model_name"
@info "Running benchmark for $model_name, $varinfo_choice, $adbackend, $islinked"
relative_eval_time, relative_ad_eval_time = try
results = benchmark(model, varinfo_choice, adbackend, islinked)
@info " t(eval) = $(results.primal_time)"
@info " t(grad) = $(results.grad_time)"
(results.primal_time / reference_time),
(results.grad_time / results.primal_time)
catch e
@info "benchmark errored: $e"
missing, missing
end
push!(
@@ -155,18 +158,33 @@ function combine(head_filename::String, base_filename::String)
all_testcases = union(Set(keys(head_testcases)), Set(keys(base_testcases)))
@info "$(length(all_testcases)) unique test cases found"
sorted_testcases = sort(
collect(all_testcases); by=(c -> (c.model_name, c.ad_backend, c.varinfo, c.linked))
collect(all_testcases); by=(c -> (c.model_name, c.linked, c.varinfo, c.ad_backend))
)
results_table = Tuple{
String,Int,String,String,Bool,String,String,String,String,String,String
String,
Int,
String,
String,
Bool,
String,
String,
String,
String,
String,
String,
String,
String,
String,
}[]
sublabels = ["base", "this PR", "speedup"]
results_colnames = [
[
EmptyCells(5),
MultiColumn(3, "t(eval) / t(ref)"),
MultiColumn(3, "t(grad) / t(eval)"),
MultiColumn(3, "t(grad) / t(ref)"),
],
[colnames[1:5]..., "base", "this PR", "speedup", "base", "this PR", "speedup"],
[colnames[1:5]..., sublabels..., sublabels..., sublabels...],
]
sprint_float(x::Float64) = @sprintf("%.2f", x)
sprint_float(m::Missing) = "err"
@@ -183,6 +201,10 @@ function combine(head_filename::String, base_filename::String)
# Finally that lets us do this division safely
speedup_eval = base_eval / head_eval
speedup_grad = base_grad / head_grad
# As well as this multiplication, which is t(grad) / t(ref)
head_grad_vs_ref = head_grad * head_eval
base_grad_vs_ref = base_grad * base_eval
speedup_grad_vs_ref = base_grad_vs_ref / head_grad_vs_ref
push!(
results_table,
(
@@ -197,6 +219,9 @@ function combine(head_filename::String, base_filename::String)
sprint_float(base_grad),
sprint_float(head_grad),
sprint_float(speedup_grad),
sprint_float(base_grad_vs_ref),
sprint_float(head_grad_vs_ref),
sprint_float(speedup_grad_vs_ref),
),
)
end
@@ -212,7 +237,10 @@ function combine(head_filename::String, base_filename::String)
backend=:text,
fit_table_in_display_horizontally=false,
fit_table_in_display_vertically=false,
table_format=TextTableFormat(; horizontal_line_at_merged_column_labels=true),
table_format=TextTableFormat(;
horizontal_line_at_merged_column_labels=true,
horizontal_lines_at_data_rows=collect(3:3:length(results_table)),
),
)
println("```")
end
2 changes: 1 addition & 1 deletion docs/Project.toml
@@ -21,7 +21,7 @@ Accessors = "0.1"
Distributions = "0.25"
Documenter = "1"
DocumenterMermaid = "0.1, 0.2"
DynamicPPL = "0.38"
DynamicPPL = "0.39"
FillArrays = "0.13, 1"
ForwardDiff = "0.10, 1"
JET = "0.9, 0.10, 0.11"
64 changes: 55 additions & 9 deletions docs/src/api.md
@@ -66,6 +66,12 @@ The [LogDensityProblems.jl](https://github.com/tpapp/LogDensityProblems.jl) interface
LogDensityFunction
```

Internally, this is accomplished using [`init!!`](@ref) on:

```@docs
OnlyAccsVarInfo
```

## Condition and decondition

A [`Model`](@ref) can be conditioned on a set of observations with [`AbstractPPL.condition`](@ref) or its alias [`|`](@ref).
@@ -170,6 +176,12 @@ DynamicPPL.prefix

## Utilities

`typed_identity` is the same as `identity`, but with an overload for `with_logabsdet_jacobian` that ensures it never errors.

```@docs
typed_identity
```

It is possible to manually increase (or decrease) the accumulated log likelihood or prior from within a model function.

```@docs
@@ -352,11 +364,10 @@ Base.empty!
SimpleVarInfo
```

### Tilde-pipeline
#### `VarNamedTuple`

```@docs
tilde_assume!!
tilde_observe!!
DynamicPPL.VarNamedTuples.VarNamedTuple
```

### Accumulators
@@ -463,22 +474,55 @@ By default, it does not perform any actual sampling: it only evaluates the model
If you wish to sample new values, see the section on [VarInfo initialisation](#VarInfo-initialisation) just below this.

The behaviour of a model execution can be changed with evaluation contexts, which are a field of the model.
All contexts are subtypes of `AbstractPPL.AbstractContext`.

Contexts are split into two kinds:

**Leaf contexts**: These are the most important contexts as they ultimately decide how model evaluation proceeds.
For example, `DefaultContext` evaluates the model using values stored inside a VarInfo's metadata, whereas `InitContext` obtains new values either by sampling or from a known set of parameters.
DynamicPPL has other leaf contexts that are used for internal purposes, but these are the only two that are exported.

```@docs
DefaultContext
PrefixContext
ConditionContext
InitContext
```

To implement a leaf context, you need to subtype `AbstractPPL.AbstractContext` and implement the `tilde_assume!!` and `tilde_observe!!` methods for your context.

```@docs
tilde_assume!!
tilde_observe!!
```

**Parent contexts**: These essentially act as 'modifiers' for leaf contexts.
For example, `PrefixContext` adds a prefix to all variable names during evaluation, while `ConditionContext` marks certain variables as observed.

To implement a parent context, you have to subtype `DynamicPPL.AbstractParentContext`, and implement the `childcontext` and `setchildcontext` methods.
You can also implement `tilde_assume!!` and `tilde_observe!!` for your context; this is optional, and the default implementation simply delegates to the child context.

```@docs
AbstractParentContext
childcontext
setchildcontext
```

Because contexts form a tree structure, the following functions are defined generically for manipulating context stacks.
They are mainly useful for modifying the fundamental behaviour (i.e. the leaf context) without affecting any of the modifiers (i.e. the parent contexts).

```@docs
leafcontext
setleafcontext
```
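
For instance, the leaf of an arbitrary context stack can be swapped while any parent modifiers are preserved (a sketch; the concrete contexts involved will depend on your model):

```julia
using DynamicPPL, Distributions

@model demo() = x ~ Normal()
model = demo()

ctx = model.context
# Replace only the leaf; any parent contexts wrapping it are kept.
new_ctx = DynamicPPL.setleafcontext(ctx, DynamicPPL.DefaultContext())
DynamicPPL.leafcontext(new_ctx)
```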

### VarInfo initialisation

The function `init!!` is used to initialise, or overwrite, values in a VarInfo.
It is a thin wrapper around `evaluate!!` with an `InitContext`.

```@docs
DynamicPPL.init!!
init!!
```

To accomplish this, an initialisation _strategy_ is required, which defines how new values are to be obtained.
@@ -491,10 +535,12 @@ InitFromParams
```

If you wish to write your own, you have to subtype [`DynamicPPL.AbstractInitStrategy`](@ref) and implement the `init` method.
In very rare situations, you may also need to implement `get_param_eltype`, which defines the element type of the parameters generated by the strategy.

```@docs
DynamicPPL.AbstractInitStrategy
DynamicPPL.init
AbstractInitStrategy
init
get_param_eltype
```

### Choosing a suitable VarInfo