Merged

v0.39 #1082

24 commits
8c3d30f
v0.39
penelopeysm Oct 21, 2025
262d732
Merge remote-tracking branch 'origin/main' into breaking
mhauru Oct 22, 2025
77af4eb
Merge branch 'main' into breaking
penelopeysm Oct 28, 2025
c57de02
Merge remote-tracking branch 'origin/main' into breaking
mhauru Oct 31, 2025
7300c22
Update DPPL compats for benchmarks and docs
mhauru Oct 31, 2025
79150ba
remove merge conflict markers
penelopeysm Nov 4, 2025
6dc7c02
Merge remote-tracking branch 'origin/main' into breaking
mhauru Nov 5, 2025
2ca96cc
Merge branch 'main' into breaking
penelopeysm Nov 5, 2025
a8eb2e7
Merge branch 'main' into breaking
penelopeysm Nov 11, 2025
4ca9528
Remove `NodeTrait` (#1133)
penelopeysm Nov 11, 2025
535ce4f
FastLDF / InitContext unified (#1132)
penelopeysm Nov 13, 2025
9624103
implement `LogDensityProblems.dimension`
penelopeysm Nov 14, 2025
ce80713
forgot about capabilities...
penelopeysm Nov 14, 2025
1d21728
Merge branch 'main' into breaking
penelopeysm Nov 18, 2025
c4cec0b
Merge branch 'main' into breaking
penelopeysm Nov 18, 2025
8553e40
use interpolation in run_ad
penelopeysm Nov 18, 2025
3cd8d34
Improvements to benchmark outputs (#1146)
penelopeysm Nov 18, 2025
4a11560
Allow generation of `ParamsWithStats` from `FastLDF` plus parameters,…
penelopeysm Nov 22, 2025
766f663
Make FastLDF the default (#1139)
penelopeysm Nov 25, 2025
8547e25
Implement `predict`, `returned`, `logjoint`, ... with `OnlyAccsVarInf…
penelopeysm Nov 25, 2025
a6d56a2
Improve FastLDF type stability when all parameters are linked or unli…
penelopeysm Nov 27, 2025
c27f5e0
Make threadsafe evaluation opt-in (#1151)
penelopeysm Dec 1, 2025
993cc5b
Merge remote-tracking branch 'origin/main' into breaking
penelopeysm Dec 1, 2025
54ae7e3
Standardise `:lp` -> `:logjoint` (#1161)
penelopeysm Dec 1, 2025
81 changes: 81 additions & 0 deletions HISTORY.md
@@ -1,5 +1,86 @@
# DynamicPPL Changelog

## 0.39.0

### Breaking changes

#### Fast Log Density Functions

This version reimplements `LogDensityFunction`, yielding performance improvements on the order of 2–10× for both model evaluation and automatic differentiation.
Exact speedups depend on the model size: larger models see smaller speedups, because the bulk of the work is done in calls to `logpdf`.

For more information about how this is accomplished, please see https://github.com/TuringLang/DynamicPPL.jl/pull/1113 as well as the `src/logdensityfunction.jl` file, which contains extensive comments.

As a result of this change, `LogDensityFunction` no longer stores a VarInfo inside it.
In general, if `ldf` is a `LogDensityFunction`, it is now only valid to access `ldf.model` and `ldf.adtype`.
If you were previously relying on the stored VarInfo, you will need to keep one separately.
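In practice the new API looks something like this (a sketch; the model is made up for illustration, and constructor details may vary):

```julia
using DynamicPPL, Distributions
using LogDensityProblems: LogDensityProblems

@model function demo(x)
    μ ~ Normal()
    x ~ Normal(μ, 1)
end

model = demo(1.0)
ldf = DynamicPPL.LogDensityFunction(model)

# Only these two field accesses remain valid:
ldf.model
ldf.adtype

# If you need a VarInfo, construct and store one yourself:
vi = DynamicPPL.VarInfo(model)

LogDensityProblems.logdensity(ldf, [0.5])
```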

#### Threadsafe evaluation

DynamicPPL models have traditionally supported running some probabilistic statements (e.g. tilde-statements, or `@addlogprob!`) in parallel.
Prior to DynamicPPL 0.39, thread-safe evaluation was enabled by default whenever Julia was launched with more than one thread.

In DynamicPPL 0.39, **thread-safe evaluation is now disabled by default**.
If you need it (see below for a discussion of when you _do_ need it), you **must** now enable it manually:

```julia
@model f() = ...
model = f()
model = setthreadsafe(model, true)
```

The problem with the previous on-by-default behaviour was that it could sacrifice a huge amount of performance when thread safety was not needed.
This is especially true when running Julia in a notebook, where multiple threads are often enabled by default.
Furthermore, it was not even the correct criterion: just because Julia has multiple threads does not mean that a particular model requires threadsafe evaluation.

**A model requires threadsafe evaluation if, and only if, the VarInfo object used inside the model is manipulated in parallel.**
This can occur if any of the following are inside `Threads.@threads` or other concurrency functions / macros:

- tilde-statements
- calls to `@addlogprob!`
- any direct manipulation of the special `__varinfo__` variable

If you have none of these inside threaded blocks, then you do not need to mark your model as threadsafe.
**Notably, the following do not require threadsafe evaluation:**

- Using threading for any computation that does not involve VarInfo. For example, you can calculate a log-probability in parallel, and then add it using `@addlogprob!` outside of the threaded block.
- Sampling with `AbstractMCMC.MCMCThreads()`.
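The first pattern above might look like this (a sketch; the model and data are made up for illustration):

```julia
using DynamicPPL, Distributions

@model function g(xs)
    μ ~ Normal()
    # Parallel computation that never touches __varinfo__:
    lls = Vector{Float64}(undef, length(xs))
    Threads.@threads for i in eachindex(xs)
        lls[i] = logpdf(Normal(μ, 1), xs[i])
    end
    # The VarInfo is only modified here, outside the threaded block,
    # so this model does NOT need `setthreadsafe(model, true)`.
    @addlogprob! sum(lls)
end
```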

For more information about threadsafe evaluation, please see [the Turing docs](https://turinglang.org/docs/usage/threadsafe-evaluation/).

When threadsafe evaluation is enabled for a model, an internal flag is set on the model.
The value of this flag can be queried using `DynamicPPL.requires_threadsafe(model)`, which returns a boolean.
This function is newly exported in this version of DynamicPPL.
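In code, the flag round-trips as you would expect (a sketch, assuming a model `f` as above):

```julia
model = f()
DynamicPPL.requires_threadsafe(model)  # false by default

model = setthreadsafe(model, true)
DynamicPPL.requires_threadsafe(model)  # now true
```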

#### Parent and leaf contexts

The `DynamicPPL.NodeTrait` function has been removed.
Instead of implementing this, parent contexts should subtype `DynamicPPL.AbstractParentContext`.
This is an abstract type which requires you to overload two functions, `DynamicPPL.childcontext` and `DynamicPPL.setchildcontext`.
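As a sketch, a minimal parent context under the new interface could look like this (`MyWrapperContext` is a hypothetical name):

```julia
struct MyWrapperContext{C<:AbstractPPL.AbstractContext} <: DynamicPPL.AbstractParentContext
    child::C
end

# The two required methods: get and set the wrapped child context.
DynamicPPL.childcontext(ctx::MyWrapperContext) = ctx.child
DynamicPPL.setchildcontext(ctx::MyWrapperContext, child) = MyWrapperContext(child)
```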

There should generally be few reasons to define your own parent contexts (the only one we are aware of outside DynamicPPL itself is `Turing.Inference.GibbsContext`), so this change should not affect most users.

Leaf contexts require no changes, apart from removing their `NodeTrait` method.

`ConditionContext` and `PrefixContext` are no longer exported.
You should not need to use these directly; use `AbstractPPL.condition` and `DynamicPPL.prefix` instead.
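For example (a sketch; the variable `x` and the prefix are illustrative, and the exact `prefix` signature should be checked against the docs):

```julia
conditioned = model | (x=1.0,)               # AbstractPPL.condition via the `|` alias
prefixed = DynamicPPL.prefix(model, :outer)  # prepends `outer.` to variable names
```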

#### ParamsWithStats

In the 'stats' part of `DynamicPPL.ParamsWithStats`, the log-joint is now consistently represented with the key `logjoint` instead of `lp`.

#### Miscellaneous

Removed the method `returned(::Model, values, keys)`; please use `returned(::Model, ::AbstractDict{<:VarName})` instead.
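For example (hypothetical variable name and value):

```julia
returned(model, Dict(@varname(μ) => 0.5))
```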

The unexported functions `supports_varname_indexing(chain)`, `getindex_varname(chain)`, and `varnames(chain)` have been removed.

The method `DynamicPPL.init` (for implementing `AbstractInitStrategy`) now has a different signature: it must return a tuple of the generated value, plus a transform function that maps it back to unlinked space.
This is a generalisation of the previous behaviour, where `init` would always return an unlinked value (in effect forcing the transform to be the identity function).
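Under the new contract, a custom strategy might be sketched as follows (the argument list of `init` is assumed from earlier versions; `typed_identity` serves as a type-stable identity transform):

```julia
using Random: AbstractRNG

struct InitFromConstant{T} <: DynamicPPL.AbstractInitStrategy
    value::T
end

function DynamicPPL.init(::AbstractRNG, vn, dist, strategy::InitFromConstant)
    # The stored value is already in unlinked space, so the transform
    # back to unlinked space is (a type-stable) identity.
    return strategy.value, DynamicPPL.typed_identity
end
```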

The family of functions `returned(model, chain)`, along with the corresponding signatures of `pointwise_logdensities`, `logjoint`, `loglikelihood`, and `logprior`, has been changed so that an error is thrown if the chain does not contain all of the model's variables.
Previously, missing variables would have been sampled.

## 0.38.10

`returned(model, chain)` and `pointwise_logdensities(model, chain)` will now error if a value for a random variable cannot be found in the chain.
2 changes: 1 addition & 1 deletion Project.toml
@@ -1,6 +1,6 @@
name = "DynamicPPL"
uuid = "366bfd00-2699-11ea-058f-f148b4cae6d8"
version = "0.38.10"
version = "0.39.0"

[deps]
ADTypes = "47edcb42-4c32-4615-8424-f2b9edc5f35b"
2 changes: 1 addition & 1 deletion benchmarks/Project.toml
@@ -24,7 +24,7 @@ DynamicPPL = {path = "../"}
ADTypes = "1.14.0"
Chairmarks = "1.3.1"
Distributions = "0.25.117"
DynamicPPL = "0.38"
DynamicPPL = "0.39"
Enzyme = "0.13"
ForwardDiff = "1"
JSON = "1.3.0"
38 changes: 33 additions & 5 deletions benchmarks/benchmarks.jl
@@ -98,12 +98,15 @@ function run(; to_json=false)
}[]

for (model_name, model, varinfo_choice, adbackend, islinked) in chosen_combinations
@info "Running benchmark for $model_name"
@info "Running benchmark for $model_name, $varinfo_choice, $adbackend, $islinked"
relative_eval_time, relative_ad_eval_time = try
results = benchmark(model, varinfo_choice, adbackend, islinked)
@info " t(eval) = $(results.primal_time)"
@info " t(grad) = $(results.grad_time)"
(results.primal_time / reference_time),
(results.grad_time / results.primal_time)
catch e
@info "benchmark errored: $e"
missing, missing
end
push!(
@@ -155,18 +158,33 @@ function combine(head_filename::String, base_filename::String)
all_testcases = union(Set(keys(head_testcases)), Set(keys(base_testcases)))
@info "$(length(all_testcases)) unique test cases found"
sorted_testcases = sort(
collect(all_testcases); by=(c -> (c.model_name, c.ad_backend, c.varinfo, c.linked))
collect(all_testcases); by=(c -> (c.model_name, c.linked, c.varinfo, c.ad_backend))
)
results_table = Tuple{
String,Int,String,String,Bool,String,String,String,String,String,String
String,
Int,
String,
String,
Bool,
String,
String,
String,
String,
String,
String,
String,
String,
String,
}[]
sublabels = ["base", "this PR", "speedup"]
results_colnames = [
[
EmptyCells(5),
MultiColumn(3, "t(eval) / t(ref)"),
MultiColumn(3, "t(grad) / t(eval)"),
MultiColumn(3, "t(grad) / t(ref)"),
],
[colnames[1:5]..., "base", "this PR", "speedup", "base", "this PR", "speedup"],
[colnames[1:5]..., sublabels..., sublabels..., sublabels...],
]
sprint_float(x::Float64) = @sprintf("%.2f", x)
sprint_float(m::Missing) = "err"
@@ -183,6 +201,10 @@
# Finally that lets us do this division safely
speedup_eval = base_eval / head_eval
speedup_grad = base_grad / head_grad
# As well as this multiplication, which is t(grad) / t(ref)
head_grad_vs_ref = head_grad * head_eval
base_grad_vs_ref = base_grad * base_eval
speedup_grad_vs_ref = base_grad_vs_ref / head_grad_vs_ref
push!(
results_table,
(
@@ -197,6 +219,9 @@
sprint_float(base_grad),
sprint_float(head_grad),
sprint_float(speedup_grad),
sprint_float(base_grad_vs_ref),
sprint_float(head_grad_vs_ref),
sprint_float(speedup_grad_vs_ref),
),
)
end
@@ -212,7 +237,10 @@
backend=:text,
fit_table_in_display_horizontally=false,
fit_table_in_display_vertically=false,
table_format=TextTableFormat(; horizontal_line_at_merged_column_labels=true),
table_format=TextTableFormat(;
horizontal_line_at_merged_column_labels=true,
horizontal_lines_at_data_rows=collect(3:3:length(results_table)),
),
)
println("```")
end
2 changes: 1 addition & 1 deletion docs/Project.toml
@@ -21,7 +21,7 @@ Accessors = "0.1"
Distributions = "0.25"
Documenter = "1"
DocumenterMermaid = "0.1, 0.2"
DynamicPPL = "0.38"
DynamicPPL = "0.39"
FillArrays = "0.13, 1"
ForwardDiff = "0.10, 1"
JET = "0.9, 0.10, 0.11"
74 changes: 61 additions & 13 deletions docs/src/api.md
@@ -42,6 +42,14 @@ The context of a model can be set using [`contextualize`](@ref):
contextualize
```

Some models require threadsafe evaluation (see [the Turing docs](https://turinglang.org/docs/usage/threadsafe-evaluation/) for more information on when this is necessary).
If this is the case, one must enable threadsafe evaluation for a model:

```@docs
setthreadsafe
requires_threadsafe
```

## Evaluation

With [`rand`](@ref) one can draw samples from the prior distribution of a [`Model`](@ref).
@@ -66,6 +74,12 @@ The [LogDensityProblems.jl](https://github.com/tpapp/LogDensityProblems.jl) inte
LogDensityFunction
```

Internally, this is accomplished using [`init!!`](@ref) on:

```@docs
OnlyAccsVarInfo
```

## Condition and decondition

A [`Model`](@ref) can be conditioned on a set of observations with [`AbstractPPL.condition`](@ref) or its alias [`|`](@ref).
@@ -170,6 +184,12 @@ DynamicPPL.prefix

## Utilities

`typed_identity` is the same as `identity`, but with an overload for `with_logabsdet_jacobian` that ensures that it never errors.

```@docs
typed_identity
```

It is possible to manually increase (or decrease) the accumulated log likelihood or prior from within a model function.

```@docs
@@ -352,13 +372,6 @@ Base.empty!
SimpleVarInfo
```

### Tilde-pipeline

```@docs
tilde_assume!!
tilde_observe!!
```

### Accumulators

The subtypes of [`AbstractVarInfo`](@ref) store the cumulative log prior and log likelihood, and sometimes other variables that change during executing, in what are called accumulators.
Expand Down Expand Up @@ -463,22 +476,55 @@ By default, it does not perform any actual sampling: it only evaluates the model
If you wish to sample new values, see the section on [VarInfo initialisation](#VarInfo-initialisation) just below this.

The behaviour of a model execution can be changed with evaluation contexts, which are a field of the model.
Contexts are subtypes of `AbstractPPL.AbstractContext`.

All contexts are subtypes of `AbstractPPL.AbstractContext`.

Contexts are split into two kinds:

**Leaf contexts**: These are the most important contexts as they ultimately decide how model evaluation proceeds.
For example, `DefaultContext` evaluates the model using values stored inside a VarInfo's metadata, whereas `InitContext` obtains new values either by sampling or from a known set of parameters.
DynamicPPL has more leaf contexts which are used for internal purposes, but these are the two that are exported.

```@docs
DefaultContext
PrefixContext
ConditionContext
InitContext
```

To implement a leaf context, you need to subtype `AbstractPPL.AbstractContext` and implement the `tilde_assume!!` and `tilde_observe!!` methods for your context.

```@docs
tilde_assume!!
tilde_observe!!
```

**Parent contexts**: These essentially act as 'modifiers' for leaf contexts.
For example, `PrefixContext` adds a prefix to all variable names during evaluation, while `ConditionContext` marks certain variables as observed.

To implement a parent context, you have to subtype `DynamicPPL.AbstractParentContext`, and implement the `childcontext` and `setchildcontext` methods.
If needed, you can also implement `tilde_assume!!` and `tilde_observe!!` for your context.
This is optional; the default implementation is to simply delegate to the child context.

```@docs
AbstractParentContext
childcontext
setchildcontext
```

Since contexts form a tree structure, these functions are automatically defined for manipulating context stacks.
They are mainly useful for modifying the fundamental behaviour (i.e. the leaf context), without affecting any of the modifiers (i.e. parent contexts).

```@docs
leafcontext
setleafcontext
```

### VarInfo initialisation

The function `init!!` is used to initialise, or overwrite, values in a VarInfo.
It is really a thin wrapper around using `evaluate!!` with an `InitContext`.

```@docs
DynamicPPL.init!!
init!!
```

To accomplish this, an initialisation _strategy_ is required, which defines how new values are to be obtained.
Expand All @@ -491,10 +537,12 @@ InitFromParams
```

If you wish to write your own, you have to subtype [`DynamicPPL.AbstractInitStrategy`](@ref) and implement the `init` method.
In very rare situations, you may also need to implement `get_param_eltype`, which defines the element type of the parameters generated by the strategy.

```@docs
DynamicPPL.AbstractInitStrategy
DynamicPPL.init
AbstractInitStrategy
init
get_param_eltype
```

### Choosing a suitable VarInfo
13 changes: 6 additions & 7 deletions ext/DynamicPPLEnzymeCoreExt.jl
@@ -1,16 +1,15 @@
module DynamicPPLEnzymeCoreExt

if isdefined(Base, :get_extension)
using DynamicPPL: DynamicPPL
using EnzymeCore
else
using ..DynamicPPL: DynamicPPL
using ..EnzymeCore
end
using DynamicPPL: DynamicPPL
using EnzymeCore

# Mark is_transformed as having 0 derivative. The `nothing` return value is not significant, Enzyme
# only checks whether such a method exists, and never runs it.
@inline EnzymeCore.EnzymeRules.inactive(::typeof(DynamicPPL.is_transformed), args...) =
nothing
# Likewise for get_range_and_linked.
@inline EnzymeCore.EnzymeRules.inactive(
::typeof(DynamicPPL._get_range_and_linked), args...
) = nothing

end