Commit 28664de

add symbolic ude tutorial
1 parent 395a619 commit 28664de

File tree

2 files changed: +98 -1 lines changed


docs/make.jl

Lines changed: 2 additions & 1 deletion
```diff
@@ -21,7 +21,8 @@ makedocs(;
     pages = [
         "Home" => "index.md",
         "Tutorials" => ["NeuralNetworkBlock" => "nnblock.md"
-            "Friction Model" => "friction.md"],
+            "Friction Model" => "friction.md",
+            "Symbolic UDE Creation" => "symbolic_ude_tutorial.md"],
         "API" => "api.md"
     ]
 )
```

docs/src/symbolic_ude_tutorial.md

Lines changed: 96 additions & 0 deletions

# Symbolic UDE Creation

This tutorial demonstrates a simple interface for symbolically declaring neural networks that can be added directly to ModelingToolkit-declared ODE models to create UDEs (universal differential equations). For our example we will use a simple self-activation loop model; however, the approach generalises easily to other model types.

### Ground truth model and synthetic data generation

First we create the ground-truth model using ModelingToolkit. In it, `Y` activates `X` at the rate `v * (Y^n) / (K^n + Y^n)`. Later on, we will attempt to learn this rate using a neural network. Both variables decay at rates that scale with the parameter `d`.
```@example symbolic_ude
using ModelingToolkit
using ModelingToolkit: t_nounits as t, D_nounits as D
@variables X(t) Y(t)
@parameters v = 1.0 K = 1.0 n = 1.0 d = 1.0 # Default values are unused here, but potentially useful as optimization initial guesses.
eqs = [
    D(X) ~ v * (Y^n) / (K^n + Y^n) - d*X
    D(Y) ~ X - d*Y
]
@mtkcompile xy_model = System(eqs, t)
```
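
The compiled system can be inspected directly; for example, `equations(xy_model)` (a standard ModelingToolkit accessor) lists its equations:

```@example symbolic_ude
equations(xy_model)
```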

Next, we simulate our model for a true parameter set (which we wish to recover).
```@example symbolic_ude
using OrdinaryDiffEqDefault, Plots
u0 = [X => 2.0, Y => 0.1]
ps_true = [v => 1.1, K => 2.0, n => 3.0, d => 0.5]
sim_cond = [u0; ps_true]
tend = 45.0
oprob_true = ODEProblem(xy_model, sim_cond, (0.0, tend))
sol_true = solve(oprob_true)
plot(sol_true; lw = 6, idxs = [X, Y])
```

Finally, we generate noisy measured samples of both `X` and `Y` (to which we will fit the UDE).
```@example symbolic_ude
sample_t = range(0.0, tend; length = 20)
sample_X = [(0.8 + 0.4rand()) * X for X in sol_true(sample_t; idxs = X)]
sample_Y = [(0.8 + 0.4rand()) * Y for Y in sol_true(sample_t; idxs = Y)]
plot!(sample_t, sample_X, seriestype = :scatter, label = "X (data)", color = 1, ms = 6, alpha = 0.7)
plot!(sample_t, sample_Y, seriestype = :scatter, label = "Y (data)", color = 2, ms = 6, alpha = 0.7)
```
### UDE declaration and training

First, we use Lux.jl to declare the neural network we wish to use for our UDE. For this case, a fairly small network suffices. We use `softplus` throughout the network to ensure that the fitted UDE function is positive (which holds for our application, but need not hold in general).
```@example symbolic_ude
using Lux
nn_arch = Lux.Chain(
    Lux.Dense(1 => 3, Lux.softplus, use_bias = false),
    Lux.Dense(3 => 3, Lux.softplus, use_bias = false),
    Lux.Dense(3 => 1, Lux.softplus, use_bias = false)
)
```
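
As an optional aside, we can sanity-check this architecture by initialising it with Lux and evaluating it on a sample input (the helper names `rng`, `ps_nn`, and `st_nn` below are ours, not part of the workflow); the final `softplus` layer guarantees a positive output:

```@example symbolic_ude
using Random
rng = Random.default_rng()
ps_nn, st_nn = Lux.setup(rng, nn_arch) # random initial weights; `Dense` layers are stateless
y_out, _ = nn_arch([0.5], ps_nn, st_nn) # 1-element input => 1-element output
y_out
```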

Next, we can use ModelingToolkitNeuralNets to turn our neural network into a symbolic neural network representation (which can later be inserted into a ModelingToolkit model).
```@example symbolic_ude
using ModelingToolkitNeuralNets
sym_nn, θ = SymbolicNeuralNetwork(; nn_p_name = :θ, chain = nn_arch, n_input = 1, n_output = 1)
sym_nn_func(x) = sym_nn([x], θ)[1]
```
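
Calling `sym_nn_func` on a symbolic variable should return a symbolic expression rather than a number, which is what lets it appear directly inside model equations; we can verify this:

```@example symbolic_ude
sym_nn_func(Y)
```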

Now we can create our UDE. We replace the (from now on unknown) function `v * (Y^n) / (K^n + Y^n)` with our symbolic neural network (which we let be a function of the variable `Y` only).
```@example symbolic_ude
eqs_ude = [
    D(X) ~ sym_nn_func(Y) - d*X
    D(Y) ~ X - d*Y
]
@mtkcompile xy_model_ude = System(eqs_ude, t)
```

We can now fit our UDE model (including both the neural network and the parameter `d`) to the data. First, we define a loss function which compares the UDE's simulation to the data; the version below is a minimal sum-of-squared-errors implementation (one reasonable choice among many).
```@example symbolic_ude
# Minimal sum-of-squared-errors loss. `p` is the optimised parameter vector
# ([d, θ...], matching the `set_ps` setter defined in the next block); the
# remaining inputs arrive via the `loss_params` tuple, also defined below.
function loss(p, (oprob_base, set_ps, sample_t, sample_X, sample_Y))
    new_oprob = remake(oprob_base; p = set_ps(oprob_base, p))
    sol = solve(new_oprob; saveat = sample_t)
    sum(abs2, sol[X] .- sample_X) + sum(abs2, sol[Y] .- sample_Y)
end
```

Next, we use Optimization.jl to create an `OptimizationProblem`. This uses a similar syntax to normal parameter inference workflows; however, we need to add the entire neural network parameterisation to the optimization parameter vector.
```@example symbolic_ude
using Optimization
oprob_base = ODEProblem(xy_model_ude, u0, (0.0, tend))
set_ps = ModelingToolkit.setp_oop(oprob_base, [d, θ...])
loss_params = (oprob_base, set_ps, sample_t, sample_X, sample_Y)
ps_init = oprob_base.ps[[d, θ...]]
of = OptimizationFunction{true}(loss, AutoForwardDiff())
opt_prob = OptimizationProblem(of, ps_init, loss_params)
```
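
Before optimising, it is worth evaluating the loss once at the initial parameter values; obtaining a finite number confirms that the loss function, the setter, and the data all fit together:

```@example symbolic_ude
loss(ps_init, loss_params)
```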

Finally, we can fit the UDE to our data. We will use the Adam optimizer.
```@example symbolic_ude
import OptimizationOptimisers: Adam
@time opt_sol = solve(opt_prob, Adam(0.01); maxiters = 10000)
```
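
Since `d` was placed first in the optimisation vector, its fitted value is the first entry of `opt_sol.u` (the true value was `0.5`):

```@example symbolic_ude
opt_sol.u[1]
```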

By plotting a simulation from our fitted UDE, we can confirm that it reproduces the ground-truth model.
```@example symbolic_ude
oprob_fitted = remake(oprob_base; p = set_ps(oprob_base, opt_sol.u))
sol_fitted = solve(oprob_fitted)
plot!(sol_fitted; lw = 4, la = 0.7, linestyle = :dash, idxs = [X, Y], color = [:blue :red],
    label = ["X (UDE)" "Y (UDE)"])
```
