We can use DiffEqFlux.jl to define, train, and output the densities computed by CNF layers. As with a neural ODE, the layer takes a neural network that defines its derivative function (see [1] for a reference). A possible way to define a CNF layer would be:
```@example cnf2
using Flux, DiffEqFlux, DifferentialEquations, Optimization, OptimizationFlux,
      OptimizationOptimJL, Distributions
```
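One possible layer construction is sketched below. This is illustrative, not the tutorial's exact code: the particular `Chain` architecture, the timespan values, and the `FFJORD(nn, tspan, solver)` constructor form are assumptions based on this era of the DiffEqFlux API.

```@example cnf2
# A sketch of a CNF layer definition (the architecture here is illustrative):
# a small neural network defines the dynamics, and FFJORD wraps it as a CNF
# layer whose ODE is solved over the given timespan with Tsit5.
nn = Chain(Dense(1, 3, tanh), Dense(3, 1))
tspan = (0.0f0, 10.0f0)
ffjord_mdl = FFJORD(nn, tspan, Tsit5())
```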
where we also pass as an input the desired timespan for which the differential equation is solved.
### Training
First, let's get an array from a normal distribution as the training data. Note that we want the data in `Float32` values to match how we have set up the neural network weights and the state space of the ODE.
```@example cnf2
data_dist = Normal(6.0f0, 0.7f0)
train_data = Float32.(rand(data_dist, 1, 100))
```
Now we define a loss function that we wish to minimize:
```@example cnf2
function loss(θ)
    logpx, λ₁, λ₂ = ffjord_mdl(train_data, θ)
    -mean(logpx)
end
```
We then train the neural network to learn the distribution of `x`.
Here we showcase starting the optimization with `ADAM` to more quickly find a minimum, and then honing in on the minimum by using `LBFGS`.
To evaluate the result, we can use the `totalvariation` function from `Distances.jl`. First, we compute densities using the actual distribution and the FFJORD model; then we apply the distance function.
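A minimal sketch of this training and evaluation loop is given below. It assumes the `OptimizationFunction`/`OptimizationProblem` workflow from this era of Optimization.jl and that the layer exposes its initial parameters as `ffjord_mdl.p`; the learning rate, iteration count, and normalization are illustrative choices.

```@example cnf2
using Distances

adtype = Optimization.AutoZygote()
optf = Optimization.OptimizationFunction((x, p) -> loss(x), adtype)

# Assumes the layer exposes its initial parameters as `ffjord_mdl.p`.
optprob = Optimization.OptimizationProblem(optf, ffjord_mdl.p)

# Stage 1: ADAM quickly moves toward a minimum.
res1 = Optimization.solve(optprob, ADAM(0.1); maxiters = 100)

# Stage 2: LBFGS hones in on the minimum, starting from ADAM's result.
optprob2 = Optimization.OptimizationProblem(optf, res1.u)
res2 = Optimization.solve(optprob2, LBFGS(); allow_f_increases = false)

# Evaluation: compare the learned density to the true density using the
# total variation distance, normalized by the number of samples.
actual_pdf = pdf.(data_dist, train_data)
learned_pdf = exp.(ffjord_mdl(train_data, res2.u)[1])
train_dis = totalvariation(learned_pdf, actual_pdf) / size(train_data, 2)
```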