# Symbolic UDE Creation

This tutorial demonstrates a simple interface for symbolically declared neural networks that can be added directly to [ModelingToolkit.jl](https://github.com/SciML/ModelingToolkit.jl)-declared ODE models to create UDEs. The primary functionality we show is the [`SymbolicNeuralNetwork`](@ref) function; we will demonstrate how it can be incorporated into a full workflow. For our example we use a simple self-activation loop model, but the approach generalises easily to other model types.

### Ground truth model and synthetic data generation

First, we create the ground-truth model using ModelingToolkit. In it, `Y` activates `X` at the rate `v * (Y^n) / (K^n + Y^n)`. Later on, we will attempt to learn this rate using a neural network. Both variables decay at constant rates that scale with the parameter `d`.

```@example symbolic_ude
using ModelingToolkit
using ModelingToolkit: t_nounits as t, D_nounits as D
@variables X(t) Y(t)
@parameters v=1.0 K=1.0 n=1.0 d=1.0 # Default values are unused in the ground-truth simulation, but serve as initial guesses when fitting.
eqs = [D(X) ~ v * (Y^n) / (K^n + Y^n) - d*X
    D(Y) ~ X - d*Y]
@mtkcompile xy_model = System(eqs, t)
```

Next, we simulate our model for a true parameter set (which we wish to recover).

```@example symbolic_ude
using OrdinaryDiffEqTsit5, Plots
u0 = [X => 2.0, Y => 0.1]
ps_true = [v => 1.1, K => 2.0, n => 3.0, d => 0.5]
sim_cond = [u0; ps_true]
tend = 45.0
oprob_true = ODEProblem(xy_model, sim_cond, (0.0, tend))
sol_true = solve(oprob_true, Tsit5())
plot(sol_true; lw = 6, idxs = [X, Y])
```

Finally, we generate noisy measured samples of both `X` and `Y` (to which we will fit the UDE).

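The samples below use `rand()` for the measurement noise. To make them reproducible across runs, one can first seed Julia's global random number generator (an optional step; the seed value is an arbitrary choice, not part of the original workflow):

```@example symbolic_ude
using Random
Random.seed!(1234) # Any fixed integer makes the sampled noise reproducible.
```
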
```@example symbolic_ude
sample_t = range(0.0, tend; length = 20)
sample_X = [(0.8 + 0.4rand()) * X_sample for X_sample in sol_true(sample_t; idxs = X)]
sample_Y = [(0.8 + 0.4rand()) * Y_sample for Y_sample in sol_true(sample_t; idxs = Y)]
plot!(sample_t, sample_X, seriestype = :scatter,
    label = "X (data)", color = 1, ms = 6, alpha = 0.7)
plot!(sample_t, sample_Y, seriestype = :scatter,
    label = "Y (data)", color = 2, ms = 6, alpha = 0.7)
```

### UDE declaration and training

First, we use [Lux.jl](https://github.com/LuxDL/Lux.jl) to declare the neural network we wish to use for our UDE. For this case, a fairly small network suffices. We use `softplus` throughout the network to ensure that the fitted UDE function is positive (appropriate for our application, although it might not always be true).

```@example symbolic_ude
using Lux
nn_arch = Lux.Chain(
    Lux.Dense(1 => 3, Lux.softplus, use_bias = false),
    Lux.Dense(3 => 3, Lux.softplus, use_bias = false),
    Lux.Dense(3 => 1, Lux.softplus, use_bias = false)
)
```
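
Before wiring the network into a model, it can be sanity-checked on a plain numeric input using Lux's standard setup (an optional check; `ps_nn`, `st_nn`, and the test input `0.5f0` are illustrative names and values, not part of the original workflow):

```@example symbolic_ude
using Random
ps_nn, st_nn = Lux.setup(Random.default_rng(), nn_arch) # Initialise parameters and state.
y_test, _ = nn_arch([0.5f0], ps_nn, st_nn) # Evaluate the chain on a single test input.
y_test # The `softplus` output layer guarantees a positive value.
```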

Next, we can use [ModelingToolkitNeuralNets.jl](https://github.com/SciML/ModelingToolkitNeuralNets.jl) to turn our neural network into a symbolic neural network representation (which can later be inserted into a ModelingToolkit model).

```@example symbolic_ude
using ModelingToolkitNeuralNets
sym_nn, θ = SymbolicNeuralNetwork(; nn_p_name = :θ, chain = nn_arch, n_input = 1, n_output = 1) # The symbolic network and its parameter vector.
sym_nn_func(x) = sym_nn([x], θ)[1] # Convenience wrapper mapping a scalar input to the scalar output.
```
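
Calling the wrapper on a symbolic variable returns a symbolic expression, which is what lets it appear directly in model equations (a quick optional check):

```@example symbolic_ude
sym_nn_func(Y) # A symbolic term representing the network applied to `Y`.
```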

Now we can create our UDE. We replace the function `v * (Y^n) / (K^n + Y^n)` (from now on treated as unknown) with our symbolic neural network (which we make a function of the variable `Y` only).

```@example symbolic_ude
eqs_ude = [D(X) ~ sym_nn_func(Y) - d*X
    D(Y) ~ X - d*Y]
@mtkcompile xy_model_ude = System(eqs_ude, t)
```

We can now fit our UDE model (including both the neural network and the parameter `d`) to the data. First, we define a loss function which compares the UDE's simulation to the data.

```@example symbolic_ude
using SciMLBase # Provides `successful_retcode` for checking whether a simulation succeeded.
function loss(ps, (oprob_base, set_ps, sample_t, sample_X, sample_Y))
    p = set_ps(oprob_base, ps)
    new_oprob = remake(oprob_base; p)
    new_osol = solve(new_oprob, Tsit5(); saveat = sample_t, verbose = false, maxiters = 10000)
    SciMLBase.successful_retcode(new_osol) || return Inf # Simulation failed -> Inf loss.
    x_error = sum((x_sim - x_data)^2 for (x_sim, x_data) in zip(new_osol[X], sample_X))
    y_error = sum((y_sim - y_data)^2 for (y_sim, y_data) in zip(new_osol[Y], sample_Y))
    return x_error + y_error
end
```

Next, we use [Optimization.jl](https://github.com/SciML/Optimization.jl) to create an `OptimizationProblem`. This uses a similar syntax to normal parameter inference workflows; however, we need to add the entire neural network parameterisation to the optimization parameter vector.

```@example symbolic_ude
using Optimization
oprob_base = ODEProblem(xy_model_ude, u0, (0.0, tend))
set_ps = ModelingToolkit.setp_oop(oprob_base, [d, θ...]) # Out-of-place setter for `d` and the network parameters.
loss_params = (oprob_base, set_ps, sample_t, sample_X, sample_Y)
ps_init = oprob_base.ps[[d, θ...]] # Initial guess: the default `d` and the network's initial parameters.
of = OptimizationFunction{true}(loss, AutoForwardDiff())
opt_prob = OptimizationProblem(of, ps_init, loss_params)
```
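
As a quick sanity check (optional, not part of the original workflow), we can confirm that the loss is finite at the initial guess, i.e. that the initial simulation succeeds:

```@example symbolic_ude
loss(ps_init, loss_params) # A finite value means the initial simulation succeeded.
```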

Finally, we can fit the UDE to our data. We will use the Adam optimizer.

```@example symbolic_ude
import OptimizationOptimisers: Adam
@time opt_sol = solve(opt_prob, Adam(0.01); maxiters = 10000)
```
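
The final loss value is stored in the solution's `objective` field and gives a rough indication of fit quality:

```@example symbolic_ude
opt_sol.objective # Final value of the loss after training.
```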

By plotting a simulation of our fitted UDE on top of the data, we can confirm that it reproduces the ground-truth model.

```@example symbolic_ude
oprob_fitted = remake(oprob_base; p = set_ps(oprob_base, opt_sol.u))
sol_fitted = solve(oprob_fitted, Tsit5())
plot!(sol_fitted; lw = 4, la = 0.7, linestyle = :dash, idxs = [X, Y], color = [:blue :red],
    label = ["X (UDE)" "Y (UDE)"])
```