
Commit 27614b1

Clean up SymbolicNeuralNetwork docs
1 parent 6bdf82a

1 file changed: +17, -5 lines changed

docs/src/symbolic_ude_tutorial.md

Lines changed: 17 additions & 5 deletions
@@ -60,7 +60,7 @@ Next, we can use [ModelingToolkitNeuralNets.jl](https://github.com/SciML/Modelin
 using ModelingToolkitNeuralNets
 sym_nn,
 θ = SymbolicNeuralNetwork(; nn_p_name = :θ, chain = nn_arch, n_input = 1, n_output = 1)
-sym_nn_func(x) = sym_nn([x], θ)[1]
+sym_nn_func(x) = sym_nn(x, θ)[1]
 ```

 Now we can create our UDE. We replace the (from now on unknown) function `v * (Y^n) / (K^n + Y^n)` with our symbolic neural network (which we let be a function of the variable `Y` only).
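(For orientation, the UDE model construction the context line above refers to sits outside this hunk. Below is a minimal hypothetical sketch of what such a construction could look like; the right-hand sides are illustrative placeholders, and the only detail taken from the text is that the Hill term is replaced by `sym_nn_func(Y)`.)

```julia
# Hypothetical sketch, not the tutorial's actual model: the diff only tells us
# that v * (Y^n) / (K^n + Y^n) is swapped for the symbolic neural network,
# applied to Y alone. All other terms here are placeholders.
using ModelingToolkit
using ModelingToolkit: t_nounits as t, D_nounits as D

@variables X(t) Y(t)
@parameters d
eqs = [D(X) ~ sym_nn_func(Y) - d * X,  # NN term replaces the unknown Hill function
       D(Y) ~ X - d * Y]               # placeholder dynamics
@named xy_model_ude = ODESystem(eqs, t)
xy_model_ude = structural_simplify(xy_model_ude)
```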
@@ -77,7 +77,7 @@ We can now fit our UDE model (including the neural network and the parameter d)
 function loss(ps, (oprob_base, set_ps, sample_t, sample_X, sample_Y))
     p = set_ps(oprob_base, ps)
     new_oprob = remake(oprob_base; p)
-    new_osol = solve(new_oprob, Tsit5(); saveat = sample_t, verbose = false, maxiters = 10000)
+    new_osol = solve(new_oprob, Tsit5(); saveat = sample_t, verbose = false)
     SciMLBase.successful_retcode(new_osol) || return Inf # Simulation failed -> Inf loss.
     x_error = sum((x_sim - x_data)^2 for (x_sim, x_data) in zip(new_osol[X], sample_X))
     y_error = sum((y_sim - y_data)^2 for (y_sim, y_data) in zip(new_osol[Y], sample_Y))
@@ -89,11 +89,12 @@ Next, we use [Optimization.jl](https://github.com/SciML/Optimization.jl) to crea

 ```@example symbolic_ude
 using Optimization
+using SymbolicIndexingInterface: setp_oop
 oprob_base = ODEProblem(xy_model_ude, u0, (0.0, tend))
-set_ps = ModelingToolkit.setp_oop(oprob_base, [d, θ...])
+set_ps = setp_oop(oprob_base, [d; θ])
 loss_params = (oprob_base, set_ps, sample_t, sample_X, sample_Y)
-ps_init = oprob_base.ps[[d, θ...]]
-of = OptimizationFunction{true}(loss, AutoForwardDiff())
+ps_init = oprob_base.ps[[d; θ]]
+of = OptimizationFunction(loss, AutoForwardDiff())
 opt_prob = OptimizationProblem(of, ps_init, loss_params)
 ```
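(The fitting call that produces `oprob_fitted` lies outside this hunk. As a hedged sketch, assuming an Optimization.jl-compatible optimizer such as `Adam` from OptimizationOptimisers, it might look as follows; the optimizer choice and its settings are assumptions, not taken from the diff.)

```julia
# Assumed fitting step, for illustration only; the tutorial's actual optimizer
# and settings are not shown in this diff.
using OptimizationOptimisers
opt_sol = solve(opt_prob, Adam(0.001); maxiters = 10_000)
# Rebuild the problem with the fitted parameters, mirroring how the loss
# function applies set_ps and remake:
oprob_fitted = remake(oprob_base; p = set_ps(oprob_base, opt_sol.u))
```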

@@ -112,3 +113,14 @@ sol_fitted = solve(oprob_fitted, Tsit5())
 plot!(sol_true; lw = 4, la = 0.7, linestyle = :dash, idxs = [X, Y], color = [:blue :red],
     label = ["X (UDE)" "Y (UDE)"])
 ```
+
+We can also inspect what the function described by the neural network looks like and how it compares
+to the known correct function:
+```@example symbolic_ude
+true_func(y) = 1.1 * (y^3) / (2^3 + y^3)
+fitted_func(y) = oprob_fitted.ps[sym_nn](y, oprob_fitted.ps[θ])[1]
+
+# Plot the true and fitted functions (the fit mostly recovers the correct function, but is less accurate in some regions).
+plot(true_func, 0.0, 5.0; lw = 8, label = "True function", color = :lightblue)
+plot!(fitted_func, 0.0, 5.0; lw = 6, label = "Fitted function", color = :blue, la = 0.7, linestyle = :dash)
+```
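As a small companion to the visual comparison, the discrepancy could also be quantified numerically; a sketch (not part of the diff) reusing `true_func` and `fitted_func` from above:

```julia
# Sketch: measure the worst-case gap between the true and fitted functions
# on the plotted interval [0, 5].
ys = range(0.0, 5.0; length = 101)
max_dev = maximum(abs(true_func(y) - fitted_func(y)) for y in ys)
println("Maximum absolute deviation on [0, 5]: ", max_dev)
```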
