# Symbolic UDE Creation

This tutorial demonstrates a simple interface for the symbolic declaration of neural networks that can be added directly to [ModelingToolkit.jl](https://github.com/SciML/ModelingToolkit.jl)-declared ODE models to create UDEs. The primary functionality we show is the [`SymbolicNeuralNetwork`](@ref) function, and we demonstrate how it can be incorporated into a full workflow. For our example we use a simple self-activation loop model; however, the approach is easily generalised to other model types.

### Ground truth model and synthetic data generation
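
As a concrete starting point, here is a minimal sketch of a self-activation loop model and noisy synthetic data generated from it; the equations, parameter values, noise level, and sampling grid shown are illustrative assumptions rather than the tutorial's exact choices.

```@example symbolic_ude
using ModelingToolkit, OrdinaryDiffEq, Plots
using ModelingToolkit: t_nounits as t, D_nounits as D
using StableRNGs

# A hypothetical self-activation loop: X activates its own production (Hill
# kinetics) and also drives the production of Y. All values are illustrative.
@parameters v0=0.1 v=1.0 K=0.5 d=1.0
@variables X(t) Y(t)
eqs = [
    D(X) ~ v0 + v * X^2 / (K^2 + X^2) - d * X,
    D(Y) ~ X - d * Y
]
@mtkbuild true_model = ODESystem(eqs, t)

# Simulate the ground truth and sample noisy data points from it.
oprob = ODEProblem(true_model, [X => 0.1, Y => 0.1], (0.0, 10.0))
true_sol = solve(oprob, Tsit5())

rng = StableRNG(1)
sample_t = range(0.0, 10.0; length = 20)
sample_X = [true_sol(ti; idxs = X) * (1 + 0.05randn(rng)) for ti in sample_t]
sample_Y = [true_sol(ti; idxs = Y) * (1 + 0.05randn(rng)) for ti in sample_t]

plot(true_sol; label = ["X (true)" "Y (true)"])
plot!(sample_t, sample_X, seriestype = :scatter, label = "X (data)", color = 1)
plot!(sample_t, sample_Y, seriestype = :scatter, label = "Y (data)", color = 2)
```
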
### UDE declaration and training

First, we use [Lux.jl](https://github.com/LuxDL/Lux.jl) to declare the neural network we wish to use for our UDE. For this case, a fairly small network suffices. We use `softplus` throughout the network to ensure that the fitted UDE function is positive (this holds for our application, but it need not in general).

```@example symbolic_ude
using Lux
nn_arch = Lux.Chain(
    # The hidden layers are elided in the source; the layer sizes below are an
    # illustrative reconstruction of a small softplus network, not necessarily
    # the exact architecture used in the tutorial.
    Lux.Dense(1 => 4, Lux.softplus),
    Lux.Dense(4 => 4, Lux.softplus),
    Lux.Dense(4 => 1, Lux.softplus)
)
```

Next, we can use [ModelingToolkitNeuralNets.jl](https://github.com/SciML/ModelingToolkitNeuralNets.jl) to turn our neural network into a symbolic neural network representation (which can later be inserted into a ModelingToolkit model).
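
A rough sketch of what this step can look like is shown below. The keyword arguments, the two-value return, and the way the network is spliced into the equations are all assumptions here; consult the [`SymbolicNeuralNetwork`](@ref) API reference for the precise signature.

```@example symbolic_ude
using ModelingToolkitNeuralNets

# Hypothetical call: wrap the Lux chain as a symbolic function plus a symbolic
# parameter vector. The keyword names and return values are assumptions.
sym_nn, θ_nn = SymbolicNeuralNetwork(; chain = nn_arch, n_input = 1, n_output = 1, rng)

# The symbolic network can then appear directly in equations, e.g. replacing
# an unknown production term with the network's output (illustrative only):
ude_eqs = [
    D(X) ~ v0 + v * X^2 / (K^2 + X^2) - d * X,
    D(Y) ~ sym_nn([X], θ_nn)[1] - d * Y
]
```
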

Next, we use [Optimization.jl](https://github.com/SciML/Optimization.jl) to create an `OptimizationProblem`. This uses a similar syntax to normal parameter inference workflows; however, we need to add the entire neural network parameterisation to the optimization parameter vector.
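
To make this concrete, here is an illustrative sketch. It assumes the preceding (omitted) steps produced a UDE `ODEProblem` called `ude_prob`, built with a plain parameter vector, and an initial vector `θ_init` containing both the model and network parameters; the loss simply compares simulated trajectories to the sampled data.

```@example symbolic_ude
using Optimization, OptimizationOptimisers

# Illustrative loss: simulate the UDE with candidate parameters θ and compare
# the result against the synthetic samples. `ude_prob` and `θ_init` are
# assumed to come from the (omitted) model-assembly steps above.
function loss(θ, _)
    new_prob = remake(ude_prob; p = θ)
    sol = solve(new_prob, Tsit5(); saveat = sample_t)
    SciMLBase.successful_retcode(sol) || return convert(eltype(θ), Inf)
    return sum(abs2, sol[X] .- sample_X) + sum(abs2, sol[Y] .- sample_Y)
end

opt_fn = OptimizationFunction(loss, Optimization.AutoForwardDiff())
opt_prob = OptimizationProblem(opt_fn, θ_init)
opt_sol = solve(opt_prob, Adam(0.01); maxiters = 1000)
```
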