Commit c019baa

committed
add cross references
1 parent 5d7347a commit c019baa

File tree

1 file changed (+4 −4 lines)


docs/src/symbolic_ude_tutorial.md

Lines changed: 4 additions & 4 deletions
@@ -1,6 +1,6 @@
 # Symbolic UDE Creation
 
-This tutorial will demonstrate a simple interface for the symbolic declaration of neural networks that can be directly added to ModelingToolkit-declared ODE models to create UDEs. The primary functionality we show is the `SymbolicNeuralNetwork` function; we will also show how it can be incorporated into a full workflow. For our example we will use a simple self-activation loop model, but the approach generalises easily to other model types.
+This tutorial will demonstrate a simple interface for the symbolic declaration of neural networks that can be directly added to [ModelingToolkit.jl](https://github.com/SciML/ModelingToolkit.jl)-declared ODE models to create UDEs. The primary functionality we show is the [`SymbolicNeuralNetwork`](@ref) function; we will also show how it can be incorporated into a full workflow. For our example we will use a simple self-activation loop model, but the approach generalises easily to other model types.
 
 ### Ground truth model and synthetic data generation
 
@@ -39,7 +39,7 @@ plot!(sample_t, sample_Y, seriestype = :scatter, label = "Y (data)", color = 2,
 ```
 
 ### UDE declaration and training
-First, we use Lux.jl to declare the neural network we wish to use for our UDE. For this case, we can use a fairly small network. We use `softplus` throughout the network to ensure that the fitted UDE function is positive (this holds for our application, but it might not always be true).
+First, we use [Lux.jl](https://github.com/LuxDL/Lux.jl) to declare the neural network we wish to use for our UDE. For this case, we can use a fairly small network. We use `softplus` throughout the network to ensure that the fitted UDE function is positive (this holds for our application, but it might not always be true).
 ```@example symbolic_ude
 using Lux
 nn_arch = Lux.Chain(
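
The hunk elides the network's hidden layers (old lines 46–48, unchanged by this commit). A fairly small chain of the kind the prose describes might look like the following sketch; the layer widths here are assumptions for illustration, not the tutorial's actual architecture:

```julia
using Lux

# A minimal 1-input/1-output chain with softplus throughout, so the
# fitted function stays positive. Layer widths are assumed, not taken
# from the tutorial.
nn_arch = Lux.Chain(
    Lux.Dense(1 => 3, softplus),
    Lux.Dense(3 => 3, softplus),
    Lux.Dense(3 => 1, softplus)
)
```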
@@ -49,7 +49,7 @@ nn_arch = Lux.Chain(
 )
 ```
 
-Next, we can use ModelingToolkitNeuralNets to turn our neural network into a symbolic neural network representation (which can later be inserted into a ModelingToolkit model).
+Next, we can use [ModelingToolkitNeuralNets.jl](https://github.com/SciML/ModelingToolkitNeuralNets.jl) to turn our neural network into a symbolic neural network representation (which can later be inserted into a ModelingToolkit model).
 ```@example symbolic_ude
 using ModelingToolkitNeuralNets
 sym_nn, θ = SymbolicNeuralNetwork(; nn_p_name = :θ, chain = nn_arch, n_input = 1, n_output = 1)
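
For intuition about what the symbolic representation wraps, the underlying Lux chain can be initialised and evaluated numerically on its own. A minimal sketch, with names assumed for illustration rather than taken from the tutorial:

```julia
using Lux, Random

# Set up and evaluate the raw Lux chain numerically. This is the same
# 1-in/1-out map that the symbolic wrapper exposes to ModelingToolkit.
rng = Random.default_rng()
ps, st = Lux.setup(rng, nn_arch)
y, _ = nn_arch([0.5], ps, st)  # length-1 input vector => length-1 output
```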
@@ -78,7 +78,7 @@ function loss(ps, (oprob_base, set_ps, sample_t, sample_X, sample_Y))
 end
 ```
 
-Next, we use Optimization.jl to create an `OptimizationProblem`. This uses a syntax similar to normal parameter inference workflows; however, we need to add the entire neural network parameterisation to the optimization parameter vector.
+Next, we use [Optimization.jl](https://github.com/SciML/Optimization.jl) to create an `OptimizationProblem`. This uses a syntax similar to normal parameter inference workflows; however, we need to add the entire neural network parameterisation to the optimization parameter vector.
 ```@example symbolic_ude
 using Optimization
 oprob_base = ODEProblem(xy_model_ude, u0, (0.0, tend))
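
The hunk stops before the `OptimizationProblem` itself is assembled. A plausible continuation, sketched under assumptions (the AD backend, initial vector `ps_init`, optimiser, and iteration count are not from the tutorial), would be:

```julia
using Optimization, OptimizationOptimisers

# Hypothetical continuation sketch: wrap the tutorial's `loss` with an AD
# backend, seed it with an initial parameter vector (the ODE parameters
# plus the network parameterisation θ), and optimise. `ps_init` and the
# optimiser settings are assumptions for illustration.
optf = OptimizationFunction(loss, Optimization.AutoForwardDiff())
oprob = OptimizationProblem(optf, ps_init,
    (oprob_base, set_ps, sample_t, sample_X, sample_Y))
osol = solve(oprob, Adam(0.01); maxiters = 10_000)
```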
