This repository was archived by the owner on Sep 28, 2024. It is now read-only.

Commit 001d1bd: Update link
1 parent 9f4dee1 commit 001d1bd

File tree
1 file changed (+4, -4 lines)


docs/src/introduction.md

Lines changed: 4 additions & 4 deletions
@@ -8,24 +8,24 @@ an operator mapping from infinite-dimensional space ``(u, t)`` to infinite-dimen
 Neural operator learns a continuous function between two continuous function spaces.
 The kernel can be trained on different geometry, including regular Euclidean space or a graph topology.
 
-## [Fourier Neural Operators](https://github.com/SciML/NeuralOperators.jl/blob/master/src/model.jl)
+## [Fourier Neural Operators](https://github.com/SciML/NeuralOperators.jl/blob/main/src/FNO/FNO.jl)
 
 Fourier neural operator (FNO) learns a neural operator with Dirichlet kernel to form a Fourier transformation.
 It performs Fourier transformation across infinite-dimensional function spaces and learns better than neural operator.
 
-## [Markov Neural Operators](https://github.com/SciML/NeuralOperators.jl/blob/master/src/model.jl)
+## [Markov Neural Operators](https://github.com/SciML/NeuralOperators.jl/blob/main/src/FNO/FNO.jl)
 
 Markov neural operator (MNO) learns a neural operator with Fourier operators.
 With only one time step information of learning, it can predict the following few steps with low loss
 by linking the operators into a Markov chain.
 
-## [Deep Operator Network](https://github.com/SciML/NeuralOperators.jl/blob/master/src/DeepONet.jl)
+## [Deep Operator Network](https://github.com/SciML/NeuralOperators.jl/blob/main/src/DeepONet/DeepONet.jl)
 
 Deep operator network (DeepONet) learns a neural operator with the help of two sub-neural network structures described as the branch and the trunk network.
 The branch network is fed the initial conditions data, whereas the trunk is fed with the locations where the target(output) is evaluated from the corresponding initial conditions.
 It is important that the output size of the branch and trunk subnets is same so that a dot product can be performed between them.
 
-## [Nonlinear Manifold Decoders for Operator Learning](https://github.com/SciML/NeuralOperators.jl/blob/master/src/NOMAD.jl)
+## [Nonlinear Manifold Decoders for Operator Learning](https://github.com/SciML/NeuralOperators.jl/blob/main/src/NOMAD/NOMAD.jl)
 
 Nonlinear Manifold Decoders for Operator Learning (NOMAD) learns a neural operator with a nonlinear decoder parameterized by a deep neural network which jointly takes output of approximator and the locations as parameters.
 The approximator network is fed with the initial conditions data. The output-of-approximator and the locations are then passed to a decoder neural network to get the target (output). It is important that the input size of the decoder subnet is sum of size of the output-of-approximator and number of locations.
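The Fourier-layer idea behind the FNO section in the diff above (transform to Fourier space, act on a truncated set of low-frequency modes with learned weights, transform back) can be sketched as follows. This is a minimal NumPy illustration of the concept, not the NeuralOperators.jl API; the function name `spectral_conv_1d`, the weight shapes, and the mode count are hypothetical.

```python
import numpy as np

def spectral_conv_1d(u, weights, modes):
    """One spectral convolution: FFT, keep the lowest `modes` frequencies,
    multiply them by learned complex weights, and inverse-FFT back.
    Hypothetical sketch, not the package's Julia implementation."""
    u_hat = np.fft.rfft(u)                     # to Fourier space
    out_hat = np.zeros_like(u_hat)
    out_hat[:modes] = weights * u_hat[:modes]  # act only on retained modes
    return np.fft.irfft(out_hat, n=len(u))     # back to physical space

rng = np.random.default_rng(0)
u = np.sin(np.linspace(0.0, 2.0 * np.pi, 64))           # sampled input function
w = rng.standard_normal(8) + 1j * rng.standard_normal(8)  # learned weights (random here)
v = spectral_conv_1d(u, w, modes=8)
assert v.shape == u.shape
```

Because the weights act pointwise on Fourier coefficients, the same learned layer can be evaluated on any discretization of the input function, which is what makes the operator resolution-independent.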
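The branch/trunk combination described in the DeepONet section of the diff (branch encodes the sampled initial condition, trunk encodes the query locations, and the two are combined by a dot product over a shared latent dimension) can be sketched like this. A hedged NumPy illustration with hypothetical names and random stand-ins for the two subnetworks' outputs, not the package API:

```python
import numpy as np

def deeponet_combine(branch_out, trunk_out):
    """Dot product over the shared latent dimension: one scalar prediction
    per query location. Hypothetical sketch of the combination step only."""
    assert branch_out.shape[-1] == trunk_out.shape[-1]  # latent sizes must match
    return trunk_out @ branch_out

rng = np.random.default_rng(0)
p = 32                                   # shared latent width
branch = rng.standard_normal(p)          # stand-in for branch-net output
trunk = rng.standard_normal((100, p))    # stand-in for trunk-net output at 100 locations
preds = deeponet_combine(branch, trunk)
assert preds.shape == (100,)
```

The shape check makes explicit the constraint stated in the text: the branch and trunk output sizes must agree, or the dot product is undefined.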

0 commit comments