This repository was archived by the owner on Sep 28, 2024. It is now read-only.

Commit ca71475

Merge pull request #13 from yuehhua/doc

Improve documentation and README

2 parents 0c30786 + e7abd60

3 files changed: 122 additions, 28 deletions

README.md

Lines changed: 78 additions & 9 deletions
````diff
@@ -1,7 +1,5 @@
 # NeuralOperators
 
-## Source code status
-
 | **Documentation** | **Build Status** |
 |:-----------------:|:----------------:|
 | [![doc dev badge]][doc dev link] | [![ci badge]][ci link] [![codecov badge]][codecov link] |
@@ -14,11 +12,82 @@
 [codecov badge]: https://codecov.io/gh/foldfelis/NeuralOperators.jl/branch/master/graph/badge.svg?token=JQH3MP1Y9R
 [codecov link]: https://codecov.io/gh/foldfelis/NeuralOperators.jl
 
-[Neural Operator](https://github.com/zongyi-li/graph-pde) is a novel deep learning method to learn the mapping
-between infinite-dimensional spaces of functions, introduced by [Zongyi Li](https://github.com/zongyi-li) et al.
+A neural operator is a novel deep learning architecture. It learns an operator, which is a mapping
+between infinite-dimensional function spaces. It can be used to solve [partial differential equations (PDEs)](https://en.wikipedia.org/wiki/Partial_differential_equation).
+Instead of relying on the finite element method, a PDE problem can be solved by training a neural network that learns an operator
+mapping from the infinite-dimensional space (u, t) to the infinite-dimensional space f(u, t). A neural operator learns a continuous mapping
+between two continuous function spaces. The kernel can be trained on different geometries, which are learned from a graph.
+
+A Fourier neural operator learns a neural operator with a Dirichlet kernel to form a Fourier transform. It applies the Fourier transform across infinite-dimensional function spaces and learns better than a plain neural operator.
+
+Currently, `FourierOperator` is provided by this package.
+
+## Usage
+
+```julia
+function FourierNeuralOperator()
+    modes = (16, )
+    ch = 64 => 64
+    σ = gelu
+
+    return Chain(
+        # project finite-dimensional data to infinite-dimensional space
+        Dense(2, 64),
+        # operators project data between infinite-dimensional spaces
+        FourierOperator(ch, modes, σ),
+        FourierOperator(ch, modes, σ),
+        FourierOperator(ch, modes, σ),
+        FourierOperator(ch, modes),
+        # project infinite-dimensional function to finite-dimensional space
+        Dense(64, 128, σ),
+        Dense(128, 1),
+        flatten
+    )
+end
+```
+
+Or you can just call:
+
+```julia
+fno = FourierNeuralOperator()
+```
+
+And then train it as a Flux model:
+
+```julia
+loss(𝐱, 𝐲) = sum(abs2, 𝐲 .- fno(𝐱)) / size(𝐱)[end]
+opt = Flux.Optimiser(WeightDecay(1f-4), Flux.ADAM(1f-3))
+Flux.@epochs 50 Flux.train!(loss, params(fno), data, opt)
+```
+
+## Examples
+
+PDE training examples are provided in the `example` folder.
+
+### One-dimensional Burgers' equation
+
+A [Burgers' equation](https://en.wikipedia.org/wiki/Burgers%27_equation) example can be found in `example/burgers.jl`.
+
+### Two-dimensional Darcy flow equation
+
+WIP
+
+### Two-dimensional Navier-Stokes equation
+
+WIP
+
+## Roadmap
+
+- [x] `FourierOperator` layer
+- [x] One-dimensional Burgers' equation example
+- [ ] Two-dimensional Darcy flow equation example
+- [ ] Two-dimensional Navier-Stokes equation example
+- [ ] `NeuralOperator` layer
+- [ ] Poisson equation example
+
+## References
 
-In this project I temporarily provide the SpectralConv layer and the
-[Fourier Neural Operator](https://github.com/zongyi-li/fourier_neural_operator).
-For more information, please take a look at the
-[Fourier Neural Operator model](src/model.jl) and the [example](example/burgers.jl) of solving
-[Burgers' equation](https://www.wikiwand.com/en/Burgers%27_equation)
+- [Fourier Neural Operator for Parametric Partial Differential Equations](https://arxiv.org/abs/2010.08895)
+- [zongyi-li/fourier_neural_operator](https://github.com/zongyi-li/fourier_neural_operator)
+- [Neural Operator: Graph Kernel Network for Partial Differential Equations](https://arxiv.org/abs/2003.03485)
+- [zongyi-li/graph-pde](https://github.com/zongyi-li/graph-pde)
````
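For orientation, the usage section added above can be exercised end to end roughly as follows. This is a hypothetical sketch, not part of the commit: the `(ch, x, batch)` input layout, the grid size, and the batch size are assumptions inferred from `Dense(2, 64)` and the trailing `flatten`.

```julia
using NeuralOperators, Flux

# Assumed layout: (ch, x, batch). The 2 input channels would hold, e.g.,
# the sampled initial condition u(x) and the grid coordinate x itself.
fno = FourierNeuralOperator()
𝐱 = rand(Float32, 2, 1024, 5)   # 2 channels, 1024 grid points, batch of 5
𝐲̂ = fno(𝐱)                      # Dense(128, 1) + flatten collapse to (1024, 5)
```

A single forward pass like this is a quick way to check that data shapes match the model before starting training.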

docs/src/index.md

Lines changed: 32 additions & 2 deletions
````diff
@@ -9,6 +9,36 @@ Documentation for [NeuralOperators](https://github.com/foldfelis/NeuralOperators
 ```@index
 ```
 
-```@autodocs
-Modules = [NeuralOperators]
+## Layers
+
+### Spectral convolutional layer
+
+```math
+F(s) = \mathcal{F} \{ v(x) \} \\
+F'(s) = g(F(s)) \\
+v'(x) = \mathcal{F}^{-1} \{ F'(s) \}
+```
+
+where ``v(x)`` and ``v'(x)`` denote the input and output functions, and ``\mathcal{F} \{ \cdot \}`` and ``\mathcal{F}^{-1} \{ \cdot \}`` are the Fourier transform and inverse Fourier transform, respectively. The function ``g`` is a linear transform applied on the lower Fourier modes.
+
+```@docs
+SpectralConv
+```
+
+Reference: [Fourier Neural Operator for Parametric Partial Differential Equations](https://arxiv.org/abs/2010.08895)
+
+---
+
+### Fourier operator layer
+
+```math
+v_{t+1}(x) = \sigma(W v_t(x) + \mathcal{K} \{ v_t(x) \} )
 ```
+
+where ``v_t(x)`` is the input function of the ``t``-th layer and ``\mathcal{K} \{ \cdot \}`` denotes the spectral convolutional layer. The activation function ``\sigma`` can be an arbitrary non-linear function.
+
+```@docs
+FourierOperator
+```
+
+Reference: [Fourier Neural Operator for Parametric Partial Differential Equations](https://arxiv.org/abs/2010.08895)
````
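The spectral-convolution formula documented above can be illustrated numerically. The following is a toy sketch, not the package's implementation: the use of `FFTW`'s real FFT, the truncation to the lowest `modes` frequencies, and the weight matrix `W` are all illustrative assumptions.

```julia
using FFTW, LinearAlgebra

# Toy sketch of F'(s) = g(F(s)): transform to Fourier space, apply a linear
# map to the lowest `modes` frequencies, zero the rest, transform back.
function toy_spectral_conv(v::Vector{Float64}, W::Matrix{ComplexF64}, modes::Int)
    F = rfft(v)                      # F(s) = 𝓕{v(x)}
    F′ = zero(F)
    F′[1:modes] = W * F[1:modes]     # g acts only on the lower Fourier modes
    return irfft(F′, length(v))      # v′(x) = 𝓕⁻¹{F′(s)}
end

v = sin.(2π .* (0:63) ./ 64)         # one full sine period on a 64-point grid
W = Matrix{ComplexF64}(I, 8, 8)      # identity weights pass low modes through
v′ = toy_spectral_conv(v, W, 8)      # v′ ≈ v: the sine lives in the kept modes
```

With identity weights the low-frequency signal survives the round trip, which shows why truncating to a few modes is a reasonable parameterization for smooth PDE solutions.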

src/fourier.jl

Lines changed: 12 additions & 17 deletions
````diff
@@ -27,13 +27,13 @@ end
         init=c_glorot_uniform, permuted=false, T=ComplexF32
     )
 
-## SpectralConv
+## Arguments
 
-* ``v(x)``: input
-* ``F``, ``F^{-1}``: Fourier transform, inverse Fourier transform
-* ``L``: linear transform on the lower Fourier modes
-
-``v(x)`` -> ``F`` -> ``L`` -> ``F^{-1}``
+* `ch`: Input and output channel size, e.g. `64=>64`.
+* `modes`: The Fourier modes to be preserved.
+* `σ`: Activation function.
+* `permuted`: Whether the dims are permuted. If `permuted=true`, the layer accepts
+    data in the order of `(..., ch, batch)`; otherwise the order is `(ch, ..., batch)`.
 
 ## Example
 
@@ -109,18 +109,13 @@ end
 """
     FourierOperator(ch, modes, σ=identity; permuted=false)
 
-## FourierOperator
-
-* ``v(x)``: input
-* ``F``, ``F^{-1}``: Fourier transform, inverse Fourier transform
-* ``L``: linear transform on the lower Fourier modes
-* ``D``: local linear transform
+## Arguments
 
-```
-        ┌ F -> L -> F¯¹ ┐
-v(x) -> ┤               ├ -> + -> σ
-        └ D             ┘
-```
+* `ch`: Input and output channel size for spectral convolution, e.g. `64=>64`.
+* `modes`: The Fourier modes to be preserved for spectral convolution.
+* `σ`: Activation function.
+* `permuted`: Whether the dims are permuted. If `permuted=true`, the layer accepts
+    data in the order of `(..., ch, batch)`; otherwise the order is `(ch, ..., batch)`.
 
 ## Example
````