
Commit 7181d7c: Update 11/03

Author: Pedro Paulo
1 parent: f3384ae

File tree

1 file changed: +1 -1 lines changed


class12/class12.md

Lines changed: 1 addition & 1 deletion
@@ -33,7 +33,7 @@ When putting into a computer we are going to need to mesh our function, otherwis

  - In case of regression, the output **has to** have a fixed dimension; the need for a different dimension leads to a new NN and a new training.
  For the case of image processing, where there is no trivial underlying function behind the image, we cannot take advantage of Neural Operators. But in the case of distributions of physical quantities, e.g., temperature, where there is an underlying function behind the data, we can leverage Neural Operators to learn that distribution function and make predictions/controls based on it, decoupling the parametrization $\Theta$ from the discretization of the data. \[cite] *et al.* compared the errors of two networks, U-Net (NN topology) and PCA-Net (Neural Operator topology), trained on different discretizations of the *same underlying function*; the result is shown below:

- ![Alt text](Figures/diagram.png)
+ ![Alt text](Figures/unetvspca.png)

  This brings a concept (that we'll try to keep with our definition of Neural Operators) called **Discretization Invariance**:
  - When we have Discretization Invariance, we decouple the parameters and the cost from the discretization, i.e., the error doesn't vary when we change the discretization.
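To make the discretization-invariance idea in the diff above concrete, here is a minimal toy sketch (hypothetical code, not part of the commit or the notes): the operator's parameters $\Theta$ act on a fixed number of Fourier modes of the input, so the same model can be applied to the same underlying function meshed at 64 or 256 points, and the error does not change with the discretization.

```python
# Hypothetical toy example: a "neural-operator-like" map whose parameters act on
# a fixed number of Fourier modes, so it can be evaluated on any discretization.
import numpy as np

rng = np.random.default_rng(0)
theta = rng.normal(size=8)  # the parametrization Theta: 8 mode weights, mesh-independent

def toy_operator(u_samples: np.ndarray) -> np.ndarray:
    """Multiply the lowest Fourier modes of u by theta; works for any grid size."""
    n = len(u_samples)
    u_hat = np.fft.rfft(u_samples)
    k = min(len(theta), len(u_hat))
    u_hat[:k] = u_hat[:k] * theta[:k]   # same Theta regardless of the mesh
    return np.fft.irfft(u_hat, n)

def rel_error(pred: np.ndarray, ref: np.ndarray) -> float:
    return float(np.linalg.norm(pred - ref) / np.linalg.norm(ref))

# Same underlying function u(x) = sin(2*pi*x), meshed at two different resolutions.
for n in (64, 256):
    x = np.linspace(0.0, 1.0, n, endpoint=False)
    u = np.sin(2 * np.pi * x)
    out = toy_operator(u)
    exact = theta[1] * np.sin(2 * np.pi * x)  # exact action of the multiplier on mode 1
    print(f"n = {n:4d}, relative error = {rel_error(out, exact):.2e}")
```

A plain regression NN with a fixed input size, by contrast, would have to be rebuilt and retrained for each new discretization, which is exactly the coupling the notes describe.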
