Contact emails
[email protected], [email protected], [email protected]
Project summary
Learning mappings between function spaces with neural operators
Project description
NeuralOperator is a comprehensive library for learning neural operators in PyTorch. It is the official implementation of the Fourier Neural Operator and the Tensorized Neural Operator.
Unlike conventional neural networks, neural operators learn mappings between function spaces, and the library provides all of the tools to do so on your own data.
Neural operators are also resolution-invariant, so a trained operator can be applied to data at any resolution.
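For reference, a minimal usage sketch is shown below. It assumes the FNO class exposed under neuralop.models with n_modes, hidden_channels, in_channels, and out_channels arguments; exact names may differ between releases, so the interactive examples linked under Documentation are authoritative.

```python
import torch
from neuralop.models import FNO  # assumed import path; check the installed version

# Hypothetical 2D example: map input functions sampled on a 64x64 grid
# to output functions on the same grid.
model = FNO(
    n_modes=(16, 16),    # number of Fourier modes kept per spatial dimension
    hidden_channels=64,  # width of the operator's hidden layers
    in_channels=1,
    out_channels=1,
)

x = torch.randn(4, 1, 64, 64)  # batch of 4 input functions
y = model(x)                   # output has the same spatial shape as the input
print(y.shape)                 # torch.Size([4, 1, 64, 64])
```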
Are there any other projects in the PyTorch Ecosystem similar to yours? If yes, what are they?
N/A.
Project repo URL
https://github.com/neuraloperator/neuraloperator
Additional repos in scope of the application
No response
Project license
MIT
GitHub handles of the project maintainer(s)
JeanKossaifi, dhpitt, vduruiss, kovachki, zongyi-li, animakumar
Is there a corporate or academic entity backing this project? If so, please provide the name and URL of the entity.
No response
Website URL
https://neuraloperator.github.io/dev/index.html
Documentation
Interactive examples are available at https://neuraloperator.github.io/dev/auto_examples/index.html
User guide: https://neuraloperator.github.io/dev/user_guide/index.html
Theory guide on neural operators: https://neuraloperator.github.io/dev/theory_guide/index.html
How do you build and test the project today (continuous integration)? Please describe.
Unit tests are run on every push and pull request via GitHub Actions, using pytest.
Version of PyTorch
We maintain backward compatibility with older releases but always aim to stay on the cutting edge of PyTorch versions to benefit from the latest improvements and features.
Components of PyTorch
All of the core machinery: the nn module, but also linear algebra, the Fourier transform, and related features. We build a class of neural architectures that map between function spaces and can take inputs at any discretization, on both regular grids and unstructured grids, so we build much of the machinery from the base components in PyTorch (see the sketch below).
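As an illustration of the kind of building block assembled from these base components, here is a minimal, hypothetical sketch of a spectral convolution layer written directly against torch.fft and nn.Module. It is not the library's actual implementation, only an indication of how the Fourier-domain machinery is composed from core PyTorch.

```python
import torch
import torch.nn as nn

class TinySpectralConv2d(nn.Module):
    """Illustrative spectral convolution (not the library's implementation)."""

    def __init__(self, in_channels, out_channels, modes1, modes2):
        super().__init__()
        scale = 1.0 / (in_channels * out_channels)
        # Learnable complex weights for the retained low-frequency Fourier modes.
        self.weight = nn.Parameter(
            scale * torch.randn(in_channels, out_channels, modes1, modes2,
                                dtype=torch.cfloat)
        )
        self.modes1, self.modes2 = modes1, modes2

    def forward(self, x):
        # x: (batch, in_channels, height, width)
        x_ft = torch.fft.rfft2(x)  # transform to the Fourier domain
        out_ft = torch.zeros(
            x.shape[0], self.weight.shape[1], x.shape[-2], x.shape[-1] // 2 + 1,
            dtype=torch.cfloat, device=x.device,
        )
        # Multiply only the retained low-frequency modes by the learned weights.
        out_ft[:, :, :self.modes1, :self.modes2] = torch.einsum(
            "bixy,ioxy->boxy",
            x_ft[:, :, :self.modes1, :self.modes2],
            self.weight,
        )
        # Back to the spatial domain at the input's own resolution.
        return torch.fft.irfft2(out_ft, s=x.shape[-2:])
```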
How long do you expect to maintain the project?
We have been maintaining the project for several years and plan to continue doing so, particularly as the community and user base grow.
Additional information
No response