🚀 HyperNOs: Hyperparameter Optimization for Neural Operators #2053
MaxGhi8 started this conversation in Show and tell
Hi DeepXDE community,
I wanted to share an open-source project I've been working on: HyperNOs, a library focused on hyperparameter optimization for neural operators, which now includes examples built with DeepXDE. I implemented several hyperparameter optimization techniques and adapted them to DeepXDE architectures, with practical examples and end-to-end usage tutorials. In particular, there are examples for the DeepONet, POD-DeepONet, MIONet, and POD-MIONet architectures.
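To give a rough idea of the kind of workflow this enables, here is a minimal sketch (not HyperNOs' actual API) of tuning a DeepXDE DeepONet: it wraps a `DeepONetCartesianProd` model in an Optuna objective and searches over width, depth, and learning rate. The search space, the synthetic data shapes, and the Optuna dependency are all assumptions made purely for illustration.

```python
# Hypothetical sketch: hyperparameter search over a DeepXDE DeepONet with
# Optuna. This is NOT the HyperNOs API, only an illustration of the pattern.
import numpy as np
import optuna
import deepxde as dde

# Synthetic operator-learning data, just to make the example self-contained:
# branch inputs sampled at m sensors, trunk inputs at n_pts query points.
m, dim, n_train, n_test, n_pts = 50, 1, 100, 20, 40
rng = np.random.default_rng(0)
X_train = (rng.standard_normal((n_train, m)).astype(np.float32),
           rng.uniform(size=(n_pts, dim)).astype(np.float32))
y_train = rng.standard_normal((n_train, n_pts)).astype(np.float32)
X_test = (rng.standard_normal((n_test, m)).astype(np.float32), X_train[1])
y_test = rng.standard_normal((n_test, n_pts)).astype(np.float32)
data = dde.data.TripleCartesianProd(X_train, y_train, X_test, y_test)

def objective(trial):
    # Assumed search space: network width/depth and Adam learning rate.
    width = trial.suggest_int("width", 32, 128, log=True)
    depth = trial.suggest_int("depth", 2, 4)
    lr = trial.suggest_float("lr", 1e-4, 1e-2, log=True)
    net = dde.nn.DeepONetCartesianProd(
        [m] + [width] * depth,    # branch net layer sizes
        [dim] + [width] * depth,  # trunk net layer sizes
        "relu", "Glorot normal")
    model = dde.Model(data, net)
    model.compile("adam", lr=lr)
    losshistory, _ = model.train(iterations=1000, display_every=1000)
    # Minimize the final test loss recorded by DeepXDE.
    return float(np.sum(losshistory.loss_test[-1]))

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=10)
print(study.best_params)
```

In HyperNOs the search itself is handled by the library; the sketch above only illustrates the general pattern of wrapping a DeepXDE model in a tunable objective.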
🔗 Project links
Code: https://github.com/MaxGhi8/HyperNOs
Paper: https://link.springer.com/article/10.1007/s40574-025-00516-0
The library is currently maintained mainly by me, so there’s certainly room for improvement. I would love to hear any feedback, suggestions, or observations, and I’m happy to explore collaborations with anyone interested in experimenting with neural operators in DeepXDE.
—
Massimiliano Ghiotto
Replies: 1 comment
- Sounds great. Maybe you can add some examples in DeepXDE?