<span id="BetckeEtAl2024">M. M. Betcke, L. M. Kreusser, and D. Murari, “Parallel-in-Time Solutions with Random Projection Neural Networks,” arXiv:2408.09756v1 [math.NA], 2024 [Online]. Available at: <a href="http://arxiv.org/abs/2408.09756v1" target="_blank">http://arxiv.org/abs/2408.09756v1</a></span>
This paper considers one of the fundamental parallel-in-time methods for the solution of ordinary differential equations, Parareal, and extends it by adopting a neural network as the coarse propagator. We provide a theoretical analysis of the convergence properties of the proposed algorithm and show its effectiveness on several examples, including the Lorenz and Burgers’ equations. In our numerical simulations, we further specialize the underpinning neural architecture to Random Projection Neural Networks (RPNNs), two-layer neural networks whose first-layer weights are drawn at random rather than optimized. This restriction substantially increases the efficiency of fitting the RPNN’s weights in comparison to a standard feedforward network, without negatively impacting the accuracy, as demonstrated in the SIR system example.
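The RPNN idea described above can be sketched in a few lines: the hidden-layer weights are sampled once and frozen, so fitting reduces to a linear least-squares solve for the output weights. The sketch below is illustrative only (the function names, tanh activation, and weight distributions are assumptions, not the paper's exact setup):

```python
import numpy as np

def fit_rpnn(x, y, hidden=200, scale=1.0, seed=0):
    """Fit a Random Projection Neural Network (RPNN).

    The first-layer weights W and biases b are drawn at random and
    kept fixed; only the linear output weights beta are fitted,
    via a least-squares solve on the random features.
    """
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=scale, size=(hidden, x.shape[1]))
    b = rng.uniform(-1.0, 1.0, size=hidden)
    H = np.tanh(x @ W.T + b)  # random features, shape (n_samples, hidden)
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return W, b, beta

def predict_rpnn(x, W, b, beta):
    """Evaluate the fitted RPNN at new inputs x."""
    return np.tanh(x @ W.T + b) @ beta

# Toy usage: approximate sin(x) on [0, 2*pi]
x = np.linspace(0.0, 2.0 * np.pi, 100)[:, None]
y = np.sin(x).ravel()
W, b, beta = fit_rpnn(x, y)
err = np.max(np.abs(predict_rpnn(x, W, b, beta) - y))
```

Because only the output layer is trained, the fit is a single linear solve rather than an iterative gradient-based optimization, which is the source of the efficiency gain the abstract mentions.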