Exploring the intersection of classical numerical methods and deep learning for ODE systems
Can a neural network, trained on high-accuracy reference data, outperform a classical numerical method at comparable complexity?
This project investigates the numerical solution of ordinary differential equation (ODE) systems — not by conventional numerical solvers alone, but by leveraging neural networks.
The starting point is the Adams-Bashforth method [1], an explicit linear multistep method. Because its computation formula is linear, it is inherently well-suited for modeling with a neural network. The function evaluations required at each step are approximated by the network — a task neural networks are known to handle well, as guaranteed by the Universal Approximation Theorem [2].
For three ODE systems, we implement two parallel solution pipelines:
| Approach | Description |
|---|---|
| Adams-Bashforth (direct) | Classical numerical implementation of the linear multistep method |
| Neural Network | A network whose architecture mirrors the multistep formula, trained on high-accuracy reference data |
Training data is generated using the LSODA solver (available via SciPy's solve_ivp), which provides highly accurate reference solutions.
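The data-generation step can be sketched as follows; the pendulum right-hand side, initial condition, and tolerance values here are illustrative assumptions, not the project's actual scripts:

```python
import numpy as np
from scipy.integrate import solve_ivp

def pendulum(t, y, g=9.81, L=1.0):
    # theta'' = -(g/L) * sin(theta), written as a first-order system [theta, omega]
    theta, omega = y
    return [omega, -(g / L) * np.sin(theta)]

# Tight tolerances so the LSODA trajectory can serve as ground truth
t_eval = np.linspace(0.0, 10.0, 1001)
sol = solve_ivp(pendulum, (0.0, 10.0), [np.pi / 4, 0.0],
                method="LSODA", t_eval=t_eval, rtol=1e-10, atol=1e-12)
reference = sol.y.T  # shape (1001, 2): (theta, omega) samples used as training targets
```

LSODA automatically switches between stiff and non-stiff integrators, which is why it is a convenient single choice for generating reference solutions across all three problems.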
The goal is to compare both approaches in terms of accuracy against the reference data and to investigate:
💡 Whether, through suitable weight optimization, the neural network can achieve better results than the direct numerical approach at comparable computational complexity.
| Nr. | Problem | Description |
|---|---|---|
| 1 | 🕐 Mathematical Pendulum | Second-order oscillatory system with nonlinear restoring force |
| 2 | 🌀 Van der Pol Oscillator | Nonlinear oscillator with self-excited limit cycle behavior |
| 3 | 🪐 Pythagorean Three-Body Problem | Gravitational three-body system with Pythagorean initial conditions |
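As an illustration of problem 3, a right-hand side for the Pythagorean configuration might look like the sketch below. It assumes the standard Burrau setup (G = 1, masses 3, 4, 5 at the vertices of a 3-4-5 right triangle, initially at rest); check the project code for the actual values:

```python
import numpy as np

MASSES = np.array([3.0, 4.0, 5.0])  # classical Pythagorean masses (assumed)

def three_body_rhs(t, y):
    # State layout: [x1, y1, x2, y2, x3, y3, vx1, vy1, ..., vy3]
    pos = y[:6].reshape(3, 2)
    vel = y[6:].reshape(3, 2)
    acc = np.zeros_like(pos)
    for i in range(3):
        for j in range(3):
            if i != j:
                r = pos[j] - pos[i]
                # Newtonian gravity with G = 1
                acc[i] += MASSES[j] * r / np.linalg.norm(r) ** 3
    return np.concatenate([vel.ravel(), acc.ravel()])

# Pythagorean initial condition: bodies at rest at the triangle vertices
y0 = np.array([1.0, 3.0, -2.0, -1.0, 1.0, -1.0] + [0.0] * 6)
```

Because all initial velocities are zero, the first six components of `three_body_rhs(0, y0)` vanish, and the pairwise forces cancel in the total momentum, which is a quick sanity check on the implementation.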
```
├── 01_pendulum_ode/          # Mathematical Pendulum
│   ├── adams_bashforth/      # → Direct AB method implementation
│   └── neural_network/       # → Neural network solution
│
├── 02_vanderpol_ode/         # Van der Pol Oscillator
│   ├── adams_bashforth/      # → Direct AB method implementation
│   └── neural_network/       # → Neural network solution
│
├── 03_Pthreebody_ode/        # Pythagorean Three-Body Problem
│   ├── adams_bashforth/      # → Direct AB method implementation
│   └── neural_network/       # → Neural network solution
│
├── LICENSE
└── README.md
```
The Adams-Bashforth method is an explicit linear multistep method for solving initial value problems. It uses previously computed function values from past time steps to extrapolate the next solution value. The general $k$-step formula reads

$$
y_{n+1} = y_n + h \sum_{j=0}^{k-1} b_j \, f(t_{n-j}, y_{n-j})
$$

where $h$ is the step size, $b_j$ are the Adams-Bashforth coefficients, and $f(t_{n-j}, y_{n-j})$ are the function evaluations at the $k$ most recently computed points.
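For example, the two-step variant (AB2, with coefficients $b_0 = 3/2$, $b_1 = -1/2$) can be implemented in a few lines. This is a minimal sketch; bootstrapping the second starting value with a single Euler step is one common choice:

```python
import numpy as np

def adams_bashforth_2(f, t0, y0, h, n_steps):
    """Two-step Adams-Bashforth: y_{n+1} = y_n + h * (3/2 f_n - 1/2 f_{n-1})."""
    t = t0 + h * np.arange(n_steps + 1)
    y = np.zeros((n_steps + 1, np.size(y0)))
    y[0] = y0
    y[1] = y[0] + h * f(t[0], y[0])  # Euler step to get the second starting value
    for n in range(1, n_steps):
        y[n + 1] = y[n] + h * (1.5 * f(t[n], y[n]) - 0.5 * f(t[n - 1], y[n - 1]))
    return t, y

# Example: y' = -y with y(0) = 1, whose exact solution is exp(-t)
t, y = adams_bashforth_2(lambda t, y: -y, 0.0, np.array([1.0]), 0.01, 100)
```

Being explicit, each step costs only one new function evaluation; the trade-off is the usual stability restriction on the step size $h$.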
The linear structure of the Adams-Bashforth formula is modeled as a neural network:
- Architecture: Mirrors the multistep formula — weights correspond to the AB coefficients
- Function evaluations: The analytical function evaluations $f(t, y)$ are approximated by the network
- Training data: Generated using SciPy's LSODA solver (`solve_ivp`) for high-accuracy reference solutions
- Objective: Learn optimal weights that potentially improve upon the fixed AB coefficients
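The idea of learning the step weights can be illustrated without a full TensorFlow setup. The sketch below fits the two AB2 weights by least squares against reference data for the toy problem $y' = -y^2$ (an assumed example, not the project's actual training code):

```python
import numpy as np

# Fit weights w0, w1 in  y_{n+1} - y_n = h * (w0 * f_n + w1 * f_{n-1})
# against reference data for y' = -y^2, whose exact solution is 1 / (1 + t).
h = 0.01
t = h * np.arange(300)
y_ref = 1.0 / (1.0 + t)   # high-accuracy reference trajectory
f = -y_ref**2             # f(t, y) = -y^2 evaluated along the reference

A = h * np.column_stack([f[1:-1], f[:-2]])  # [f_n, f_{n-1}] for n = 1..N-2
b = y_ref[2:] - y_ref[1:-1]                 # observed increments y_{n+1} - y_n
w, *_ = np.linalg.lstsq(A, b, rcond=None)
# The classical AB2 weights are (3/2, -1/2); the fitted pair plays their role,
# and consistency forces w0 + w1 close to 1.
```

The project's neural network generalizes this linear fit: the weights become trainable parameters, and the function evaluations themselves are replaced by a learned approximation of $f(t, y)$.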
```
┌─────────────────────────────────────────────────────┐
│                   Neural Network                    │
│                                                     │
│        y_{n}, y_{n-1}, ..., y_{n-k+1}               │
│                      │                              │
│                      ▼                              │
│   ┌──────────────┐       ┌─────────────────────┐    │
│   │  Learned AB  │       │  Function Approx.   │    │
│   │ Coefficients │   +   │   f(t, y) via NN    │    │
│   └──────┬───────┘       └─────────┬───────────┘    │
│          │                         │                │
│          └───────────┬─────────────┘                │
│                      ▼                              │
│                   y_{n+1}                           │
└─────────────────────────────────────────────────────┘
```
```bash
pip install numpy scipy tensorflow matplotlib
```

Each problem directory contains self-contained scripts for both approaches. Navigate to the desired problem and run:

```bash
# Run the Adams-Bashforth implementation
python adams_bashforth/main.py

# Run the Neural Network implementation
python neural_network/main.py
```

Results for each ODE system — including accuracy comparisons, trajectory plots, and error analysis — can be found in the respective problem directories.
- [1]
- [2]
This project is licensed under the MIT License.
Made with ❤️ and math