# Pythagorean Three-Body Problem

Numerical solution of the Pythagorean three-body problem using:

- Classical numerical methods (Adams-Bashforth 4, LSODA)
- Neural network approaches (3 different RNN strategies)

This project investigates whether neural networks can match or exceed classical numerical methods for solving chaotic differential equations.
## Project Structure

```
03_Pthreebody_ode/
├── numerical/                     # Classical numerical methods
│   ├── solver_ab4.py              # Adams-Bashforth 4th order
│   ├── solver_lsoda.py            # LSODA reference (SciPy)
│   ├── physics.py                 # Physics equations
│   ├── compare.py                 # Compare numerical methods
│   └── animate.py                 # Visualize solutions
│
├── rnn_learn_physics/             # RNN Version 1: Learn f(t,y)
│   ├── solver_rnn.py              # Neural network learns the physics function
│   ├── train_rnn.py               # Training script (~50k parameters)
│   ├── animate_rnn.py             # Visualization
│   └── physics.py                 # Physics (ground truth for training)
│
├── rnn_coeffs/                    # RNN Version 2: Learn AB4 coefficients
│   ├── solver_rnn_coeffs.py       # Learn the 4 AB4 coefficients
│   ├── train_rnn_coeffs.py        # Training script (4 parameters)
│   ├── animate_rnn_coeffs.py      # Visualization
│   └── physics.py                 # Physics (analytical, not learned)
│
├── rnn_theoretical/               # RNN Version 3: Theoretical + perturbations
│   ├── solver_rnn_theoretical.py  # Tiny corrections to theory
│   ├── train_rnn_theoretical.py   # Training script
│   ├── animate_rnn_theoretical.py # Visualization
│   ├── solver_ab4.py              # Standard AB4
│   ├── solver_lsoda.py            # LSODA reference
│   └── physics.py                 # Physics (analytical)
│
├── old/                           # Previous experiments
├── requirements.txt               # Python dependencies
└── README.md                      # This file
```
## The Problem

Pythagorean three-body problem:

- Three masses: m₁=3, m₂=4, m₃=5 (a Pythagorean triple)
- Initial positions: (1,3), (-2,-1), (1,-1)
- Initial velocities: all zero
- Gravitational constant: G=1

Differential equation:

```
dy/dt = f(t, y)
```

where y = [x₁, y₁, x₂, y₂, x₃, y₃, vx₁, vy₁, vx₂, vy₂, vx₃, vy₃].

Challenge: this is a chaotic system, so small errors grow exponentially!
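The right-hand side f(t, y) follows directly from Newtonian gravity. A minimal self-contained sketch, using the state layout above (function and variable names here are illustrative, not taken from `physics.py`):

```python
import numpy as np

# Masses and gravitational constant from the Pythagorean setup
MASSES = np.array([3.0, 4.0, 5.0])
G = 1.0

def f(t, y):
    """dy/dt for the planar three-body problem.

    y = [x1, y1, x2, y2, x3, y3, vx1, vy1, vx2, vy2, vx3, vy3]
    """
    pos = y[:6].reshape(3, 2)   # positions of the 3 bodies (2D)
    vel = y[6:].reshape(3, 2)   # velocities
    acc = np.zeros_like(pos)
    for i in range(3):
        for j in range(3):
            if i == j:
                continue
            r = pos[j] - pos[i]
            # Pairwise gravitational acceleration: G * m_j * r / |r|^3
            acc[i] += G * MASSES[j] * r / np.linalg.norm(r) ** 3
    return np.concatenate([vel.ravel(), acc.ravel()])

# Pythagorean initial conditions: positions (1,3), (-2,-1), (1,-1), zero velocities
y0 = np.array([1.0, 3.0, -2.0, -1.0, 1.0, -1.0] + [0.0] * 6)
```

Because all forces are internal and pairwise antisymmetric, total momentum derived from this f stays exactly zero, which is a handy sanity check on any implementation.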
## Quick Start

```bash
cd 03_Pthreebody_ode
pip install -r requirements.txt
```

### Numerical methods

```bash
cd numerical
python compare.py                  # Compare AB4 vs LSODA
python animate.py                  # Animated visualization
```

### Version 1: Learn Physics f(t,y)

```bash
cd rnn_learn_physics
python train_rnn.py                # Train (~40 min, ~50k params)
python animate_rnn.py              # Visualize results
```

### Version 2: Learn AB4 Coefficients

```bash
cd rnn_coeffs
python train_rnn_coeffs.py         # Train (~2 min, 4 params)
python animate_rnn_coeffs.py       # Visualize results
```

### Version 3: Theoretical + Perturbations

```bash
cd rnn_theoretical
python train_rnn_theoretical.py    # Train (~2 min, 4 params)
python animate_rnn_theoretical.py  # Visualize results
```

## Results Summary

| Method | Energy Error (dE/E) | Parameters | Training Time | Stability |
|---|---|---|---|---|
| LSODA | ~1e-10 | N/A | N/A | ✅ Excellent |
| AB4 (standard) | ~1e-6 | N/A | N/A | ✅ Very good |
| RNN-Physics | ~1e+4 | ~50,000 | 40 min | ❌ Unstable |
| RNN-Coeffs | ~1e-3 to 1e-1 | 4 | 2 min | ⚠️ Unstable |
| RNN-Theoretical | ~1e-6 | 4 | 2 min | ✅ Good |
## Key Findings

1. **RNN-Physics (Version 1) fails catastrophically**
   - Despite 50,000 parameters and 40 minutes of training
   - It cannot learn the complex gravitational physics accurately enough
   - Small errors in f(t,y) accumulate over 5,000+ timesteps
   - Conclusion: learning the physics from scratch does not work for chaotic systems

2. **RNN-Coefficients (Version 2) struggles with stability**
   - Only 4 parameters to learn
   - Without strong regularization: numerically unstable
   - With strong regularization: converges to roughly the theoretical values
   - Conclusion: even small deviations from theory can cause instability

3. **RNN-Theoretical (Version 3) matches classical AB4**
   - Learns tiny corrections (~2%) to the theoretical coefficients
   - Remains stable and accurate
   - Conclusion: the theoretical coefficients are already near-optimal for this problem
### Why this is hard for neural networks

- Chaotic dynamics: small errors amplify exponentially
- Long integration: 5,000+ timesteps accumulate errors
- Energy conservation: networks do not naturally preserve physical laws
- Function complexity: the gravitational 1/r³ terms are hard to approximate
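Energy drift is the diagnostic used throughout this README. A minimal sketch of the check (names are illustrative; the repo's `physics.py` presumably has its own versions):

```python
import numpy as np

MASSES = np.array([3.0, 4.0, 5.0])
G = 1.0

def total_energy(y):
    """Kinetic + gravitational potential energy of the flat 12D state."""
    pos = y[:6].reshape(3, 2)
    vel = y[6:].reshape(3, 2)
    kinetic = 0.5 * np.sum(MASSES * np.sum(vel ** 2, axis=1))
    potential = 0.0
    for i in range(3):
        for j in range(i + 1, 3):
            potential -= G * MASSES[i] * MASSES[j] / np.linalg.norm(pos[i] - pos[j])
    return kinetic + potential

def relative_energy_drift(trajectory):
    """max |E(t) - E(0)| / |E(0)| over a (n_steps, 12) trajectory."""
    e = np.array([total_energy(y) for y in trajectory])
    return np.max(np.abs(e - e[0]) / np.abs(e[0]))
```

A neat property of the Pythagorean setup: the initial pairwise distances are exactly 3, 4, and 5, so the initial energy is purely potential, E₀ = -(12/5 + 15/4 + 20/3) ≈ -12.82.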
### Where neural networks may still be appropriate

- Non-chaotic systems (e.g., damped oscillators)
- Short-term predictions (a few timesteps)
- Systems with learned corrections to known physics
- Data-driven modeling where no theory exists
## Method Details

### Adams-Bashforth 4 (AB4)

Formula:

```
y_{n+1} = y_n + (h/24) * (55*f_n - 59*f_{n-1} + 37*f_{n-2} - 9*f_{n-3})
```

Properties:

- 4th-order accuracy: local truncation error ~ O(h⁵), global error ~ O(h⁴)
- Explicit multistep method
- Requires the 4 previous values (bootstrapped with RK4)
- Coefficients: [55/24, -59/24, 37/24, -9/24]
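The update and its RK4 bootstrap fit in a few lines. A sketch with illustrative names (the repo's `solver_ab4.py` has its own interface):

```python
import numpy as np

# AB4 coefficients in the order [f_n, f_{n-1}, f_{n-2}, f_{n-3}]; they sum to 1
AB4_COEFFS = np.array([55.0, -59.0, 37.0, -9.0]) / 24.0

def ab4_step(y, f_hist, h):
    """One Adams-Bashforth 4 step.

    f_hist = [f_n, f_{n-1}, f_{n-2}, f_{n-3}], most recent first.
    """
    return y + h * sum(c * fk for c, fk in zip(AB4_COEFFS, f_hist))

def rk4_step(f, t, y, h):
    """Classical RK4 step, used to bootstrap the first three AB4 points."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
```

The coefficients summing to 1 is the consistency condition: for a constant derivative c, one step advances y by exactly h·c.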
### Neural Network Approaches

**Approach 1: Learn f(t,y)**

- Network approximates the physics function
- Input: state y (12D) + time t
- Output: derivative dy/dt (12D)
- Loss: MSE between predicted and true f
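A sketch of such a network in PyTorch, with layer widths taken from the V1 architecture summary further below (the class name and Tanh activation are illustrative choices; `train_rnn.py` may differ):

```python
import torch
import torch.nn as nn

class PhysicsNet(nn.Module):
    """MLP approximating f(t, y): state (12D) + time (1D) -> derivative (12D)."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(13, 128), nn.Tanh(),
            nn.Linear(128, 256), nn.Tanh(),
            nn.Linear(256, 256), nn.Tanh(),
            nn.Linear(256, 128), nn.Tanh(),
            nn.Linear(128, 12),
        )

    def forward(self, t, y):
        # t: (batch, 1), y: (batch, 12); concatenate to the 13D input
        return self.net(torch.cat([y, t], dim=-1))

model = PhysicsNet()
loss_fn = nn.MSELoss()  # trained against the analytical f on sampled states
```

Even a perfectly trained network of this kind only approximates f; the residual error is what the chaotic dynamics then amplify over thousands of steps.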
**Approach 2: Learn AB4 Coefficients**

- Network has 4 trainable parameters (the AB4 coefficients)
- Physics f(t,y) remains analytical
- Loss: energy drift during integration
- Inspired by the earlier pendulum implementation
**Approach 3: Theoretical + Corrections**

- Start from the theoretical coefficients
- Learn small perturbations (ε ~ 0.02)
- Ensures numerical stability
- Loss: energy conservation
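The parametrization of Approach 3, "theoretical + ε · perturbations", can be sketched as a tiny module (names are illustrative; `solver_rnn_theoretical.py` has its own structure):

```python
import torch
import torch.nn as nn

class PerturbedAB4Coeffs(nn.Module):
    """The 4 trainable parameters of V3: perturbations around fixed theory.

    Effective coefficients = theoretical + eps * delta. With delta
    initialized to zero, training starts exactly at the theoretical AB4
    scheme, so the integrator is stable from the first epoch.
    """

    def __init__(self, eps: float = 0.02):
        super().__init__()
        self.register_buffer(
            "theoretical", torch.tensor([55.0, -59.0, 37.0, -9.0]) / 24.0
        )
        self.eps = eps
        self.delta = nn.Parameter(torch.zeros(4))  # the only 4 parameters

    def coefficients(self) -> torch.Tensor:
        return self.theoretical + self.eps * self.delta
```

This is why V3 behaves so differently from V2: V2 must rediscover stability from scratch, while V3 can only drift a few percent away from a scheme that is already stable.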
## Research Question

This work addresses the question: "Can neural networks provide better numerical solutions than classical methods for ODEs?"

For the Pythagorean three-body problem, the answer is no. Classical methods (AB4, LSODA) are superior because:

- They use the exact physics
- They have proven stability properties
- They add no function-approximation error on top of the discretization error

However, the investigation is valuable because:

- It demonstrates the limits of ML for chaotic systems
- It shows where physics-informed approaches are necessary
- It validates the importance of theoretical numerical analysis
### Accuracy

| Method | dE/E at t=100 | Interpretation |
|---|---|---|
| LSODA | 10⁻¹⁰ | Near-exact (reference) |
| AB4 (standard) | 10⁻⁶ | Excellent |
| RNN-Theoretical | 10⁻⁶ | Excellent (≈ AB4) |
| RNN-Coeffs | 10⁻³ to 10⁻¹ | Acceptable to poor |
| RNN-Physics | 10⁺⁴ | Catastrophic failure |
### Runtime (to t=100)

| Method | Training | Inference | Total |
|---|---|---|---|
| LSODA | None | ~5 s | ~5 s |
| AB4 | None | ~30 s | ~30 s |
| RNN-Theoretical | 2 min | ~30 s | ~2.5 min |
| RNN-Coeffs | 2 min | ~30 s | ~2.5 min |
| RNN-Physics | 40 min | ~35 s | ~41 min |
### Recommendations

- For production use: LSODA or AB4
- For research: RNN-Theoretical (shows the limits of learning)
- Not recommended: RNN-Physics, RNN-Coeffs (unstable)
## Implementation Details

### Adams-Bashforth 4 (AB4)

- File: `numerical/solver_ab4.py`
- Bootstrap: RK4 for the first 3 steps
- Timestep: dt = 0.02
- Coefficients: [55/24, -59/24, 37/24, -9/24]

### LSODA

- File: `numerical/solver_lsoda.py`
- Wrapper around `scipy.integrate.odeint`
- Adaptive timestep (automatic)
- Tolerances: rtol=1e-12, atol=1e-14
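The wrapper is essentially a single `odeint` call. A self-contained sketch with the tolerances above (the actual `solver_lsoda.py` may differ in naming and in how it imports the physics):

```python
import numpy as np
from scipy.integrate import odeint

MASSES = np.array([3.0, 4.0, 5.0])

def rhs(y, t):
    """Three-body right-hand side in the (y, t) argument order odeint expects."""
    pos, vel = y[:6].reshape(3, 2), y[6:].reshape(3, 2)
    acc = np.zeros_like(pos)
    for i in range(3):
        for j in range(3):
            if i != j:
                r = pos[j] - pos[i]
                acc[i] += MASSES[j] * r / np.linalg.norm(r) ** 3  # G = 1
    return np.concatenate([vel.ravel(), acc.ravel()])

# Pythagorean initial conditions; LSODA switches stiff/non-stiff automatically
y0 = np.array([1.0, 3.0, -2.0, -1.0, 1.0, -1.0] + [0.0] * 6)
t = np.linspace(0.0, 10.0, 501)
solution = odeint(rhs, y0, t, rtol=1e-12, atol=1e-14)
```

`odeint` uses LSODA under the hood, which is why it serves as the high-accuracy reference throughout this project.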
### RNN-Physics (V1)

```
Input:  state (12) + time (1) → 13D
Hidden: 128 → 256 → 256 → 128
Output: f(t,y) → 12D
Total params: ~50,000
```

### RNN-Coefficients (V2)

```
Trainable: 4 coefficients
Physics:   analytical f(t,y)
Loss:      energy_drift + regularization
Total params: 4
```

### RNN-Theoretical (V3)

```
Fixed:     theoretical coefficients
Trainable: 4 perturbations (ε = 0.02)
Actual:    theoretical + ε * perturbations
Total params: 4
```

## References

- Adams, J.C. & Bashforth, F. (1883): On the numerical solution of differential equations
- Hornik et al. (1989): Universal Approximation Theorem for Neural Networks
- Hairer, Nørsett & Wanner (1993): Solving Ordinary Differential Equations I
- Raissi et al. (2019): Physics-informed neural networks (PINNs)
- Burrau, C. (1913): Numerische Berechnung eines Spezialfalles des Dreikörperproblems [Numerical computation of a special case of the three-body problem]
## Dependencies

See `requirements.txt`:

```
numpy>=1.24.0
scipy>=1.10.0
matplotlib>=3.7.0
torch>=2.0.0
```

Install with:

```bash
pip install -r requirements.txt
```

## File Guide

Core physics and solvers:

- `physics.py` – Physics equations, initial conditions, energy/momentum
- `solver_ab4.py` – Adams-Bashforth 4th-order implementation
- `solver_lsoda.py` – LSODA wrapper (SciPy)
- `solver_rnn.py` – Learn f(t,y) with a neural network
- `solver_rnn_coeffs.py` – Learn AB4 coefficients
- `solver_rnn_theoretical.py` – Theoretical coefficients + perturbations

Training scripts:

- `train_rnn.py` – Train RNN-Physics (V1)
- `train_rnn_coeffs.py` – Train RNN-Coefficients (V2)
- `train_rnn_theoretical.py` – Train RNN-Theoretical (V3)

Visualization:

- `animate.py` – Animate numerical methods
- `animate_rnn.py` – Animate RNN-Physics vs. reference
- `animate_rnn_coeffs.py` – Animate RNN-Coeffs vs. reference
- `animate_rnn_theoretical.py` – Animate RNN-Theoretical vs. reference
- `compare.py` – Static comparison plots
## Typical Workflow

Compare the numerical methods:

```bash
# Run from the main directory
cd numerical
python compare.py
```

Train the neural network variants:

```bash
# Version 1: Learn Physics (~40 minutes)
cd rnn_learn_physics
python train_rnn.py

# Version 2: Learn Coefficients (~2 minutes)
cd ../rnn_coeffs
python train_rnn_coeffs.py

# Version 3: Theoretical + Perturbations (~2 minutes)
cd ../rnn_theoretical
python train_rnn_theoretical.py
```

Visualize the results:

```bash
# Numerical methods
cd numerical
python animate.py

# RNN Version 1
cd rnn_learn_physics
python animate_rnn.py

# RNN Version 2
cd rnn_coeffs
python animate_rnn_coeffs.py

# RNN Version 3
cd rnn_theoretical
python animate_rnn_theoretical.py
```

## Troubleshooting

Problem: Animation is slow/choppy
Solution: Reduce n_frames or increase interval in animate scripts
Problem: RNN training diverges (energy → ∞)
Solution: Increase regularization weight or decrease learning rate
Problem: "Model not found" error
Solution: Run the corresponding train_*.py script first
Problem: CUDA out of memory
Solution: Reduce batch_size in training scripts or use device='cpu'
## About

Student project – Studienarbeit (STA, German student research project)
Pythagorean Three-Body Problem Analysis
Neural Networks for ODE Solving

## License

See the LICENSE file.

## Acknowledgments

- SciPy team for the LSODA implementation
- PyTorch team for the deep learning framework
- Original three-body formulation by Burrau (1913)
- The pendulum RNN approach for inspiration
## Conclusion

This project demonstrates that:

- Classical methods are superior for chaotic ODEs
- Neural networks struggle with long-term integration
- Physics-informed constraints are essential for stability
- Theoretical analysis remains crucial in numerical computing

The investigation validates the importance of numerical analysis and shows that machine learning, while powerful, cannot replace sound mathematical foundations for solving chaotic differential equations.

Key takeaway: for the three-body problem, stick with AB4 or LSODA. Use neural networks only when:

- No analytical solution exists
- Short-term predictions are sufficient
- Physics-informed constraints are available