# OptimizationODE.jl
**OptimizationODE.jl** provides ODE-based optimization methods as a solver plugin for [SciML's Optimization.jl](https://github.com/SciML/Optimization.jl). It wraps various ODE solvers to perform gradient-based optimization using continuous-time dynamics.

## Installation

```julia
using Pkg
Pkg.add("OptimizationODE")
```
## Usage

```julia
using OptimizationODE, Optimization, ADTypes, SciMLBase

# Objective: sum of squares, minimized at the origin.
function f(x, p)
    return sum(abs2, x)
end

# In-place analytic gradient: ∇f(x) = 2x.
function g!(g, x, p)
    @. g = 2 * x
end

x0 = [2.0, -3.0]
p = []  # this objective uses no parameters

f_manual = OptimizationFunction(f, SciMLBase.NoAD(); grad = g!)
prob_manual = OptimizationProblem(f_manual, x0, p)

opt = ODEGradientDescent(dt=0.01)
sol = solve(prob_manual, opt; maxiters=50_000)

@show sol.u          # ≈ [0.0, 0.0]
@show sol.objective  # ≈ 0.0
```
## Local Gradient-based Optimizers

All provided optimizers are **gradient-based local optimizers** that solve optimization problems by integrating a gradient-flow ODE to convergence (see the sketch after the list):

* `ODEGradientDescent(dt=...)` — performs basic gradient descent using the explicit Euler method. This is a simple and efficient method suitable for small-scale or well-conditioned problems.
* `RKChebyshevDescent()` — uses the ROCK2 solver, a stabilized explicit Runge-Kutta method suitable for stiff problems. It allows larger step sizes while maintaining stability.

* `RKAccelerated()` — leverages the Tsit5 method, a 5th-order Runge-Kutta solver that achieves faster convergence for smooth problems by improving integration accuracy.

* `HighOrderDescent()` — applies Vern7, a high-order (7th-order) explicit Runge-Kutta method for even more accurate integration. This can be beneficial for problems requiring high precision.
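For example, swapping a different built-in optimizer into the usage example above only changes the optimizer line. A minimal sketch, reusing `prob_manual` from the Usage section:

```julia
# Accelerated gradient flow driven by the Tsit5 integrator.
opt = RKAccelerated()
sol = solve(prob_manual, opt; maxiters=50_000)
@show sol.u sol.objective
```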
You can also define a custom optimizer using the generic `ODEOptimizer(solver; dt=nothing)` constructor by supplying any ODE solver supported by [OrdinaryDiffEq.jl](https://docs.sciml.ai/DiffEqDocs/stable/solvers/ode_solve/); a sketch follows below.
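A minimal sketch, assuming the solver types are exported by OrdinaryDiffEq.jl and reusing `prob_manual` from the Usage section:

```julia
using OrdinaryDiffEq  # provides Tsit5, Euler, ROCK2, Vern7, ...

# Any OrdinaryDiffEq solver can drive the gradient flow.
opt = ODEOptimizer(Tsit5())
sol = solve(prob_manual, opt; maxiters=50_000)

# Fixed-step solvers such as Euler need an explicit `dt`.
opt_fixed = ODEOptimizer(Euler(); dt=0.01)
sol_fixed = solve(prob_manual, opt_fixed; maxiters=50_000)
```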
## Interface Details

All optimizers require gradient information, supplied either via automatic differentiation (by passing an AD backend to `OptimizationFunction`) or via a hand-written gradient such as the `grad = g!` keyword in the Usage example. The optimization is performed by integrating the ODE defined by the negative gradient, `dx/dt = -∇f(x)`, until a steady state is reached.
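For instance, a sketch of the automatic-differentiation route, assuming a ForwardDiff-backed `AutoForwardDiff()` from ADTypes.jl (with ForwardDiff.jl installed):

```julia
using OptimizationODE, Optimization, ADTypes, ForwardDiff

f(x, p) = sum(abs2, x)

x0 = [2.0, -3.0]
# No grad keyword: the AD backend supplies the gradient.
f_ad = OptimizationFunction(f, ADTypes.AutoForwardDiff())
prob_ad = OptimizationProblem(f_ad, x0)

sol = solve(prob_ad, ODEGradientDescent(dt=0.01); maxiters=50_000)
```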