Commits (31)
- 2646d29 MOO Docs updated blackboxoptim.md (ParasPuneetSingh, Sep 7, 2024)
- 79a297b Update evolutionary.md (ParasPuneetSingh, Sep 20, 2024)
- 1394ec4 Update metaheuristics.md (ParasPuneetSingh, Sep 20, 2024)
- 5af0e2b Update docs/src/optimization_packages/blackboxoptim.md (Vaibhavdixit02, Sep 22, 2024)
- 37aa52b Update Project.toml (ParasPuneetSingh, Sep 22, 2024)
- 81f5e8a Update Project.toml (ParasPuneetSingh, Sep 22, 2024)
- 238dcbe Update docs/src/optimization_packages/blackboxoptim.md (Vaibhavdixit02, Sep 22, 2024)
- 9198cb4 Update metaheuristics.md (ParasPuneetSingh, Sep 23, 2024)
- be6388b Update Project.toml (ParasPuneetSingh, Sep 23, 2024)
- 5b3a0b5 Update blackboxoptim.md (ParasPuneetSingh, Sep 23, 2024)
- 6a1dceb Update evolutionary.md (ParasPuneetSingh, Sep 25, 2024)
- c9d892f Update docs/src/optimization_packages/metaheuristics.md (Vaibhavdixit02, Sep 27, 2024)
- 564f53a Update docs/src/optimization_packages/evolutionary.md (Vaibhavdixit02, Oct 26, 2024)
- b1fe7da Update metaheuristics.md (ParasPuneetSingh, Nov 9, 2024)
- 89abf73 Update Project.toml (ParasPuneetSingh, Nov 9, 2024)
- 5aa1289 Update metaheuristics.md (ParasPuneetSingh, Nov 9, 2024)
- fb0181b Update blackboxoptim.md (ParasPuneetSingh, Nov 9, 2024)
- 6d4c6ba Update metaheuristics.md (ParasPuneetSingh, Nov 10, 2024)
- d8fea39 Update OptimizationBBO.jl (ParasPuneetSingh, Jan 29, 2025)
- 032e5b9 Update runtests.jl (ParasPuneetSingh, Jan 29, 2025)
- a6ae41e Create ode.md (ParasPuneetSingh, Jun 2, 2025)
- 99d06ef Update docs/src/optimization_packages/ode.md (ParasPuneetSingh, Jun 2, 2025)
- b7e2927 Update ode.md (ParasPuneetSingh, Jun 2, 2025)
- 36be3e8 Update ode.md (ParasPuneetSingh, Jun 22, 2025)
- 1cd41ca Update ode.md (ParasPuneetSingh, Jun 22, 2025)
- 66919e4 DAE based solvers (ParasPuneetSingh, Jul 10, 2025)
- ebe2aec MOO tests and code updates (ParasPuneetSingh, Jul 10, 2025)
- 0cb0e9c Merge branch 'master' of https://github.com/ParasPuneetSingh/Optimiza… (ParasPuneetSingh, Jul 17, 2025)
- d308614 Merge branch 'SciML:master' into master (ParasPuneetSingh, Jul 17, 2025)
- 7fbca50 import changes (ParasPuneetSingh, Jul 20, 2025)
- 129389e Merge branch 'master' of https://github.com/ParasPuneetSingh/Optimiza… (ParasPuneetSingh, Jul 20, 2025)
62 changes: 62 additions & 0 deletions docs/src/optimization_packages/ode.md
@@ -0,0 +1,62 @@
# OptimizationODE.jl

**OptimizationODE.jl** provides ODE-based optimization methods as a solver plugin for [SciML's Optimization.jl](https://github.com/SciML/Optimization.jl). It wraps various ODE solvers to perform gradient-based optimization using continuous-time dynamics.

## Installation

```julia
using Pkg
Pkg.add("OptimizationODE")
```

## Usage

```julia
using OptimizationODE, Optimization, ADTypes, SciMLBase

# Objective: f(x) = x₁² + x₂², minimized at the origin.
function f(x, p)
    return sum(abs2, x)
end

# Analytic gradient, written in place.
function g!(g, x, p)
    @. g = 2 * x
end

x0 = [2.0, -3.0]
p = []  # this objective uses no parameters

# Supply the gradient manually, so no AD backend is required.
f_manual = OptimizationFunction(f, SciMLBase.NoAD(); grad = g!)
prob_manual = OptimizationProblem(f_manual, x0)

# Gradient flow integrated with explicit Euler steps of size `dt`.
opt = ODEGradientDescent(dt=0.01)
sol = solve(prob_manual, opt; maxiters=50_000)

@show sol.u          # minimizer, approximately [0.0, 0.0]
@show sol.objective  # objective value at the minimizer
```

## Available Optimizers

All provided optimizers are **gradient-based local optimizers** that minimize the objective by integrating a gradient-flow ODE to convergence:

* `ODEGradientDescent(dt=...)` — performs basic gradient descent using the explicit Euler method. This is a simple and efficient method suitable for small-scale or well-conditioned problems.

* `RKChebyshevDescent()` — uses the ROCK2 solver, a stabilized explicit Runge-Kutta method suitable for stiff problems. It allows larger step sizes while maintaining stability.

* `RKAccelerated()` — leverages the Tsit5 method, a 5th-order Runge-Kutta solver that achieves faster convergence for smooth problems by improving integration accuracy.

* `HighOrderDescent()` — applies Vern7, a high-order (7th-order) explicit Runge-Kutta method for even more accurate integration. This can be beneficial for problems requiring high precision.

You can also define a custom optimizer using the generic `ODEOptimizer(solver; dt=nothing)` constructor by supplying any ODE solver supported by [OrdinaryDiffEq.jl](https://docs.sciml.ai/DiffEqDocs/stable/solvers/ode_solve/).
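As a usage sketch of the remaining optimizers (reusing `f`, `g!`, and `x0` from the Usage example above, and assuming `Tsit5` is imported from OrdinaryDiffEq.jl for the custom constructor):

```julia
using OptimizationODE, Optimization, SciMLBase, OrdinaryDiffEq

f_manual = OptimizationFunction(f, SciMLBase.NoAD(); grad = g!)
prob = OptimizationProblem(f_manual, x0)

# Adaptive-step methods; no `dt` keyword is needed.
sol_cheb = solve(prob, RKChebyshevDescent(); maxiters = 10_000)
sol_rk   = solve(prob, RKAccelerated(); maxiters = 10_000)
sol_high = solve(prob, HighOrderDescent(); maxiters = 10_000)

# A custom optimizer built from any OrdinaryDiffEq.jl solver.
sol_custom = solve(prob, ODEOptimizer(Tsit5()); maxiters = 10_000)
```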

## Interface Details

All optimizers require gradient information, supplied either via automatic differentiation or through a manually provided in-place gradient (the `grad` keyword of `OptimizationFunction`, as in the example above). The optimization is performed by integrating the gradient-flow ODE, defined by the negative gradient, until a steady state is reached.
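
Concretely, the integrators follow the gradient-flow dynamics below; one explicit Euler step of size `dt` recovers the familiar gradient-descent update:

```math
\frac{du}{dt} = -\nabla f(u), \qquad u(0) = x_0,
\qquad u_{k+1} = u_k - dt \, \nabla f(u_k).
```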

### Keyword Arguments

* `dt` — time step size (only for `ODEGradientDescent`).
* `maxiters` — maximum number of ODE steps.
* `callback` — function to observe progress.
* `progress=true` — enables live progress display.
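
As a sketch of how these keywords combine (assuming the standard Optimization.jl callback signature, where `callback(state, loss)` is called each iteration and returning `true` stops the solve; `prob_manual` is the problem from the Usage section):

```julia
# Report progress and stop early once the objective is small enough.
function cb(state, loss)
    println("iteration ", state.iter, ": objective = ", loss)
    return loss < 1e-10  # returning true halts the optimization
end

sol = solve(prob_manual, ODEGradientDescent(dt=0.01);
            maxiters = 50_000, callback = cb, progress = true)
```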
