### IpoptOptimizer Constructor Options

Ipopt-specific options are passed to the `IpoptOptimizer` constructor. The most commonly used options are available as struct fields:

  - `verbose`: Control output verbosity (overrides Ipopt's `print_level`)
    
      + `false` or `0`: No output
      + `true` or `5`: Standard output
      + Integer values 0-12: Different verbosity levels (maps to `print_level`)
  - `hessian_approximation`: Method for Hessian computation
    
      + `"exact"` (default): Use exact Hessian
      + `"limited-memory"`: Use L-BFGS approximation

Any other Ipopt option can be passed as well; the full list of available options is documented in the [Ipopt Options Reference](https://coin-or.github.io/Ipopt/OPTIONS.html).
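For example, a quiet solve with an L-BFGS Hessian approximation might be configured as in the following sketch (it assumes the package exporting `IpoptOptimizer` is loaded):

```julia
using OptimizationIpopt  # assumed to export IpoptOptimizer

opt = IpoptOptimizer(
    verbose = false,                          # silence Ipopt's output
    hessian_approximation = "limited-memory"  # L-BFGS instead of exact Hessian
)
```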
2. **Initial Points**: Provide good initial guesses when possible. Ipopt is a local optimizer and the solution quality depends on the starting point.

3. **Hessian Approximation**: For large problems or when Hessian computation is expensive, use `hessian_approximation = "limited-memory"` in the `IpoptOptimizer` constructor.

4. **Linear Solver Selection**: The choice of linear solver can significantly impact performance. For large problems, consider using HSL solvers (ma27, ma57, ma86, ma97). Note that HSL solvers require [separate installation](https://github.com/jump-dev/Ipopt.jl?tab=readme-ov-file#linear-solvers); see the Ipopt.jl documentation for setup instructions. The default MUMPS solver works well for small to medium problems.

5. **Constraint Formulation**: Ipopt handles equality constraints well. When possible, formulate constraints as equalities rather than pairs of inequalities.

6. **Warm Starting**: When solving a sequence of similar problems, use the solution from the previous problem as the initial point for the next. You can enable warm starting with `IpoptOptimizer(warm_start_init_point = "yes")`.
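A minimal warm-starting loop might look like the following sketch (the Rosenbrock objective and the parameter sequence are illustrative placeholders, and `IpoptOptimizer` is assumed as above):

```julia
using Optimization, ADTypes, ForwardDiff
using OptimizationIpopt  # assumed to export IpoptOptimizer

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
optf = OptimizationFunction(rosenbrock, AutoForwardDiff())

opt = IpoptOptimizer(warm_start_init_point = "yes")

u0 = [0.0, 0.0]
for p in ([1.0, 100.0], [1.1, 100.0], [1.2, 100.0])  # sequence of similar problems
    prob = OptimizationProblem(optf, u0, p)
    sol = solve(prob, opt)
    global u0 = sol.u  # previous solution seeds the next solve
end
```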
**OptimizationODE.jl** provides ODE-based optimization methods as a solver plugin for [SciML's Optimization.jl](https://github.com/SciML/Optimization.jl). It wraps various ODE solvers to perform gradient-based optimization using continuous-time dynamics.

## Installation

```julia
using Pkg
Pkg.add("OptimizationODE")
```

## Usage
```julia
using OptimizationODE, Optimization, ADTypes, SciMLBase

# Objective: f(x) = x₁² + x₂², minimized at the origin
function f(x, p)
    return sum(abs2, x)
end

# In-place analytical gradient: ∇f(x) = 2x
function g!(g, x, p)
    @. g = 2 * x
end

x0 = [2.0, -3.0]
p = []

f_manual = OptimizationFunction(f, SciMLBase.NoAD(); grad = g!)
prob_manual = OptimizationProblem(f_manual, x0)

opt = ODEGradientDescent(dt=0.01)
sol = solve(prob_manual, opt; maxiters=50_000)

@show sol.u
@show sol.objective
```
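Instead of hand-coding the gradient, you can let an automatic differentiation backend generate it. A sketch of the same problem using `AutoForwardDiff` (assumes the ForwardDiff package is installed):

```julia
using OptimizationODE, Optimization, ADTypes, ForwardDiff

f(x, p) = sum(abs2, x)

# The AD backend supplies the gradient that the ODE methods integrate
f_ad = OptimizationFunction(f, AutoForwardDiff())
prob_ad = OptimizationProblem(f_ad, [2.0, -3.0])

sol = solve(prob_ad, ODEGradientDescent(dt=0.01); maxiters=50_000)
@show sol.u
```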
## Local Gradient-based Optimizers

All provided optimizers are **gradient-based local optimizers** that solve optimization problems by integrating gradient-based ODEs to convergence:

* `ODEGradientDescent(dt=...)` — performs basic gradient descent using the explicit Euler method. This is a simple and efficient method suitable for small-scale or well-conditioned problems.

* `RKChebyshevDescent()` — uses the ROCK2 solver, a stabilized explicit Runge-Kutta method suitable for stiff problems. It allows larger step sizes while maintaining stability.

* `RKAccelerated()` — leverages the Tsit5 method, a 5th-order Runge-Kutta solver that achieves faster convergence for smooth problems by improving integration accuracy.

* `HighOrderDescent()` — applies Vern7, a high-order (7th-order) explicit Runge-Kutta method for even more accurate integration. This can be beneficial for problems requiring high precision.

You can also define a custom optimizer using the generic `ODEOptimizer(solver; dt=nothing)` constructor by supplying any ODE solver supported by [OrdinaryDiffEq.jl](https://docs.sciml.ai/DiffEqDocs/stable/solvers/ode_solve/).
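For example, a custom optimizer can be built from any OrdinaryDiffEq solver. A sketch reusing `prob_manual` from the usage example above (`ROCK4` is just one possible solver choice):

```julia
using OptimizationODE, OrdinaryDiffEq

opt = ODEOptimizer(ROCK4())  # wrap any OrdinaryDiffEq ODE solver as an optimizer
sol = solve(prob_manual, opt; maxiters=50_000)
```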
## DAE-based Optimizers

!!! warning
    DAE-based optimizers are still experimental and a research project. Use with caution.

In addition to ODE-based optimizers, OptimizationODE.jl provides optimizers for differential-algebraic equation (DAE) constrained problems:

* `DAEMassMatrix()` — uses the Rodas5P solver (from OrdinaryDiffEq.jl) for DAE problems with a mass matrix formulation.

* `DAEOptimizer(IDA())` — uses the IDA solver (from Sundials.jl) for DAE problems with index variable support (requires `using Sundials`).

You can also define a custom optimizer using the generic `ODEOptimizer(solver)` or `DAEOptimizer(solver)` constructor by supplying any ODE or DAE solver supported by [OrdinaryDiffEq.jl](https://docs.sciml.ai/DiffEqDocs/stable/solvers/ode_solve/) or [Sundials.jl](https://github.com/SciML/Sundials.jl).
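For instance, constructing the Sundials-backed optimizer (a sketch; `IDA` requires Sundials to be loaded):

```julia
using OptimizationODE, Sundials

opt = DAEOptimizer(IDA())  # DAE-based optimizer built on Sundials' IDA solver
```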
## Interface Details

All optimizers require gradient information (either via automatic differentiation or a manually provided `grad!`). The optimization is performed by integrating the ODE defined by the negative gradient, `du/dt = -∇f(u)`, until a steady state is reached.
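Conceptually this is plain gradient flow. The sketch below integrates that flow for the quadratic objective from the usage example directly with OrdinaryDiffEq (an illustration of the idea, not the package's internal implementation):

```julia
using OrdinaryDiffEq

# Gradient flow for f(x) = sum(abs2, x): du/dt = -∇f(u) = -2u
flow!(du, u, p, t) = (@. du = -2 * u)

prob = ODEProblem(flow!, [2.0, -3.0], (0.0, 10.0))
sol = solve(prob, Tsit5())

sol.u[end]  # ≈ [0, 0], the minimizer
```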
Internally, `solve` checks that the chosen algorithm is compatible with the given problem before dispatching to a solver package; any mismatch raises an `IncompatibleOptimizerError`:

```julia
function _check_opt_alg(prob::SciMLBase.OptimizationProblem, alg; kwargs...)
    !SciMLBase.allowsbounds(alg) && (!isnothing(prob.lb) || !isnothing(prob.ub)) &&
        throw(IncompatibleOptimizerError("The algorithm $(typeof(alg)) does not support box constraints. Either remove the `lb` or `ub` bounds passed to `OptimizationProblem` or use a different algorithm."))
    SciMLBase.requiresbounds(alg) && isnothing(prob.lb) &&
        throw(IncompatibleOptimizerError("The algorithm $(typeof(alg)) requires box constraints. Either pass `lb` and `ub` bounds to `OptimizationProblem` or use a different algorithm."))
    !SciMLBase.allowsconstraints(alg) && !isnothing(prob.f.cons) &&
        throw(IncompatibleOptimizerError("The algorithm $(typeof(alg)) does not support constraints. Either remove the `cons` function passed to `OptimizationFunction` or use a different algorithm."))
    !SciMLBase.allowscallback(alg) && haskey(kwargs, :callback) &&
        throw(IncompatibleOptimizerError("The algorithm $(typeof(alg)) does not support callbacks, remove the `callback` keyword argument from the `solve` call."))
    SciMLBase.requiresgradient(alg) &&
        !(prob.f isa SciMLBase.AbstractOptimizationFunction) &&
        throw(IncompatibleOptimizerError("The algorithm $(typeof(alg)) requires gradients, hence use `OptimizationFunction` to generate them with an automatic differentiation backend e.g. `OptimizationFunction(f, AutoForwardDiff())` or pass it in with `grad` kwarg."))
    SciMLBase.requireshessian(alg) &&
        !(prob.f isa SciMLBase.AbstractOptimizationFunction) &&
        throw(IncompatibleOptimizerError("The algorithm $(typeof(alg)) requires hessians, hence use `OptimizationFunction` to generate them with an automatic differentiation backend e.g. `OptimizationFunction(f, AutoFiniteDiff(); kwargs...)` or pass them in with `hess` kwarg."))
    SciMLBase.requiresconsjac(alg) &&
        !(prob.f isa SciMLBase.AbstractOptimizationFunction) &&
        throw(IncompatibleOptimizerError("The algorithm $(typeof(alg)) requires constraint jacobians, hence use `OptimizationFunction` to generate them with an automatic differentiation backend e.g. `OptimizationFunction(f, AutoFiniteDiff(); kwargs...)` or pass them in with `cons` kwarg."))
    SciMLBase.requiresconshess(alg) &&
        !(prob.f isa SciMLBase.AbstractOptimizationFunction) &&
        throw(IncompatibleOptimizerError("The algorithm $(typeof(alg)) requires constraint hessians, hence use `OptimizationFunction` to generate them with an automatic differentiation backend e.g. `OptimizationFunction(f, AutoFiniteDiff(), AutoFiniteDiff(hess=true); kwargs...)` or pass them in with `cons` kwarg."))
    return
end

# Base solver dispatch functions (these will be extended by specific solver packages)
```