Create ode.md #924
Open

ParasPuneetSingh wants to merge 31 commits into SciML:master from ParasPuneetSingh:master
Changes from 3 commits

Commits (31):
- 2646d29 MOO Docs updated blackboxoptim.md (ParasPuneetSingh)
- 79a297b Update evolutionary.md (ParasPuneetSingh)
- 1394ec4 Update metaheuristics.md (ParasPuneetSingh)
- 5af0e2b Update docs/src/optimization_packages/blackboxoptim.md (Vaibhavdixit02)
- 37aa52b Update Project.toml (ParasPuneetSingh)
- 81f5e8a Update Project.toml (ParasPuneetSingh)
- 238dcbe Update docs/src/optimization_packages/blackboxoptim.md (Vaibhavdixit02)
- 9198cb4 Update metaheuristics.md (ParasPuneetSingh)
- be6388b Update Project.toml (ParasPuneetSingh)
- 5b3a0b5 Update blackboxoptim.md (ParasPuneetSingh)
- 6a1dceb Update evolutionary.md (ParasPuneetSingh)
- c9d892f Update docs/src/optimization_packages/metaheuristics.md (Vaibhavdixit02)
- 564f53a Update docs/src/optimization_packages/evolutionary.md (Vaibhavdixit02)
- b1fe7da Update metaheuristics.md (ParasPuneetSingh)
- 89abf73 Update Project.toml (ParasPuneetSingh)
- 5aa1289 Update metaheuristics.md (ParasPuneetSingh)
- fb0181b Update blackboxoptim.md (ParasPuneetSingh)
- 6d4c6ba Update metaheuristics.md (ParasPuneetSingh)
- d8fea39 Update OptimizationBBO.jl (ParasPuneetSingh)
- 032e5b9 Update runtests.jl (ParasPuneetSingh)
- a6ae41e Create ode.md (ParasPuneetSingh)
- 99d06ef Update docs/src/optimization_packages/ode.md (ParasPuneetSingh)
- b7e2927 Update ode.md (ParasPuneetSingh)
- 36be3e8 Update ode.md (ParasPuneetSingh)
- 1cd41ca Update ode.md (ParasPuneetSingh)
- 66919e4 DAE based solvers (ParasPuneetSingh)
- ebe2aec MOO tests and code updates (ParasPuneetSingh)
- 0cb0e9c Merge branch 'master' of https://github.com/ParasPuneetSingh/Optimiza… (ParasPuneetSingh)
- d308614 Merge branch 'SciML:master' into master (ParasPuneetSingh)
- 7fbca50 import changes (ParasPuneetSingh)
- 129389e Merge branch 'master' of https://github.com/ParasPuneetSingh/Optimiza… (ParasPuneetSingh)
docs/src/optimization_packages/ode.md (new file)

@@ -0,0 +1,62 @@
# OptimizationODE.jl

**OptimizationODE.jl** provides ODE-based optimization methods as a solver plugin for [SciML's Optimization.jl](https://github.com/SciML/Optimization.jl). It wraps various ODE solvers to perform gradient-based optimization using continuous-time dynamics.

## Installation
```julia
using Pkg
Pkg.add("OptimizationODE")
```
## Usage

```julia
using OptimizationODE, Optimization, ADTypes, SciMLBase

# Objective: a simple convex quadratic, f(x) = sum(abs2, x)
function f(x, p)
    return sum(abs2, x)
end

# In-place analytic gradient of the objective
function g!(g, x, p)
    @. g = 2 * x
end

x0 = [2.0, -3.0]  # initial guess
p = []            # no parameters are used by this objective

# Supply the gradient manually instead of using automatic differentiation
f_manual = OptimizationFunction(f, SciMLBase.NoAD(); grad = g!)
prob_manual = OptimizationProblem(f_manual, x0)

opt = ODEGradientDescent(dt=0.01)
sol = solve(prob_manual, opt; maxiters=50_000)

@show sol.u          # minimizer, approaches [0.0, 0.0]
@show sol.objective  # objective value at the minimizer
```
## Available Optimizers

All provided optimizers are **gradient-based local optimizers** that solve optimization problems by integrating gradient-based ODEs to convergence:

* `ODEGradientDescent(dt=...)` — performs basic gradient descent using the explicit Euler method. This is a simple and efficient method suitable for small-scale or well-conditioned problems.

* `RKChebyshevDescent()` — uses the ROCK2 solver, a stabilized explicit Runge-Kutta method suitable for stiff problems. It allows larger step sizes while maintaining stability.

* `RKAccelerated()` — leverages the Tsit5 method, a 5th-order Runge-Kutta solver that achieves faster convergence for smooth problems by improving integration accuracy.

* `HighOrderDescent()` — applies Vern7, a high-order (7th-order) explicit Runge-Kutta method for even more accurate integration. This can be beneficial for problems requiring high precision.

You can also define a custom optimizer using the generic `ODEOptimizer(solver; dt=nothing)` constructor by supplying any ODE solver supported by [OrdinaryDiffEq.jl](https://docs.sciml.ai/DiffEqDocs/stable/solvers/ode_solve/).
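For instance, a custom optimizer built on another OrdinaryDiffEq.jl integrator might look like the following sketch; the choice of `BS3()` here is only an illustration, and any solver supported by OrdinaryDiffEq.jl should work the same way:

```julia
using OptimizationODE, OrdinaryDiffEq

# Wrap the adaptive BS3 integrator from OrdinaryDiffEq.jl in the generic constructor.
# dt can be omitted because BS3 chooses its own step sizes adaptively.
opt_custom = ODEOptimizer(BS3())

# Reuse the problem defined in the Usage section above.
sol = solve(prob_manual, opt_custom; maxiters = 10_000)
@show sol.u
```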
## Interface Details

All optimizers require gradient information, supplied either via automatic differentiation or a manually provided `grad!`. The optimization is performed by integrating the ODE defined by the negative gradient until a steady state is reached.
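As a sketch of the automatic-differentiation route (assuming the usual Optimization.jl AD backends such as `AutoForwardDiff` are supported here, as they are for other solver wrappers):

```julia
using OptimizationODE, Optimization, ADTypes, ForwardDiff

# Same quadratic objective as above, but the gradient is now computed by ForwardDiff
f(x, p) = sum(abs2, x)

x0 = [2.0, -3.0]

f_ad = OptimizationFunction(f, ADTypes.AutoForwardDiff())
prob_ad = OptimizationProblem(f_ad, x0)

sol = solve(prob_ad, RKAccelerated(); maxiters = 10_000)
@show sol.u, sol.objective
```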
### Keyword Arguments

* `dt` — time step size (only for `ODEGradientDescent`).
* `maxiters` — maximum number of ODE steps.
* `callback` — function to observe progress.
* `progress=true` — enables live progress display.
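These keywords might be combined as in the sketch below; the callback signature shown follows the general Optimization.jl convention of `callback(state, loss)` returning `true` to stop early, which is an assumption for this package:

```julia
# Observe progress at each step and keep optimizing (return false to continue).
monitor = (state, loss) -> begin
    @info "current objective" loss
    return false
end

sol = solve(prob_manual, ODEGradientDescent(dt = 0.01);
            maxiters = 50_000, callback = monitor, progress = true)
```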