docs/src/optimization_packages/ode.md (1 addition, 8 deletions)
@@ -35,7 +35,7 @@ sol = solve(prob_manual, opt; maxiters=50_000)
@show sol.objective
```
-## Available Optimizers
+## Local-gradient based Optimizers
All provided optimizers are **gradient-based local optimizers** that solve optimization problems by integrating gradient-based ODEs to convergence:
@@ -53,10 +53,3 @@ You can also define a custom optimizer using the generic `ODEOptimizer(solver; d
All optimizers require gradient information (either via automatic differentiation or manually provided `grad!`). The optimization is performed by integrating the ODE defined by the negative gradient until a steady state is reached.
-
-### Keyword Arguments
-
-* `dt` — time step size (only for `ODEGradientDescent`).
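The retained paragraph above describes the mechanism only in words: these optimizers integrate the gradient-flow ODE du/dt = -∇f(u) until a steady state is reached. The following sketch illustrates that idea directly with OrdinaryDiffEq.jl and ForwardDiff.jl. It is not OptimizationODE's implementation; the objective, starting point, time horizon, tolerances, and solver choice (`Tsit5`) are all illustrative assumptions.

```julia
# Illustrative sketch of the gradient-flow idea described in this doc change:
# integrate du/dt = -∇f(u) and read off the state once it has (nearly) stopped
# moving. This is NOT the package's internal implementation.
using OrdinaryDiffEq, ForwardDiff

# Assumed example objective (Rosenbrock); any smooth function works here.
f(u) = (1.0 - u[1])^2 + 100.0 * (u[2] - u[1]^2)^2

# ODE right-hand side: the negative gradient of the objective.
function gradient_flow!(du, u, p, t)
    ForwardDiff.gradient!(du, f, u)
    du .*= -1
end

u0    = [-1.2, 1.0]              # assumed starting point
tspan = (0.0, 1_000.0)           # long horizon stands in for "until steady state"
prob  = ODEProblem(gradient_flow!, u0, tspan)
sol   = solve(prob, Tsit5(); abstol = 1e-10, reltol = 1e-10)

@show sol.u[end]                 # approaches the minimizer [1.0, 1.0]
@show f(sol.u[end])
```

In the package itself, the corresponding call is the standard Optimization.jl interface shown in the snippet at the top of this diff, `solve(prob_manual, opt; maxiters=50_000)`, with the gradient supplied either by automatic differentiation or a user-provided `grad!`.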