docs/src/examples/min_and_max.md

### Setup

In this tutorial we will show how to use
[Optimization.jl](https://docs.sciml.ai/Optimization/stable/) to find the maxima and minima
of solutions. Let's take a look at the double pendulum:

```@example minmax
#Constants and setup
# … (double pendulum definition and solve elided in this excerpt)
```

Let's find out what some of the local maxima and minima are. Optimization.jl can be used for this:

```@example minmax
f = (t, p) -> sol(first(t), idxs = 4)
```

`first(t)` is the same as `t[1]` which transforms the array of size 1 into a number. `idxs=4` is the same as `sol(first(t))[4]` but does the calculation without a temporary array and thus is faster. To find a local minimum, we can solve the optimization problem where the loss
function is `f`:

```@example minmax
using Optimization, OptimizationNLopt, ForwardDiff
optf = OptimizationFunction(f, AutoForwardDiff())
min_guess = 18.0
# NLopt optimizers expect a vector of decision variables
optprob = OptimizationProblem(optf, [min_guess])
opt = solve(optprob, NLopt.LD_LBFGS())
```

From this printout we see that the minimum is at `t=18.63` and the value is `-2.79e-2`. We
can get these in code-form via:

```@example minmax
println(opt.u)
println(opt.objective)
```

To get the maximum, we just minimize the negative of the function. Since the optimizer works on a vector `[t]`, we dereference it to a number using `first(t)`:

```@example minmax
fneg = (t, p) -> -sol(first(t), idxs = 4)
optf_max = OptimizationFunction(fneg, AutoForwardDiff())
optprob_max = OptimizationProblem(optf_max, [20.0])
opt_max = solve(optprob_max, NLopt.LD_LBFGS())
```

### Global Optimization

If we instead want to find global maxima and minima, we need to look somewhere else. For
this there are many choices. Pure Julia options are the
[BlackBoxOptim solvers within Optimization.jl](https://docs.sciml.ai/Optimization/stable/optimization_packages/blackboxoptim/),
but I will continue the story with the OptimizationNLopt methods. To do this, we simply
swap out to one of the
[global optimizers in the list](https://docs.sciml.ai/Optimization/stable/optimization_packages/nlopt/).
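
As a sketch of that swap (the bounds and the choice of `NLopt.GN_DIRECT()` here are illustrative assumptions, not from the original text), the same objective can be re-solved with a derivative-free global method over a box-constrained interval:

```julia
# Hypothetical sketch: global NLopt methods require finite box constraints,
# so we supply assumed bounds lb/ub on t and swap out the solver.
optprob_glob = OptimizationProblem(optf, [20.0]; lb = [0.0], ub = [40.0])
opt_glob = solve(optprob_glob, NLopt.GN_DIRECT())
```

Because global methods search the whole box rather than descending from the initial guess, the result no longer depends on `min_guess` the way the local L-BFGS solve did.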