# Modeling Optimization Problems

ModelingToolkit.jl is not only useful for generating initial value problems (`ODEProblem`).
The package can also build optimization systems.

!!! note

    The high-level `@mtkmodel` macro used in the
    [getting started tutorial](@ref getting_started)
    is not yet compatible with `OptimizationSystem`.
    We thus have to use a lower-level interface to define optimization systems.
    For an introduction to this interface, read the
    [programmatically generating ODESystems tutorial](@ref programmatically).

## Unconstrained Rosenbrock Function

Let's optimize the classical _Rosenbrock function_ in two dimensions.

```@example optimization
using ModelingToolkit, Optimization, OptimizationOptimJL
@variables begin
    x = 1.0, [bounds = (-2.0, 2.0)]
    y = 3.0, [bounds = (-1.0, 3.0)]
end
@parameters a=1.0 b=1.0
rosenbrock = (a - x)^2 + b * (y - x^2)^2
@mtkbuild sys = OptimizationSystem(rosenbrock, [x, y], [a, b])
```

Every optimization problem consists of a set of optimization variables.
In this case, we create two variables, `x` and `y`,
with initial guesses `1` and `3` for their optimal values.
Additionally, we assign box constraints to each of them using the `bounds` metadata.
`bounds` is an example of symbolic metadata;
for more information, take a look at the symbolic metadata
[documentation page](@ref symbolic_metadata).
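
The attached metadata can be inspected directly on the symbolic variables. A small illustrative check, assuming the `getbounds` accessor from the symbolic metadata interface linked above:

```@example optimization
# Query the box constraints attached to the optimization variables.
ModelingToolkit.getbounds(x), ModelingToolkit.getbounds(y)
```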

We also create two parameters with `@parameters`.
Parameters are useful if you want to solve the same optimization problem multiple times
with different values for these parameters.
Default values can also be assigned to the parameters; here `1` is used for both `a` and `b`.
The optimization variables and parameters are then used to build the objective function, here the Rosenbrock function.

Next, the actual `OptimizationProblem` can be created.
The initial guesses for the optimization variables can be overwritten via an array of `Pair`s
in the second argument of `OptimizationProblem`.
Values for the parameters of the system can also be overwritten from their default values
in the third argument of `OptimizationProblem`.
ModelingToolkit is also capable of constructing analytical gradients and Hessians of the objective function,
which is requested here with the `grad` and `hess` keyword arguments.

```@example optimization
u0 = [y => 2.0]
p = [b => 100.0]

prob = OptimizationProblem(sys, u0, p, grad = true, hess = true)
u_opt = solve(prob, GradientDescent())
```
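
Because `a` and `b` are parameters, the same symbolic system can be reused to set up problems with other parameter values. A minimal sketch (the name `prob2` and the value `10.0` are illustrative choices, not part of the original example):

```@example optimization
# Reuse the same system with a smaller value for `b`; `a` keeps its default value.
prob2 = OptimizationProblem(sys, u0, [b => 10.0], grad = true, hess = true)
solve(prob2, GradientDescent())
```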

A visualization of the Rosenbrock function is depicted below.

```@example optimization
using Plots
x_plot = -2:0.01:2
y_plot = -1:0.01:3
contour(
    x_plot, y_plot, (x, y) -> (1 - x)^2 + 100 * (y - x^2)^2, fill = true, color = :viridis,
    ratio = :equal, xlims = (-2, 2))
scatter!([u_opt[1]], [u_opt[2]], ms = 10, label = "minimum")
```

## Rosenbrock Function with Constraints

ModelingToolkit is also capable of handling more complicated constraints than box constraints.
Non-linear equality and inequality constraints can be added to the `OptimizationSystem`.
Let's add an inequality constraint to the previous example:

```@example optimization_constrained
using ModelingToolkit, Optimization, OptimizationOptimJL

@variables begin
    x = 0.14, [bounds = (-2.0, 2.0)]
    y = 0.14, [bounds = (-1.0, 3.0)]
end
@parameters a=1.0 b=100.0
rosenbrock = (a - x)^2 + b * (y - x^2)^2
cons = [
    x^2 + y^2 ≲ 1
]
@mtkbuild sys = OptimizationSystem(rosenbrock, [x, y], [a, b], constraints = cons)
prob = OptimizationProblem(sys, [], grad = true, hess = true, cons_j = true, cons_h = true)
u_opt = solve(prob, IPNewton())
```

Inequality constraints are constructed via `≲` (or `≳`);
to type these symbols in your own code, write `\lesssim` or `\gtrsim` and press Tab
(see the [Julia Unicode input documentation](https://docs.julialang.org/en/v1/manual/unicode-input/)).
An equality constraint can be specified via a `~`, e.g., `x^2 + y^2 ~ 1`;
a sketch using such an equality constraint is given at the end of this section.

A visualization of the Rosenbrock function and the inequality constraint is depicted below.

```@example optimization_constrained
using Plots
x_plot = -2:0.01:2
y_plot = -1:0.01:3
contour(
    x_plot, y_plot, (x, y) -> (1 - x)^2 + 100 * (y - x^2)^2, fill = true, color = :viridis,
    ratio = :equal, xlims = (-2, 2))
contour!(x_plot, y_plot, (x, y) -> x^2 + y^2, levels = [1], color = :lightblue, line = 4)
scatter!([u_opt[1]], [u_opt[2]], ms = 10, label = "minimum")
```
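
As mentioned above, equality constraints are written with `~` instead of `≲`. The following sketch swaps the inequality for the equality constraint `x^2 + y^2 ~ 1`; the session name, the initial guesses, and the reuse of `IPNewton` are illustrative assumptions rather than part of the original tutorial:

```@example optimization_equality
using ModelingToolkit, Optimization, OptimizationOptimJL

@variables begin
    x = 1.0, [bounds = (-2.0, 2.0)]
    y = 0.0, [bounds = (-1.0, 3.0)]
end
@parameters a=1.0 b=100.0
rosenbrock = (a - x)^2 + b * (y - x^2)^2
# The optimum is now required to lie exactly on the unit circle.
cons = [
    x^2 + y^2 ~ 1
]
@mtkbuild sys = OptimizationSystem(rosenbrock, [x, y], [a, b], constraints = cons)
prob = OptimizationProblem(sys, [], grad = true, hess = true, cons_j = true, cons_h = true)
solve(prob, IPNewton())
```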

## Nested Systems

Needs more text, but it's super cool and auto-parallelizes and sparsifies too.