Commit 7cd30a9 (parent 2bef546): rewrite optimization tutorial

docs/src/tutorials/optimization.md: 73 additions & 62 deletions
# Modeling Optimization Problems

ModelingToolkit.jl is not only useful for generating initial value problems (`ODEProblem`).
The package can also build optimization systems.

!!! note

    The high-level `@mtkmodel` macro used in the
    [getting started tutorial](@ref getting_started)
    is not yet compatible with `OptimizationSystem`.
    We thus have to use a lower-level interface to define optimization systems.
    For an introduction to this interface, read the
    [programmatically generating ODESystems tutorial](@ref programmatically).

## Unconstrained Rosenbrock Function

Let's optimize the classical _Rosenbrock function_ in two dimensions.

```@example optimization
using ModelingToolkit, Optimization, OptimizationOptimJL
@variables begin
    x, [bounds = (-2.0, 2.0), guess = 1.0]
    y, [bounds = (-1.0, 3.0), guess = 3.0]
end
@parameters a=1 b=1
rosenbrock = (a - x)^2 + b * (y - x^2)^2
@mtkbuild sys = OptimizationSystem(rosenbrock, [x, y], [a, b])
```

Every optimization problem consists of a set of optimization variables.
In this case, we create two variables, `x` and `y`.
Additionally, we assign box constraints to each of them using `bounds`,
as well as an initial guess for their optimal values using `guess`.
Both `bounds` and `guess` are examples of symbolic metadata.
For more information, take a look at the symbolic metadata
[documentation page](symbolic_metadata).

We also create two parameters with `@parameters`.
Parameters are useful if you want to solve the same optimization problem multiple times
with different values for these parameters.
Default values can also be assigned; here `1` is used for both `a` and `b`.
The optimization variables and parameters are then combined into an objective function,
here the Rosenbrock function.

A visualization of the Rosenbrock function is depicted below.

```@example optimization
using Plots
x_plot = -2:0.01:2
y_plot = -1:0.01:3
contour(
    x_plot, y_plot, (x, y) -> (1 - x)^2 + 100 * (y - x^2)^2, fill = true, color = :viridis,
    ratio = :equal, xlims = (-2, 2))
```

Next, the actual `OptimizationProblem` can be created.
The initial guesses for the optimization variables can be overwritten via an array of `Pair`s
in the second argument of `OptimizationProblem`.
Values for the parameters of the system can likewise be overwritten from their defaults
in the third argument of `OptimizationProblem`.
ModelingToolkit is also capable of constructing analytical gradients and Hessians of the objective function.

```@example optimization
u0 = [y => 2.0]
p = [b => 100.0]

prob = OptimizationProblem(sys, u0, p, grad = true, hess = true)
solve(prob, GradientDescent())
```

We see that the optimization result corresponds to the minimum in the figure.
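Because `a` and `b` are symbolic parameters, the same compiled problem can be re-solved with different parameter values. A minimal, untested sketch using SciML's `remake` function (assuming the `prob` built above is still in scope, and that `remake` accepts a symbolic parameter map for MTK-generated problems):

```julia
# Hypothetical re-solve: reuse the compiled problem with a different `b`,
# instead of rebuilding the OptimizationSystem from scratch.
prob2 = remake(prob, p = [b => 10.0])
solve(prob2, GradientDescent())
```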
## Rosenbrock Function with Constraints

ModelingToolkit is also capable of handling more complicated constraints than box constraints.
Non-linear equality and inequality constraints can be added to the `OptimizationSystem`.
Let's add an inequality constraint to the previous example:

```@example optimization_constrained
using ModelingToolkit, Optimization, OptimizationOptimJL

@variables begin
    x, [bounds = (-2.0, 2.0), guess = 1.0]
    y, [bounds = (-1.0, 3.0), guess = 2.0]
end
@parameters a=1 b=100
rosenbrock = (a - x)^2 + b * (y - x^2)^2
cons = [
    x^2 + y^2 ≲ 1
]
@mtkbuild sys = OptimizationSystem(rosenbrock, [x, y], [a, b], constraints = cons)
prob = OptimizationProblem(sys, [], grad = true, hess = true, cons_j = true, cons_h = true)
u_opt = solve(prob, IPNewton())
```

Inequality constraints are constructed via a `≲` (or `≳`).
[To write these symbols in your own code, type `\lesssim` or `\gtrsim` and then press tab.](https://docs.julialang.org/en/v1/manual/unicode-input/)
An equality constraint can be specified via a `~`, e.g., `x^2 + y^2 ~ 1`.
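For comparison, an equality-constrained variant of the system above might look as follows. This is an untested sketch, and the names `cons_eq`, `sys_eq`, and `prob_eq` are illustrative:

```julia
# Constrain the minimizer to lie exactly on the unit circle.
cons_eq = [
    x^2 + y^2 ~ 1
]
@mtkbuild sys_eq = OptimizationSystem(rosenbrock, [x, y], [a, b], constraints = cons_eq)
prob_eq = OptimizationProblem(sys_eq, [], grad = true, hess = true, cons_j = true, cons_h = true)
solve(prob_eq, IPNewton())
```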
A visualization of the Rosenbrock function and the inequality constraint is depicted below.

```@example optimization_constrained
using Plots
x_plot = -2:0.01:2
y_plot = -1:0.01:3
contour(
    x_plot, y_plot, (x, y) -> (1 - x)^2 + 100 * (y - x^2)^2, fill = true, color = :viridis,
    ratio = :equal, xlims = (-2, 2))
contour!(x_plot, y_plot, (x, y) -> x^2 + y^2, levels = [1], color = :lightblue, line = 4)
scatter!([u_opt[1]], [u_opt[2]], ms = 10, label = "minimum")
```

## Nested Systems

Needs more text, but it's super cool and auto-parallelizes and sparsifies too.
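A rough, untested sketch of what such composition could look like, assuming `OptimizationSystem` supports ModelingToolkit's usual `systems` keyword for subsystems (all names here are hypothetical):

```julia
using ModelingToolkit

@variables z [bounds = (-1.0, 1.0), guess = 0.0]

# Two named subsystems, each contributing its own objective term.
@named sub1 = OptimizationSystem(z^2, [z], [])
@named sub2 = OptimizationSystem((z - 1)^2, [z], [])

# A parent system coupling the subsystems through their
# namespaced variables (`sub1.z`, `sub2.z`).
@mtkbuild parent = OptimizationSystem(
    sub1.z * sub2.z, [], [], systems = [sub1, sub2])
```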
