@@ -19,39 +19,28 @@ Let's optimize the classical _Rosenbrock function_ in two dimensions.
 ```@example optimization
 using ModelingToolkit, Optimization, OptimizationOptimJL
 @variables begin
-    x, [bounds = (-2.0, 2.0), guess = 1.0]
-    y, [bounds = (-1.0, 3.0), guess = 3.0]
+    x = 1.0, [bounds = (-2.0, 2.0)]
+    y = 3.0, [bounds = (-1.0, 3.0)]
 end
-@parameters a=1 b=1
+@parameters a=1.0 b=1.0
 rosenbrock = (a - x)^2 + b * (y - x^2)^2
 @mtkbuild sys = OptimizationSystem(rosenbrock, [x, y], [a, b])
 ```
 
 Every optimization problem consists of a set of optimization variables.
-In this case, we create two variables: `x` and `y`.
+In this case, we create two variables: `x` and `y`,
+with initial guesses `1` and `3` for their optimal values.
 Additionally, we assign box constraints for each of them, using `bounds`,
-as well as an initial guess for their optimal values, using `guess`.
-Both bounds and guess are called symbolic metadata.
+which is an example of symbolic metadata.
 For more information, take a look at the symbolic metadata
-[documentation page](symbolic_metadata).
+[documentation page](@ref symbolic_metadata).
 
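+As a quick check (a sketch, assuming a recent ModelingToolkit version), the
+`bounds` metadata can be read back from a variable with the `getbounds` accessor:
+
+```@example optimization
+# Read the box-constraint metadata attached to the symbolic variable `x`.
+ModelingToolkit.getbounds(x)
+```
+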
 We also create two parameters with `@parameters`.
 Parameters are useful if you want to solve the same optimization problem multiple times,
 with different values for these parameters.
 Default values for these parameters can also be assigned; here `1` is used for both `a` and `b`.
 These optimization variables and parameters are used in an objective function, here the Rosenbrock function.
 
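+Written out, the objective used here is
+
+```math
+f(x, y) = (a - x)^2 + b \left( y - x^2 \right)^2,
+```
+
+which for ``b > 0`` attains its global minimum ``f = 0`` at ``(x, y) = (a, a^2)``,
+so with the values used below (``a = 1``, ``b = 100``) the optimum lies at ``(1, 1)``.
+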
-A visualization of the Rosenbrock function is depicted below.
-
-```@example optimization
-using Plots
-x_plot = -2:0.01:2
-y_plot = -1:0.01:3
-contour(
-    x_plot, y_plot, (x, y) -> (1 - x)^2 + 100 * (y - x^2)^2, fill = true, color = :viridis,
-    ratio = :equal, xlims = (-2, 2))
-```
-
 Next, the actual `OptimizationProblem` can be created.
 The initial guesses for the optimization variables can be overwritten via an array of `Pairs`
 in the second argument of `OptimizationProblem`.
@@ -64,10 +53,20 @@ u0 = [y => 2.0]
 p = [b => 100.0]
 
 prob = OptimizationProblem(sys, u0, p, grad = true, hess = true)
-solve(prob, GradientDescent())
+u_opt = solve(prob, GradientDescent())
 ```
 
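+The returned solution object can be inspected directly. A sketch, assuming the
+Optimization.jl solution type exposes the `u` (minimizer) and `objective` fields:
+
+```@example optimization
+# The optimizer location and the objective value reached there.
+u_opt.u, u_opt.objective
+```
+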
-We see that the optimization result corresponds to the minimum in the figure.
+A visualization of the Rosenbrock function, together with the minimum found above, is depicted below.
+
+```@example optimization
+using Plots
+x_plot = -2:0.01:2
+y_plot = -1:0.01:3
+contour(
+    x_plot, y_plot, (x, y) -> (1 - x)^2 + 100 * (y - x^2)^2, fill = true, color = :viridis,
+    ratio = :equal, xlims = (-2, 2))
+scatter!([u_opt[1]], [u_opt[2]], ms = 10, label = "minimum")
+```
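+
+Because `a` and `b` are parameters, the same system can be re-solved with other
+parameter values without rebuilding it. A sketch reusing the API shown above
+(the name `prob_b1` is just for illustration):
+
+```@example optimization
+# Solve the same system again, this time with `b = 1.0` instead of 100.
+prob_b1 = OptimizationProblem(sys, u0, [b => 1.0], grad = true, hess = true)
+solve(prob_b1, GradientDescent())
+```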
 
 ## Rosenbrock Function with Constraints
 
@@ -79,10 +78,10 @@ Let's add an inequality constraint to the previous example:
 using ModelingToolkit, Optimization, OptimizationOptimJL
 
 @variables begin
-    x, [bounds = (-2.0, 2.0), guess = 1.0]
-    y, [bounds = (-1.0, 3.0), guess = 2.0]
+    x = 0.14, [bounds = (-2.0, 2.0)]
+    y = 0.14, [bounds = (-1.0, 3.0)]
 end
-@parameters a=1 b=100
+@parameters a=1.0 b=100.0
 rosenbrock = (a - x)^2 + b * (y - x^2)^2
 cons = [
     x^2 + y^2 ≲ 1
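+    # ≲ (typed as \lesssim<tab>) marks a "less than" inequality constraint;
+    # ≳ (\gtrsim<tab>) expresses the opposite direction.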