`docs/src/examples/rosenbrock.md` (21 additions, 6 deletions)
@@ -11,18 +11,33 @@ for common workflows of the package and give copy-pastable starting points.
the installation and usage of OptimizationOptimJL.jl package, see the
[Optim.jl page](@ref optim).

-The objective of this exercise is to determine the values $a$ and $b$ that minimize the Rosenbrock function, which is known to have a global minimum at $(a, a^2)$.
+The objective of this exercise is to determine the $(x, y)$ value pair that minimizes the result of a Rosenbrock function $f$ with some parameter values $a$ and $b$. The Rosenbrock function is useful for testing because it is known *a priori* to have a global minimum at $(a, a^2)$.
```math
-f(x, y; a, b) = \left(a - x\right)^2 + b \left(y - x^2\right)^2
+f(x,\,y;\,a,\,b) = \left(a - x\right)^2 + b \left(y - x^2\right)^2
```

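For readers who want to sanity-check the formula, a minimal Julia sketch of the scalar form follows; the helper name `rosenbrock_scalar` and the default parameter values are illustrative assumptions, not lines taken from the patch.

```julia
# Hypothetical helper mirroring f(x, y; a, b) from the formula above.
rosenbrock_scalar(x, y; a = 1.0, b = 100.0) = (a - x)^2 + b * (y - x^2)^2

rosenbrock_scalar(1.0, 1.0)  # 0.0, i.e. the global minimum at (a, a^2) = (1, 1)
```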
-The domains $x$ and $y$ are first captured as a new vector $\hat{x}$. Parameters $a$ and $b$ are captured as a new vector $\hat{p} and assigned values to produce the desired Rosenbrock function.
+The Optimization.jl interface expects functions to be defined with a vector of optimization arguments $\bar{x}$ and a vector of parameters $\bar{p}$, i.e.:
```math
-\hat{x} = \begin{bmatrix} x \\ y \end{bmatrix} \\
-\hat{p} = \begin{bmatrix} a \\ b \end{bmatrix} = \begin{bmatrix} 1 \\ 100 \end{bmatrix}
+f(\bar{x};\,\bar{p}) = \left(\bar{p}_1 - \bar{x}_1\right)^2 + \bar{p}_2 \left(\bar{x}_2 - \bar{x}_1^2\right)^2
+```
+
+Parameters $a$ and $b$ are captured in a vector $\bar{p}$ and assigned some arbitrary values to produce a particular Rosenbrock function to be minimized.
+
+```math
+\bar{p} = \begin{bmatrix} a \\ b \end{bmatrix} = \begin{bmatrix} 1 \\ 100 \end{bmatrix}
+```
+
+The original $x$ and $y$ domains are captured in a vector $\bar{x}$.
+
+```math
+\bar{x} = \begin{bmatrix} x \\ y \end{bmatrix}
```

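The two-argument convention the added text describes might look like the following Julia sketch; the function name `rosenbrock` and the literal parameter values are assumptions used for illustration, not part of the patch.

```julia
# First argument: optimization variables x̄ = [x, y]; second argument: parameters p̄ = [a, b].
rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

p = [1.0, 100.0]           # p̄ = [a, b]
rosenbrock([1.0, 1.0], p)  # 0.0 at the known minimum (a, a^2)
```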
-An optimization problem can now be defined and solved to estimate the values for $\hat{x}$ that minimize the output of this function.
+An initial estimate $\bar{x}_0$ of the location of the minimum is required to initialize the optimizer.
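A hedged end-to-end sketch of the workflow the new text is building toward, assuming the standard `OptimizationProblem`/`solve` API from Optimization.jl with the Nelder-Mead solver provided through OptimizationOptimJL; the starting point `zeros(2)` is an arbitrary choice.

```julia
using Optimization, OptimizationOptimJL

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
p  = [1.0, 100.0]   # p̄ = [a, b]
x0 = zeros(2)       # initial estimate (x̄₀ in the text above)

prob = OptimizationProblem(rosenbrock, x0, p)
sol  = solve(prob, NelderMead())  # derivative-free, so no gradient setup is needed
sol.u                             # expected to approach [1.0, 1.0] = (a, a^2)
```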