========================================
In this tutorial we focus on a modification of the `Quadratic program
with box constraints` tutorial where the quadratic function is replaced by a
nonlinear function:

.. math::
    \mathbf{x} = \argmin_\mathbf{x} f(\mathbf{x}) \quad \text{s.t.} \quad
    \mathbf{x} \in \mathcal{I}_{\operatorname{Box}}

For this example we will use the well-known Rosenbrock
function:

.. math::
    f(\mathbf{x}) = (a - x)^2 + b(y - x^2)^2

where :math:`\mathbf{x}=[x, y]`, :math:`a=1`, and :math:`b=10`.
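
To see how this function behaves, here is a small self-contained sketch; the
helper name ``rosenbrock`` is ours for illustration and not part of the
tutorial code:

.. code-block:: python

    import numpy as np

    def rosenbrock(x, y, a=1.0, b=10.0):
        """Rosenbrock function f(x, y) = (a - x)^2 + b * (y - x^2)^2."""
        return (a - x) ** 2 + b * (y - x ** 2) ** 2

    # the global minimum sits at (x, y) = (a, a^2) = (1, 1), where f = 0
    print(rosenbrock(1.0, 1.0))  # -> 0.0
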
We will learn how to handle nonlinear functionals in convex optimization, and
more specifically dive into the details of the
:class:`pyproximal.proximal.Nonlinear` operator. This is a template operator
that users are expected to subclass, providing the nonlinear function itself,
its gradient, and the routine used to solve the associated proximal
optimization problem.
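
As a rough, hedged illustration only: assuming the template expects the
function value, its gradient, and an inner solver, a subclass for the
Rosenbrock function might look as sketched below. The helpers ``_funprox``
and ``_gradprox`` and the attributes ``x0`` and ``tau`` are assumptions about
PyProximal internals and should be checked against the library documentation:

.. code-block:: python

    import numpy as np
    from scipy.optimize import minimize
    from pyproximal.proximal import Nonlinear

    class Rosenbrock(Nonlinear):
        """Proximal operator of the Rosenbrock function (illustrative sketch)."""

        def fun(self, x):
            # f(x) = (1 - x_0)^2 + 10 * (x_1 - x_0^2)^2, i.e. a=1, b=10
            return (1 - x[0]) ** 2 + 10 * (x[1] - x[0] ** 2) ** 2

        def grad(self, x):
            # analytical gradient of f
            return np.array([-2 * (1 - x[0]) - 40 * x[0] * (x[1] - x[0] ** 2),
                             20 * (x[1] - x[0] ** 2)])

        def optimize(self):
            # solve the proximal subproblem min_y f(y) + 1/(2*tau) ||y - x||^2
            # with L-BFGS-B, via the (assumed) _funprox/_gradprox helpers
            sol = minimize(lambda y: self._funprox(y, self.tau), self.x0,
                           jac=lambda y: self._gradprox(y, self.tau),
                           method='L-BFGS-B')
            return sol.x
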
Finally, we can visualize the steps taken by each solver (collected via
callback functions during the iterations) on top of the contours of the
Rosenbrock function and of the indicator function of the box constraint:

fig, ax = contour_rosenbrock(x, y)
steps = np.array(steps)
# overlay the box-constraint indicator and mark the Rosenbrock minimum (1, 1)
ax.contour(X, Y, indic, colors='k')
ax.scatter(1, 1, c='k', s=300)
# trajectories of the four solvers
ax.plot(steps[:, 0], steps[:, 1], '.-k', lw=2, ms=20, alpha=0.4, label='GD')
ax.plot(xhist_pg[:, 0], xhist_pg[:, 1], '.-b', ms=20, lw=2, label='PG')
ax.plot(xhist_admm[:, 0], xhist_admm[:, 1], '.-g', ms=20, lw=2, label='ADMM')
ax.plot(xhist_admm_lbfgs[:, 0], xhist_admm_lbfgs[:, 1], '.-m', ms=20, lw=2,
        label='ADMM (LBFGS)')
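
For the labels to actually appear, a legend call would typically follow; this
is not part of the excerpt above and assumes ``matplotlib.pyplot`` was
imported as ``plt`` earlier in the tutorial:

.. code-block:: python

    ax.legend()
    plt.show()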