Hence $y$ parameterises $x$ in the null space.

Active set methods form and operate with $Y$ and $Z$ by identifying a
matrix $V\in\R^{(n-r)\times n}$ such that
$\displaystyle B_{\cal{A}} = \left[ \begin{matrix}A_{\cal{A}}\\ V\end{matrix}\right]$
is well conditioned, so that
$\left[ \begin{matrix}Y&Z\end{matrix}\right] = B_{\cal{A}}^{-1}$. Note
that $Y$ and $Z$ are not formed explicitly since, for example,
$Yb_{\cal{A}}$ can be formed by factorizing $B_{\cal{A}}$ and solving
$$
Z^TQZy = -Z^T(c+QYb_{\cal{A}})\qquad(3)
$$
The matrix $Z^TQZ$ is known as the **reduced Hessian**.

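As a small illustration of the partition $[Y\;Z]=B_{\cal{A}}^{-1}$ and the reduced system (3), the following numpy sketch solves an equality-constrained QP. The helper name is invented, and the choice of $V$ is left to the caller; a production code would factorize $B_{\cal{A}}$ rather than invert it, as the text notes.

```python
import numpy as np

def nullspace_solve(Q, c, A_act, b_act, V):
    """Solve min 1/2 x^T Q x + c^T x  s.t.  A_act x = b_act
    via the partition [Y Z] = B_A^{-1}, with B_A = [A_act; V].

    V is any matrix making B_A well conditioned (chosen by the caller);
    its rows complete the rows of A_act to a basis of R^n.
    """
    r = A_act.shape[0]
    B = np.vstack([A_act, V])
    YZ = np.linalg.inv(B)             # illustration only: factorize in practice
    Y, Z = YZ[:, :r], YZ[:, r:]
    # reduced system (3): Z^T Q Z y = -Z^T (c + Q Y b_act)
    y = np.linalg.solve(Z.T @ Q @ Z, -Z.T @ (c + Q @ (Y @ b_act)))
    return Y @ b_act + Z @ y

# Example: min ||x||^2 / 2 subject to x_1 + x_2 = 1, minimizer (0.5, 0.5)
x = nullspace_solve(np.eye(2), np.zeros(2),
                    np.array([[1.0, 1.0]]), np.array([1.0]),
                    np.array([[1.0, -1.0]]))
```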
### Outline ASM algorithm

The following algorithmic definition is from Fletcher's _Practical
Methods of Optimization_, Chapter 10, which also illustrates it
qualitatively. Key to the definition is the **equality problem**,
$$
\min~ \frac{1}{2}\delta^TQ\delta + \delta^Tg^{(k)}\quad \textrm{s.t.}~ a_i^T\delta=0,~ i\in\cal{A},
$$
where $g^{(k)}=Qx^{(k)}+c$. This problem minimizes the QP objective in the null space of the constraints active at the point $x^{(k)}$. The set of inequality constraints in the QP is denoted $\cal{I}$.

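A minimal numpy sketch of this equality problem, computing the step $\delta$ and the multipliers $\lambda$ from a null-space basis obtained via the SVD (a convenient stand-in for the $B_{\cal{A}}^{-1}$ partition above; the helper name is invented):

```python
import numpy as np

def equality_qp(Q, g, A_act):
    """Solve min 1/2 d^T Q d + d^T g  s.t.  A_act d = 0.

    Z spans the null space of A_act; the reduced system gives the
    null-space coordinates y, and the multipliers satisfy the KKT
    condition A_act^T lambda = Q d + g.
    """
    _, sv, Vt = np.linalg.svd(A_act)
    rank = int(np.sum(sv > 1e-12))
    Z = Vt[rank:].T                       # n x (n - rank) null-space basis
    y = np.linalg.solve(Z.T @ Q @ Z, -Z.T @ g)
    d = Z @ y
    lam, *_ = np.linalg.lstsq(A_act.T, Q @ d + g, rcond=None)
    return d, lam

# Example: Q = I, g = (1, 1), single constraint d_1 - d_2 = 0.
# On the line d_1 = d_2 = t the objective is t^2 + 2t, minimized at t = -1.
d, lam = equality_qp(np.eye(2), np.array([1.0, 1.0]),
                     np.array([[1.0, -1.0]]))
```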
1. Given feasible $x^{(1)}$ and $\cal{A}$, set $k=1$.
2. If $\delta=0$ does not solve the equality problem, go to 4.
3. Compute Lagrange multipliers $\lambda^{(k)}$ for the equality
   problem and use the following to determine $q$. If
   $\textrm{sgn}(q)\lambda^{(k)}_q\ge0$, then terminate with
   $x^*=x^{(k)}$; otherwise remove $q$ from $\cal{A}$.
$$
q=\argmin_{i\in\cal{A}\cap\cal{I}} \textrm{sgn}(i)\lambda^{(k)}_i\quad\textrm{where}~\textrm{sgn}(i) = \begin{cases}\phantom{-}1,~~a_i^Tx=L_i\\ -1,~~a_i^Tx=U_i\end{cases}
$$
4. Solve the equality problem, giving the search direction $s^{(k)}=\delta$.
5. Find $\alpha^{(k)}$ solving the following **linesearch problem** and set $x^{(k+1)} = x^{(k)} + \alpha^{(k)}s^{(k)}$.
$$
\alpha^{(k)} = \min\left(1,
\min_{i\not\in\cal{A},\, a_i^Ts^{(k)}<0} \frac{L_i-a_i^Tx^{(k)}}{a_i^Ts^{(k)}},
\min_{i\not\in\cal{A},\, a_i^Ts^{(k)}>0} \frac{U_i-a_i^Tx^{(k)}}{a_i^Ts^{(k)}}
\right)
$$
6. If $\alpha^{(k)}<1$, then add $p$ to $\cal{A}$, where $p\not\in\cal{A}$ is a constraint yielding $\alpha^{(k)}$.
7. Set $k=k+1$ and go to 2.

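For concreteness, the steps above can be sketched for the special case of simple bounds $L\le x\le U$, where $a_i=e_i$ and the equality problem reduces to a linear solve over the free variables. This is a minimal illustration under those assumptions, not Fletcher's full method, and the function name is invented.

```python
import numpy as np

def asm_box_qp(Q, c, L, U, x0, active0, tol=1e-10, max_iter=100):
    """Outline ASM for min 1/2 x^T Q x + c^T x  s.t.  L <= x <= U.

    `active0` maps a variable index to sgn(i): +1 at a lower bound,
    -1 at an upper bound. With a_i = e_i, the equality problem is
    Q_ff * delta_f = -g_f over the free variables f.
    """
    x = np.asarray(x0, dtype=float).copy()
    active = dict(active0)
    n = len(x)
    for _ in range(max_iter):
        g = Q @ x + c                          # g^(k) = Q x^(k) + c
        free = [i for i in range(n) if i not in active]
        s = np.zeros(n)
        if free:
            s[free] = np.linalg.solve(Q[np.ix_(free, free)], -g[free])
        if np.allclose(s, 0.0, atol=tol):      # delta = 0 solves the equality problem
            if not active:
                return x
            # multipliers at delta = 0: lambda_i = g_i for active bounds
            q = min(active, key=lambda i: active[i] * g[i])
            if active[q] * g[q] >= -tol:
                return x                       # sgn(q) * lambda_q >= 0: optimal
            del active[q]                      # drop the most negative multiplier
            continue
        # linesearch: largest alpha <= 1 keeping L <= x + alpha * s <= U
        alpha, block = 1.0, None
        for i in free:
            if s[i] < -tol:
                ratio, sgn = (L[i] - x[i]) / s[i], +1
            elif s[i] > tol:
                ratio, sgn = (U[i] - x[i]) / s[i], -1
            else:
                continue
            if ratio < alpha:
                alpha, block = ratio, (i, sgn)
        x = x + alpha * s
        if block is not None:                  # alpha < 1: add blocking bound to A
            active[block[0]] = block[1]
    return x

# Example: Q = 2I, c = (-2, -5), bounds [0, 2]^2; the unconstrained
# minimizer (1, 2.5) is infeasible, and the QP solution is (1, 2).
xs = asm_box_qp(np.array([[2.0, 0.0], [0.0, 2.0]]),
                np.array([-2.0, -5.0]),
                np.zeros(2), np.array([2.0, 2.0]),
                np.zeros(2), {0: +1, 1: +1})
```

Starting from the vertex $(0,0)$ with both lower bounds active, the sketch drops bounds with negative multipliers, takes restricted steps, and adds the blocking upper bound $x_2\le2$, mirroring steps 2–7 of the outline.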
In essence, the algorithm solves the equality problem repeatedly,
allowing constraints to become inactive if doing so will improve the
objective (deducing optimality if none will do so) and forcing
constraints to be active if they would be violated at the minimizer of
the equality problem. It can be seen as a generalisation of the
primal simplex method, where nonbasic variables are allowed to move
from bounds at vertices to improve the objective, to be replaced by
basic variables which reach bounds. Since the primal simplex method is
always at a vertex, the null space is trivial, and since there is no
local minimizer along an edge of the polytope, a limiting basic
variable is always found (for an LP that is not unbounded).

### Practicalities

The algorithm assumes that $x^{(1)}$ is feasible. Such a point can be
found by using the simplex algorithm to solve the LP feasibility
problem. This yields a vertex solution, so the initial null space is
empty. The simplex nonbasic variables yield $\cal{A}$, for which the
matrix $B_{\cal{A}}$ is nonsingular since it has the same kernel as
the simplex basis matrix.