
Commit c9709db: Update docs and examples
Parent: 3b2309d

35 files changed, +989 -1463 lines changed
Binary files changed (contents not shown):
  (filename not captured), 507 Bytes
  (filename not captured), 741 Bytes
  docs/build/doctrees/index.doctree, 304 Bytes
  docs/build/doctrees/info.doctree, 842 Bytes
  (filename not captured), 6.08 KB

docs/build/html/_sources/history.rst.txt

Lines changed: 1 addition & 0 deletions
@@ -49,3 +49,4 @@ Version 1.2.3 (1 Jun 2021)
 Version 1.3.0 (16 Oct 2021)
 ---------------------------
 * Handle finitely many arbitrary convex constraints in addition to simple bound constraints.
+* Only new functionality is added, so there is no change to the solver for unconstrained/bound-constrained problems.

docs/build/html/_sources/index.rst.txt

Lines changed: 4 additions & 3 deletions
@@ -11,16 +11,17 @@ DFO-LS: Derivative-Free Optimizer for Least-Squares Minimization
 
 **Author:** `Lindon Roberts <[email protected]>`_
 
-DFO-LS is a flexible package for finding local solutions to nonlinear least-squares minimization problems (with optional convex constraints), without requiring any derivatives of the objective. DFO-LS stands for Derivative-Free Optimizer for Least-Squares.
+DFO-LS is a flexible package for finding local solutions to nonlinear least-squares minimization problems (with optional constraints), without requiring any derivatives of the objective. DFO-LS stands for Derivative-Free Optimizer for Least-Squares.
 
 That is, DFO-LS solves
 
 .. math::
 
    \min_{x\in\mathbb{R}^n} &\quad f(x) := \sum_{i=1}^{m}r_{i}(x)^2 \\
-   \text{s.t.} &\quad x \in C
+   \text{s.t.} &\quad x \in C\\
+   &\quad a \leq x \leq b
 
-The constraint set :math:`C` is assumed to be non-empty, closed and convex. Moreover, the constraints are non-relaxable (i.e. DFO-LS will never ask to evaluate a point that is not feasible).
+The constraint set :math:`C` is the intersection of multiple convex sets provided as input by the user. All constraints are non-relaxable (i.e. DFO-LS will never ask to evaluate a point that is not feasible).
 
 Full details of the DFO-LS algorithm are given in our paper: C. Cartis, J. Fiala, B. Marteau and L. Roberts, `Improving the Flexibility and Robustness of Model-Based Derivative-Free Optimization Solvers <https://doi.org/10.1145/3338517>`_, *ACM Transactions on Mathematical Software*, 45:3 (2019), pp. 32:1-32:41 [`preprint <https://arxiv.org/abs/1804.00154>`_]. DFO-LS is a more flexible version of `DFO-GN <https://github.com/numericalalgorithmsgroup/dfogn>`_.
 
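To make the new description concrete (an editorial restatement, not text from the commit): the feasible region combines the user-supplied convex sets with the bound constraints, i.e.

.. math::

   C = C_1 \cap C_2 \cap \cdots \cap C_k,

where each :math:`C_i` is a non-empty, closed and convex set supplied by the user through its Euclidean projection (as shown in the user guide changes below), and the intersection of all the :math:`C_i` with the box :math:`a \leq x \leq b` is assumed to be non-empty.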

docs/build/html/_sources/info.rst.txt

Lines changed: 3 additions & 2 deletions
@@ -8,10 +8,11 @@ DFO-LS is designed to solve the nonlinear least-squares minimization problem (wi
 .. math::
 
    \min_{x\in\mathbb{R}^n} &\quad f(x) := \sum_{i=1}^{m}r_{i}(x)^2 \\
-   \text{s.t.} &\quad x \in C
+   \text{s.t.} &\quad x \in C\\
+   &\quad a \leq x \leq b
 
 We call :math:`f(x)` the objective function and :math:`r_i(x)` the residual functions (or simply residuals).
-Here :math:`C` is a non-empty and closed convex set.
+:math:`C` is the intersection of multiple convex sets given as input by the user.
 
 DFO-LS is a *derivative-free* optimization algorithm, which means it does not require the user to provide the derivatives of :math:`f(x)` or :math:`r_i(x)`, nor does it attempt to estimate them internally (by using finite differencing, for instance).
 
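Each convex set is supplied to the solver through its Euclidean projection operator, as the user guide example further down illustrates with a ball and a box. As a hedged sketch (the halfspace set and the names phalfspace, a_vec, b_val are illustrative and not part of this commit; only the projections-list interface shown in the user guide is), any other closed convex set with a computable projection can be handled the same way:

.. code-block:: python

    import numpy as np

    # Illustrative halfspace {x : a_vec @ x <= b_val}; these values are made up.
    a_vec = np.array([1.0, 2.0])
    b_val = 1.0

    def phalfspace(x):
        # Closed-form Euclidean projection onto the halfspace
        violation = a_vec @ x - b_val
        if violation <= 0.0:
            return x  # already inside the set
        return x - (violation / (a_vec @ a_vec)) * a_vec

    # A function like this could be passed alongside others in the projections
    # list of dfols.solve, e.g. projections=[phalfspace, ...].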

docs/build/html/_sources/userguide.rst.txt

Lines changed: 90 additions & 36 deletions
@@ -144,7 +144,7 @@ Note that DFO-LS is a randomized algorithm: in its first phase, it builds an int
 
 This and all following problems can be found in the `examples <https://github.com/numericalalgorithmsgroup/dfols/tree/master/examples>`_ directory on the DFO-LS Github page.
 
-Adding Constraints and More Output
+Adding Bounds and More Output
 -----------------------------
 We can extend the above script to add constraints. To add bound constraints alone, we can add the lines
 
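The lines referred to here are unchanged context beyond this hunk, so they do not appear in the diff. As a rough sketch only (the bound values are illustrative assumptions; the bounds=(lower, upper) tuple form is the bound-constraint input mentioned later in this file), such lines typically look like:

.. code-block:: python

    import numpy as np
    import dfols

    # rosenbrock and x0 as defined earlier in the user guide example
    def rosenbrock(x):
        return np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])

    x0 = np.array([-1.2, 1.0])

    # Illustrative bound values (the documented example may use different numbers);
    # an upper bound of 0.85 on x[1] would be consistent with x0 being moved to
    # (-1.2, 0.85), as described in the next hunk.
    lower = np.array([-10.0, -10.0])
    upper = np.array([0.9, 0.85])

    # Bound constraints are passed as a (lower, upper) tuple
    soln = dfols.solve(rosenbrock, x0, bounds=(lower, upper))
    print(soln)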
@@ -181,41 +181,6 @@ However, we also get a warning that our starting point was outside of the bounds
 
 DFO-LS automatically fixes this, and moves :math:`x_0` to a point within the bounds, in this case :math:`x_0=(-1.2,0.85)`.
 
-If we want more complex constraints, we can instead write something like the following:
-
-.. code-block:: python
-
-    # Define the projection functions
-    def pball(x):
-        c = np.array([0.7,1.5]) # ball centre
-        r = 0.4 # ball radius
-        return c + (r/np.max([np.linalg.norm(x-c),r]))*(x-c)
-
-    def pbox(x):
-        l = np.array([-2, 1.1]) # lower bound
-        u = np.array([0.9, 3]) # upper bound
-        return np.minimum(np.maximum(x,l), u)
-
-    # Call DFO-LS (with box and ball constraints)
-    soln = dfols.solve(rosenbrock, x0, projections=[pball,pbox])
-
-DFO-LS correctly finds the solution to this constrained problem too. Note that we get a warning because the step computed in the trust region subproblem
-gave an increase in the model. This is common in the case where multiple constraints are active at the optimal point.
-
-.. code-block:: none
-
-    ****** DFO-LS Results ******
-    Solution xmin = [0.9 1.15359245]
-    Residual vector = [3.43592448 0.1 ]
-    Objective value f(xmin) = 11.81557703
-    Needed 10 objective evaluations (at 10 points)
-    Approximate Jacobian = [[-1.79826221e+01  1.00004412e+01]
-                            [-1.00000000e+00 -1.81976605e-15]]
-    Exit flag = 5
-    Warning (trust region increase): Either multiple constraints are active or trust region step gave model increase
-    ****************************
-
-
 We can also get DFO-LS to print out more detailed information about its progress using the `logging <https://docs.python.org/3/library/logging.html>`_ module. To do this, we need to add the following lines:
 
 .. code-block:: python
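The body of this code block is unchanged context beyond the hunk, so it is not shown in the diff. For reference, the same two logging lines appear verbatim inside the example added in the next hunk (the DEBUG level is taken from that example and is an assumption about what this block contains):

.. code-block:: python

    import logging
    logging.basicConfig(level=logging.DEBUG, format='%(message)s')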
@@ -259,6 +224,95 @@ An alternative option available is to get DFO-LS to print to terminal progress i
     1    55   1.00e-02   2.00e-01   1.50e-08   1.00e-08    56
     1    56   1.00e-02   2.00e-01   1.50e-08   1.00e-08    57
 
+Handling Arbitrary Convex Constraints
+-----------------------------
+DFO-LS can also handle more general constraints where they can be written as the intersection of finitely many convex sets. For example, the below code
+minimizes the Rosenbrock function subject to a constraint set given by the intersection of two convex sets. Note the intersection of the user-provided convex
+sets must be non-empty.
+
+.. code-block:: python
+
+    '''
+    DFO-LS example: minimize the Rosenbrock function with arbitrary convex constraints
+
+    This example defines two functions pball(x) and pbox(x) that project onto ball and
+    box constraint sets respectively. It then passes both these functions to the DFO-LS
+    solver so that it can find a constrained minimizer to the Rosenbrock function.
+    Such a minimizer must lie in the intersection of constraint sets corresponding to
+    projection functions pball(x) and pbox(x). The description of the problem is as follows:
+
+        min rosenbrock(x)
+        s.t.
+            -2 <= x[0] <= 1.1,
+            1.1 <= x[1] <= 3,
+            norm(x-c) <= 0.4
+
+    where c = [0.7, 1.5] is the centre of the ball.
+    '''
+    from __future__ import print_function
+    import numpy as np
+    import dfols
+
+    # Define the objective function
+    def rosenbrock(x):
+        return np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])
+
+    # Define the starting point
+    x0 = np.array([-1.2, 1])
+
+    '''
+    Define ball projection function
+    Projects the input x onto a ball with
+    centre point (0.7,1.5) and radius 0.4.
+    '''
+    def pball(x):
+        c = np.array([0.7,1.5]) # ball centre
+        r = 0.4 # ball radius
+        return c + (r/np.max([np.linalg.norm(x-c),r]))*(x-c)
+
+    '''
+    Define box projection function
+    Projects the input x onto a box
+    such that -2 <= x[0] <= 0.9 and
+    1.1 <= x[1] <= 3.
+
+    Note: One could equivalently add bound
+    constraints as a separate input to the solver
+    instead.
+    '''
+    def pbox(x):
+        l = np.array([-2, 1.1]) # lower bound
+        u = np.array([0.9, 3]) # upper bound
+        return np.minimum(np.maximum(x,l), u)
+
+    # For optional extra output details
+    import logging
+    logging.basicConfig(level=logging.DEBUG, format='%(message)s')
+
+    # Call DFO-LS
+    soln = dfols.solve(rosenbrock, x0, projections=[pball,pbox])
+
+    # Display output
+    print(soln)
+
+Note that for bound constraints one can choose to either implement them by defining a projection function as above, or by passing the bounds as input like in the example from the section on adding bound constraints.
+
+DFO-LS correctly finds the solution to this constrained problem too. Note that we get a warning because the step computed in the trust region subproblem
+gave an increase in the model. This is common in the case where multiple constraints are active at the optimal point.
+
+.. code-block:: none
+
+    ****** DFO-LS Results ******
+    Solution xmin = [0.9 1.15359245]
+    Residual vector = [3.43592448 0.1 ]
+    Objective value f(xmin) = 11.81557703
+    Needed 10 objective evaluations (at 10 points)
+    Approximate Jacobian = [[-1.79826221e+01  1.00004412e+01]
+                            [-1.00000000e+00 -1.81976605e-15]]
+    Exit flag = 5
+    Warning (trust region increase): Either multiple constraints are active or trust region step gave model increase
+    ****************************
+
 Example: Noisy Objective Evaluation
 -----------------------------------
 As described in :doc:`info`, derivative-free algorithms such as DFO-LS are particularly useful when :code:`objfun` has noise. Let's modify the previous example to include random noise in our objective evaluation, and compare it to a derivative-based solver:
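Stepping back to the constrained example added above (a small editorial check, not part of the commit; the value of xmin is copied from the example output): the reported solution sits on the boundary of both the box and the ball, which is why the solver warns about multiple active constraints. Feasibility of any point can be checked by verifying that it is a fixed point of each projection:

.. code-block:: python

    import numpy as np

    def pball(x):
        c = np.array([0.7, 1.5])  # ball centre
        r = 0.4                   # ball radius
        return c + (r / np.max([np.linalg.norm(x - c), r])) * (x - c)

    def pbox(x):
        l = np.array([-2, 1.1])   # lower bound
        u = np.array([0.9, 3])    # upper bound
        return np.minimum(np.maximum(x, l), u)

    xmin = np.array([0.9, 1.15359245])  # solution reported in the example output above

    # A point is feasible for a convex set exactly when its projection leaves it unchanged
    print(np.allclose(pball(xmin), xmin))  # True: xmin is (numerically) on the ball boundary
    print(np.allclose(pbox(xmin), xmin))   # True: xmin is in the box, on the x[0] = 0.9 face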

docs/build/html/_static/pygments.css

Lines changed: 3 additions & 3 deletions
Original file line numberDiff line numberDiff line change
@@ -1,7 +1,7 @@
11
pre { line-height: 125%; }
2-
td.linenos pre { color: #000000; background-color: #f0f0f0; padding-left: 5px; padding-right: 5px; }
3-
span.linenos { color: #000000; background-color: #f0f0f0; padding-left: 5px; padding-right: 5px; }
4-
td.linenos pre.special { color: #000000; background-color: #ffffc0; padding-left: 5px; padding-right: 5px; }
2+
td.linenos .normal { color: inherit; background-color: transparent; padding-left: 5px; padding-right: 5px; }
3+
span.linenos { color: inherit; background-color: transparent; padding-left: 5px; padding-right: 5px; }
4+
td.linenos .special { color: #000000; background-color: #ffffc0; padding-left: 5px; padding-right: 5px; }
55
span.linenos.special { color: #000000; background-color: #ffffc0; padding-left: 5px; padding-right: 5px; }
66
.highlight .hll { background-color: #ffffcc }
77
.highlight { background: #eeffcc; }
