Commit 455c521

author: brendan
commit message: some updates to contraint notes

1 parent 419fc41 · commit 455c521

File tree

1 file changed: +26 additions, -13 deletions

examples/constraints.ipynb

Lines changed: 26 additions & 13 deletions
@@ -6,22 +6,22 @@
    "source": [
     "# Constrained Optimization\n",
     "\n",
-    "Constrained optimization refers to situations in which you must for instance maximize `f`, a function of `x` and `y`, but the solution must lie in a region where for instance `x<y`.\n",
+    "Constrained optimization refers to situations in which you must, for instance, maximize \"f\", a function of \"x\" and \"y\", but the solution must lie in a region where, for instance, \"x<y\".\n",
     "\n",
     "There are two distinct situations you may find yourself in:\n",
     "\n",
-    "1. Simple, cheap contraints: in this case, you know whether or not a given solution violates your constraints **before** you even assess it. In this case, you can codify your constraints directly into the objective function and you should read **1** below.\n",
-    "2. Expensive contraints - in other situations, you may not know whether or not a given solution violates your constraints until you have explicitly evaluate the objective function there - which is typically an expensive operation. In such situations, it is desirable to **learn** the constrained regions on the fly in order to avoid unnecessary expensive calls to the objective function. The way to handle these situations is descrived in **2. Advanced Constrained Optimization**\n",
+    "1. Simple, cheap constraints: you know whether or not a given solution violates your constraints **before** you even assess it. In this case, you can codify the constraints directly into the objective function; read **1** below.\n",
+    "2. Expensive constraints: you may not know whether a given solution violates your constraints until you have explicitly evaluated the objective function there, which is typically an expensive operation. In such situations, it is desirable to **learn** the constrained regions on the fly in order to avoid unnecessary expensive calls to the objective function. The way to handle these situations is described in **2. Advanced Constrained Optimization**.\n",
     "\n",
     "\n",
     "# 1. Simple Constrained Optimization\n",
     "\n",
-    "In situations where you know in advance whether or not a given point violates your constraints, you can normally simply code them directly into the objective function. To demonstrate this, let's start with a standart non-contrained optimziation:"
+    "In situations where you know in advance whether or not a given point violates your constraints, you can normally code them directly into the objective function. To demonstrate this, let's start with a standard unconstrained optimization:"
    ]
   },
   {
    "cell_type": "code",
-   "execution_count": 14,
+   "execution_count": 1,
    "metadata": {},
    "outputs": [
     {
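The "simple, cheap constraints" idea in the markdown cell above can be sketched in plain Python: reject infeasible points inside the objective itself, so the optimizer is steered away from the constrained region. The quadratic objective below is a hypothetical placeholder, not the function used in the notebook.

```python
# Sketch: encode a cheap constraint (here y > x) directly into the
# objective. Infeasible points get a very poor score so the optimizer
# learns to avoid them. The objective itself is a placeholder.

def objective(x, y):
    # hypothetical unconstrained objective (not the notebook's)
    return -(x - 2.0) ** 2 - (y - 2.1) ** 2

def constrained_objective(x, y):
    # Constraint check happens *before* any expensive work, which is
    # exactly the cheap-constraint situation described above.
    if not y > x:
        return -1e6  # large penalty for infeasible points
    return objective(x, y)
```

A function of this shape can then be passed as `f` to the optimizer in place of the raw objective.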
@@ -65,14 +65,14 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 15,
+   "execution_count": 3,
    "metadata": {},
    "outputs": [
     {
      "name": "stdout",
      "output_type": "stream",
      "text": [
-      "the best solution with no constraints is {'target': -4.226389709162731, 'params': {'x': 2.0042130207949307, 'y': 2.099781740364328}}\n"
+      "the best solution with y>x is {'target': -4.226389709162731, 'params': {'x': 2.0042130207949307, 'y': 2.099781740364328}}\n"
     ]
    }
   ],
@@ -95,7 +95,7 @@
     "    n_iter=100\n",
     ")\n",
     "\n",
-    "print(f'the best solution with no constraints is {optimizer.max}')"
+    "print(f'the best solution with y>x is {optimizer.max}')"
    ]
   },
   {
@@ -109,14 +109,19 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 12,
+   "execution_count": 7,
    "metadata": {},
    "outputs": [
     {
      "name": "stdout",
      "output_type": "stream",
      "text": [
-      "the best solution with no constraints is {'target': -4.0, 'params': {'x': 2.0}}\n"
+      "\u001b[91mData point [2.] is not unique. 1 duplicates registered. Continuing ...\u001b[0m\n",
+      "\u001b[91mData point [2.] is not unique. 2 duplicates registered. Continuing ...\u001b[0m\n",
+      "\u001b[91mData point [2.] is not unique. 3 duplicates registered. Continuing ...\u001b[0m\n",
+      "\u001b[91mData point [2.] is not unique. 4 duplicates registered. Continuing ...\u001b[0m\n",
+      "\u001b[91mData point [2.] is not unique. 5 duplicates registered. Continuing ...\u001b[0m\n",
+      "the best solution with y=4-x is {'target': -4.0, 'params': {'x': 2.0}}\n"
     ]
    }
   ],
@@ -133,15 +138,23 @@
     "    f=surrogate_objective,\n",
     "    pbounds=pbounds,\n",
     "    random_state=0,\n",
-    "    verbose=0\n",
+    "    verbose=0,\n",
+    "    allow_duplicate_points=True\n",
     ")\n",
     "\n",
     "optimizer.maximize(\n",
     "    init_points=2,\n",
-    "    n_iter=100\n",
+    "    n_iter=10\n",
     ")\n",
     "\n",
-    "print(f'the best solution with no constraints is {optimizer.max}')"
+    "print(f'the best solution with y=4-x is {optimizer.max}')"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "> Note: in this last example, we have set `allow_duplicate_points=True`. We get duplicate points here probably because the space is now so constrained that the optimizer quickly homes in on a single 'interesting' point and repeatedly probes it. The default, `allow_duplicate_points=False`, raises an error when a duplicate is registered."
    ]
   },
   {
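The final hunk's "y=4-x" example appears to reduce a two-variable problem with an equality constraint to a one-variable search via substitution. A minimal sketch of that pattern, using a hypothetical placeholder objective rather than the notebook's actual function:

```python
# Sketch: handle the equality constraint y = 4 - x by substituting it
# into the objective, so the optimizer searches over x alone. The
# two-variable objective is a placeholder, not the notebook's.

def objective(x, y):
    # hypothetical two-variable objective
    return -(x - 2.0) ** 2 - (y - 2.0) ** 2

def surrogate_objective(x):
    # y = 4 - x holds by construction, so every probed point is feasible
    return objective(x, 4.0 - x)
```

With this shape, `pbounds` only needs a range for `x`, matching the single-parameter result (`'params': {'x': 2.0}`) shown in the diff output.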
