
Commit 9607c7f

Update README
This commit adds a disclaimer to the README telling people about API-breaking changes. It also updates the Quick Start list with new files and adds a good chunk of the basic tour to the README.
1 parent 39d1d42 commit 9607c7f

File tree: 1 file changed, +217 −11 lines

README.md

Lines changed: 217 additions & 11 deletions
@@ -15,22 +15,27 @@ function in as few iterations as possible. This technique is particularly
suited for optimization of high cost functions, situations where the balance
between exploration and exploitation is important.

+## Important notice
+With the release of version 1.0.0 a number of breaking API changes were introduced. I understand this can be a headache for some, but these were necessary changes that needed to be done and ultimately made the package better. If you have used this package in the past I suggest you take the basic and advanced tours (found in the examples folder) in order to familiarize yourself with the new API.
+
## Quick Start
-In the [examples](https://github.com/fmfn/BayesianOptimization/tree/master/examples)
-folder you can get a grip of how the method and this package work by:
-- Checking out this
+See below for a quick tour over the basics of the Bayesian Optimization package. More detailed information, other advanced features, and tips on usage/implementation can be found in the [examples](https://github.com/fmfn/BayesianOptimization/tree/master/examples) folder. I suggest that you:
+- Follow the
+  [basic tour notebook](https://github.com/fmfn/BayesianOptimization/blob/master/examples/basic-tour.ipynb)
+  to learn how to use the package's most important features.
+- Take a look at the
+  [advanced tour notebook](https://github.com/fmfn/BayesianOptimization/blob/master/examples/advanced-tour.ipynb)
+  to learn how to make the package more flexible, how to deal with categorical parameters, how to use observers, and more.
+- Check out this
  [notebook](https://github.com/fmfn/BayesianOptimization/blob/master/examples/visualization.ipynb)
  with a step by step visualization of how this method works.
-- Going over this
-  [script](https://github.com/fmfn/BayesianOptimization/blob/master/examples/usage.py)
-  to become familiar with this package's basic functionalities.
-- Exploring this [notebook](https://github.com/fmfn/BayesianOptimization/blob/master/examples/exploitation%20vs%20exploration.ipynb)
+- Explore this [notebook](https://github.com/fmfn/BayesianOptimization/blob/master/examples/exploitation%20vs%20exploration.ipynb)
  exemplifying the balance between exploration and exploitation and how to
  control it.
-- Checking out these scripts ([sklearn](https://github.com/fmfn/BayesianOptimization/blob/master/examples/sklearn_example.py),
-  [xgboost](https://github.com/fmfn/BayesianOptimization/blob/master/examples/xgboost_example.py))
-  for examples of how to use this package to tune parameters of ML estimators
-  using cross validation and bayesian optimization.
+- Go over this [script](https://github.com/fmfn/BayesianOptimization/blob/master/examples/sklearn_example.py)
+  for examples of how to tune parameters of Machine Learning models using cross validation and bayesian optimization.
+- Finally, take a look at this [script](https://github.com/fmfn/BayesianOptimization/blob/master/examples/async_optimization.py)
+  for ideas on how to implement bayesian optimization in a distributed fashion using this package.

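As a taste of what the sklearn script linked above does, here is a minimal sketch of tuning a model with cross validation and bayesian optimization. It is illustrative only and not taken from that script: the toy dataset, the `svc_cv` wrapper, and the log-scaled bounds are all assumptions, and scikit-learn must be installed.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

from bayes_opt import BayesianOptimization

# Toy data standing in for a real problem.
data, targets = make_classification(n_samples=1000, n_features=20, random_state=1)

def svc_cv(log_C, log_gamma):
    """Cross-validated accuracy of an SVC; the bounds are searched on a log scale."""
    model = SVC(C=10 ** log_C, gamma=10 ** log_gamma)
    return cross_val_score(model, data, targets, cv=3).mean()

optimizer = BayesianOptimization(
    f=svc_cv,
    pbounds={"log_C": (-3, 2), "log_gamma": (-4, -1)},
    random_state=1,
)
optimizer.maximize(init_points=3, n_iter=10)
print(optimizer.max)
```

The only requirement is the same one stressed throughout the tour: a function that takes the parameters being tuned and returns a single score to maximize.
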
## How does it work?
@@ -48,6 +53,207 @@ This process is designed to minimize the number of steps required to find a comb
This project is under active development, if you find a bug, or anything that
needs correction, please let me know.

Basic tour of the Bayesian Optimization package
===============================================

## 1. Specifying the function to be optimized

This is a function optimization package, therefore the first and most important ingredient is, of course, the function to be optimized.

**DISCLAIMER:** We know exactly how the output of the function below depends on its parameters. Obviously this is just an example, and you shouldn't expect to know it in a real scenario. However, it should be clear that you don't need to. All you need in order to use this package (and more generally, this technique) is a function `f` that takes a known set of parameters and outputs a real number.

```python
def black_box_function(x, y):
    """Function with unknown internals we wish to maximize.

    This is just serving as an example, for all intents and
    purposes think of the internals of this function, i.e.: the process
    which generates its output values, as unknown.
    """
    return -x ** 2 - (y - 1) ** 2 + 1
```

## 2. Getting Started

All we need to get started is to instantiate a `BayesianOptimization` object specifying a function to be optimized `f`, and its parameters with their corresponding bounds, `pbounds`. This is a constrained optimization technique, so you must specify the minimum and maximum values that can be probed for each parameter in order for it to work.

```python
from bayes_opt import BayesianOptimization

# Bounded region of parameter space
pbounds = {'x': (2, 4), 'y': (-3, 3)}

optimizer = BayesianOptimization(
    f=black_box_function,
    pbounds=pbounds,
    random_state=1,
)
```

The BayesianOptimization object will work out of the box without much tuning needed. The main method you should be aware of is `maximize`, which does exactly what you think it does.

There are many parameters you can pass to `maximize`; nonetheless, the most important ones are:
- `n_iter`: How many steps of bayesian optimization you want to perform. The more steps, the more likely you are to find a good maximum.
- `init_points`: How many steps of **random** exploration you want to perform. Random exploration can help by diversifying the exploration space.

```python
optimizer.maximize(
    init_points=2,
    n_iter=3,
)
```

|   iter    |  target   |     x     |     y     |
-------------------------------------------------
|  1        | -7.135    |  2.834    |  1.322    |
|  2        | -7.78     |  2.0      | -1.186    |
|  3        | -19.0     |  4.0      |  3.0      |
|  4        | -16.3     |  2.378    | -2.413    |
|  5        | -4.441    |  2.105    | -0.005822 |
=================================================

The best combination of parameters and target value found can be accessed via the property `optimizer.max`.

```python
print(optimizer.max)
>>> {'target': -4.441293113411222, 'params': {'y': -0.005822117636089974, 'x': 2.104665051994087}}
```

The list of all parameters probed and their corresponding target values is available via the property `optimizer.res`.

```python
for i, res in enumerate(optimizer.res):
    print("Iteration {}: \n\t{}".format(i, res))

>>> Iteration 0:
>>> {'target': -7.135455292718879, 'params': {'y': 1.3219469606529488, 'x': 2.8340440094051482}}
>>> Iteration 1:
>>> {'target': -7.779531005607566, 'params': {'y': -1.1860045642089614, 'x': 2.0002287496346898}}
>>> Iteration 2:
>>> {'target': -19.0, 'params': {'y': 3.0, 'x': 4.0}}
>>> Iteration 3:
>>> {'target': -16.29839645063864, 'params': {'y': -2.412527795983739, 'x': 2.3776144540856503}}
>>> Iteration 4:
>>> {'target': -4.441293113411222, 'params': {'y': -0.005822117636089974, 'x': 2.104665051994087}}
```

### 2.1 Changing bounds

During the optimization process you may realize the bounds chosen for some parameters are not adequate. For these situations you can invoke the method `set_bounds` to alter them. You can pass any combination of **existing** parameters and their associated new bounds.

```python
optimizer.set_bounds(new_bounds={"x": (-2, 3)})

optimizer.maximize(
    init_points=0,
    n_iter=5,
)
```

|   iter    |  target   |     x     |     y     |
-------------------------------------------------
|  6        | -5.145    |  2.115    | -0.2924   |
|  7        | -5.379    |  2.337    |  0.04124  |
|  8        | -3.581    |  1.874    | -0.03428  |
|  9        | -2.624    |  1.702    |  0.1472   |
|  10       | -1.762    |  1.442    |  0.1735   |
=================================================

## 3. Guiding the optimization

It is often the case that we have an idea of regions of the parameter space where the maximum of our function might lie. For these situations the `BayesianOptimization` object allows the user to specify points to be probed. By default these will be explored lazily (`lazy=True`), meaning these points will be evaluated only the next time you call `maximize`. This probing process happens before the Gaussian process takes over.

Parameters can be passed as dictionaries or as an iterable.

```python
optimizer.probe(
    params={"x": 0.5, "y": 0.7},
    lazy=True,
)

optimizer.probe(
    params=[-0.3, 0.1],
    lazy=True,
)

# Will probe only the two points specified above
optimizer.maximize(init_points=0, n_iter=0)
```

|   iter    |  target   |     x     |     y     |
-------------------------------------------------
|  11       |  0.66     |  0.5      |  0.7      |
|  12       |  0.1      | -0.3      |  0.1      |
=================================================

## 4. Saving, loading and restarting

By default you can follow the progress of your optimization by setting `verbose>0` when instantiating the `BayesianOptimization` object. If you need more control over logging/alerting you will need to use an observer. For more information about observers check out the advanced tour notebook. Here we will only see how to use the native `JSONLogger` object to save to and load progress from files.

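For instance, a minimal sketch of the `verbose` option. My reading of the basic tour notebook is that `verbose=2` prints every probed point, `verbose=1` prints only when a new maximum is found, and `verbose=0` is silent; treat those semantics, and the `chatty_optimizer` name, as illustrative assumptions.

```python
# Assumed semantics: verbose=2 prints every probed point, verbose=1 only new maxima,
# verbose=0 stays silent.
chatty_optimizer = BayesianOptimization(
    f=black_box_function,
    pbounds={'x': (2, 4), 'y': (-3, 3)},
    verbose=2,
    random_state=1,
)
chatty_optimizer.maximize(init_points=2, n_iter=3)
```
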
### 4.1 Saving progress

```python
from bayes_opt.observer import JSONLogger
from bayes_opt.event import Events
```

The observer paradigm works by:
1. Instantiating an observer object.
2. Tying the observer object to a particular event fired by an optimizer.

The `BayesianOptimization` object fires a number of internal events during optimization, in particular, every time it probes the function and obtains a new parameter-target combination it will fire an `Events.OPTMIZATION_STEP` event, which our logger will listen to.

**Caveat:** The logger will not look back at previously probed points.

```python
logger = JSONLogger(path="./logs.json")
optimizer.subscribe(Events.OPTMIZATION_STEP, logger)

# Results will be saved in ./logs.json
optimizer.maximize(
    init_points=2,
    n_iter=3,
)
```

### 4.2 Loading progress

Naturally, if you stored progress you will be able to load that onto a new instance of `BayesianOptimization`. The easiest way to do it is by invoking the `load_logs` function from the `util` submodule.

```python
from bayes_opt.util import load_logs


new_optimizer = BayesianOptimization(
    f=black_box_function,
    pbounds={"x": (-2, 2), "y": (-2, 2)},
    verbose=2,
    random_state=7,
)

# New optimizer is loaded with previously seen points
load_logs(new_optimizer, logs=["./logs.json"]);
```

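Assuming the log file from section 4.1 exists, the loaded points should let the new optimizer pick up roughly where the previous one stopped. Below is a small sketch of that continuation; the `len(new_optimizer.space)` check mirrors how the basic tour counts previously seen points, to the best of my recollection.

```python
# The loaded observations seed the surrogate model, so there is no need to
# re-probe the logged points before continuing the optimization.
print("New optimizer is now aware of {} points.".format(len(new_optimizer.space)))

new_optimizer.maximize(
    init_points=0,
    n_iter=10,
)
print(new_optimizer.max)
```
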
## Next Steps

This introduction covered the most basic functionality of the package. Check out the `basic-tour` and `advanced-tour` notebooks in the examples folder, where you will find more detailed explanations and other, more advanced functionality. Also, browse the examples folder for implementation tips and ideas.

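As a preview of that advanced material: to the best of my understanding, the async/distributed example mentioned in the Quick Start replaces `maximize` with an explicit suggest-evaluate-register loop, so each evaluation can happen wherever you like. The sketch below assumes the `UtilityFunction` helper behaves as described in the advanced tour; the acquisition settings are arbitrary.

```python
from bayes_opt import BayesianOptimization, UtilityFunction

# No function is passed; we evaluate black_box_function ourselves and feed results back.
optimizer = BayesianOptimization(
    f=None,
    pbounds={'x': (2, 4), 'y': (-3, 3)},
    random_state=1,
)
utility = UtilityFunction(kind="ucb", kappa=2.5, xi=0.0)

for _ in range(5):
    next_point = optimizer.suggest(utility)                # ask where to probe next
    target = black_box_function(**next_point)              # evaluate (possibly on a worker)
    optimizer.register(params=next_point, target=target)   # report the observation back

print(optimizer.max)
```
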
Installation
============

0 commit comments
