Commit 52a2be5: "improves readme" (parent d795787)

File tree: 1 file changed, +9 -1 lines

README.md

Lines changed: 9 additions & 1 deletion
@@ -11,6 +11,7 @@ function in as few iterations as possible. This technique is particularly
 suited for optimization of high-cost functions, situations where the balance
 between exploration and exploitation is important.
 
+## Quick Start
 To get a grip on how this method and package work, in the [examples](https://github.com/fmfn/BayesianOptimization/tree/master/examples)
 folder you can:
 - Check out this [notebook](https://github.com/fmfn/BayesianOptimization/blob/master/examples/visualization.ipynb)
@@ -23,13 +24,20 @@ control it.
 - Check out these scripts ([sklearn](https://github.com/fmfn/BayesianOptimization/blob/master/examples/sklearn_example.py),
 [xgboost](https://github.com/fmfn/BayesianOptimization/blob/master/examples/xgboost_example.py))
 for examples of how to use this package to tune parameters of ML estimators
-using cross validation and bayesian optimization
+using cross-validation and Bayesian optimization.
 
 
+## How does it work?
+
+Bayesian optimization works by constructing a posterior distribution over functions (a Gaussian process) that best describes the function you want to optimize. As the number of observations grows, the posterior distribution improves, and the algorithm becomes more certain about which regions of parameter space are worth exploring and which are not, as seen in the picture below.
+
 ![BayesianOptimization in action](https://github.com/fmfn/BayesianOptimization/blob/master/examples/bo_example.png)
 
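To make the "certain vs. worth exploring" idea concrete, here is a toy pure-Python sketch of a Gaussian-process posterior. It is an illustration only, not the package's internals; the RBF kernel, length scale, and sample values are arbitrary choices for the demo:

```python
import math

def rbf(a, b, length=1.0):
    # Squared-exponential kernel: correlation decays with distance.
    return math.exp(-0.5 * ((a - b) / length) ** 2)

def solve(K, y):
    # Gauss-Jordan elimination for the small linear system K x = y.
    n = len(y)
    A = [row[:] + [y[i]] for i, row in enumerate(K)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        for r in range(n):
            if r != c:
                f = A[r][c] / A[c][c]
                A[r] = [x - f * v for x, v in zip(A[r], A[c])]
    return [A[i][n] / A[i][i] for i in range(n)]

def gp_posterior(xs, ys, x):
    # Posterior mean and std at x, given noise-free observations (xs, ys).
    K = [[rbf(a, b) + (1e-9 if i == j else 0.0) for j, b in enumerate(xs)]
         for i, a in enumerate(xs)]
    k = [rbf(a, x) for a in xs]
    mean = sum(ki * wi for ki, wi in zip(k, solve(K, ys)))
    var = max(rbf(x, x) - sum(ki * vi for ki, vi in zip(k, solve(K, k))), 0.0)
    return mean, math.sqrt(var)

xs, ys = [0.0, 1.0, 2.0], [0.0, 0.8, 0.9]   # three observed samples
m_near, s_near = gp_posterior(xs, ys, 1.0)  # at an observed point
m_far, s_far = gp_posterior(xs, ys, 5.0)    # far from every observation
```

Near the data, the posterior interpolates the observations with almost no uncertainty (`s_near` is essentially zero and `m_near` recovers 0.8); far away it reverts to the prior (`s_far` is close to 1). That contrast is what tells the algorithm which regions are settled and which are still worth exploring.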
+As you iterate, the algorithm balances its need to explore and to exploit, taking into account what it knows about the target function. At each step a Gaussian process is fitted to the known samples (the points explored so far), and the posterior distribution, combined with an exploration strategy such as UCB (Upper Confidence Bound) or EI (Expected Improvement), is used to determine the next point to explore (see the gif below).
+
 ![BayesianOptimization in action](https://github.com/fmfn/BayesianOptimization/blob/master/examples/bayesian_optimization.gif)
 
+This process is designed to minimize the number of steps required to find a combination of parameters close to the optimum. To do so, it solves a proxy optimization problem (finding the maximum of the acquisition function) that, albeit still hard, is computationally cheaper, so common tools can be employed. Bayesian optimization is therefore best suited to situations where sampling the function to be optimized is very expensive. See the references for a proper discussion of this method.
 
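The loop described in the last few paragraphs (fit a Gaussian process to the samples, maximize the cheap acquisition function as a proxy problem, then pay for one real evaluation) can be put together in the same toy style. Everything here is an illustrative assumption, not the package's API: the target function, the grid of candidates, the UCB weight of 2.0, and the helper names are all invented for the sketch:

```python
import math

def rbf(a, b, length=1.0):
    # Squared-exponential kernel.
    return math.exp(-0.5 * ((a - b) / length) ** 2)

def solve(K, y):
    # Gauss-Jordan elimination for small systems K x = y.
    n = len(y)
    A = [row[:] + [y[i]] for i, row in enumerate(K)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        for r in range(n):
            if r != c:
                f = A[r][c] / A[c][c]
                A[r] = [x - f * v for x, v in zip(A[r], A[c])]
    return [A[i][n] / A[i][i] for i in range(n)]

def posterior(xs, ys, x):
    # GP posterior mean and std at x.
    K = [[rbf(a, b) + (1e-9 if i == j else 0.0) for j, b in enumerate(xs)]
         for i, a in enumerate(xs)]
    k = [rbf(a, x) for a in xs]
    mean = sum(ki * wi for ki, wi in zip(k, solve(K, ys)))
    var = max(rbf(x, x) - sum(ki * vi for ki, vi in zip(k, solve(K, k))), 0.0)
    return mean, math.sqrt(var)

def target(x):
    # Stand-in for an expensive black box; true maximum at x = 2.
    return -(x - 2.0) ** 2

grid = [i / 10 for i in range(61)]   # candidate points in [0, 6]
xs = [0.0, 6.0]                      # two initial samples
ys = [target(x) for x in xs]
for _ in range(6):                   # a handful of optimization steps
    def ucb(x):
        # Upper Confidence Bound: favor high mean or high uncertainty.
        m, s = posterior(xs, ys, x)
        return m + 2.0 * s
    # Proxy problem: maximize the cheap acquisition, not the target itself.
    x_next = max(grid, key=ucb)
    xs.append(x_next)                # only now pay for one real evaluation
    ys.append(target(x_next))

best_y, best_x = max(zip(ys, xs))
```

After a few iterations the sampled points cluster around the true maximum, even though the expensive target was only ever queried at the points the acquisition function proposed; that is the step-count saving the paragraph above describes.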
 This project is under active development; if you find a bug or anything that
 needs correction, please let me know.

0 commit comments
