Commit 7385102
minor typo fixes
1 parent a470d2d · commit 7385102
3 files changed, +15 -17 lines

README.md

Lines changed: 8 additions & 6 deletions

@@ -12,16 +12,18 @@ suited for optimization of high cost functions, situations where the balance
 between exploration and exploitation is important.
 
 ## Quick Start
-To get a grip of how this method and package works in the [examples](https://github.com/fmfn/BayesianOptimization/tree/master/examples)
-folder you can:
-- Checkout this [notebook](https://github.com/fmfn/BayesianOptimization/blob/master/examples/visualization.ipynb)
+In the [examples](https://github.com/fmfn/BayesianOptimization/tree/master/examples)
+folder you can get a grip of how the method and this package work by:
+- Checking out this
+  [notebook](https://github.com/fmfn/BayesianOptimization/blob/master/examples/visualization.ipynb)
   with a step by step visualization of how this method works.
-- Go over this [script](https://github.com/fmfn/BayesianOptimization/blob/master/examples/usage.py)
+- Going over this
+  [script](https://github.com/fmfn/BayesianOptimization/blob/master/examples/usage.py)
   to become familiar with this packages basic functionalities.
-- Explore this [notebook](https://github.com/fmfn/BayesianOptimization/blob/master/examples/exploitation%20vs%20exploration.ipynb)
+- Exploring this [notebook](https://github.com/fmfn/BayesianOptimization/blob/master/examples/exploitation%20vs%20exploration.ipynb)
   exemplifying the balance between exploration and exploitation and how to
   control it.
-- Checkout these scripts ([sklearn](https://github.com/fmfn/BayesianOptimization/blob/master/examples/sklearn_example.py),
+- Checking out these scripts ([sklearn](https://github.com/fmfn/BayesianOptimization/blob/master/examples/sklearn_example.py),
   [xgboost](https://github.com/fmfn/BayesianOptimization/blob/master/examples/xgboost_example.py))
   for examples of how to use this package to tune parameters of ML estimators
   using cross validation and bayesian optimization.
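
The last Quick Start item points at complete scripts for tuning ML estimator parameters with cross validation. As a rough illustration of that idea only (not the repository's sklearn_example.py itself), a minimal sketch is given below; the classifier, parameter bounds, and synthetic data are placeholders, and the maximize/res calls assume the pre-1.0 bayes_opt API used elsewhere in this commit.

```python
# Minimal sketch of tuning an estimator with cross validation and bayes_opt.
# The estimator, parameter bounds, and data are illustrative placeholders; the
# repository's sklearn_example.py is the maintained reference.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

from bayes_opt import BayesianOptimization

data, target = make_classification(n_samples=500, n_features=20, random_state=0)


def svc_cv(log_C, log_gamma):
    """Mean 3-fold cross-validated accuracy for log-scaled C and gamma."""
    clf = SVC(C=10 ** log_C, gamma=10 ** log_gamma)
    return cross_val_score(clf, data, target, cv=3).mean()


bo = BayesianOptimization(svc_cv, {'log_C': (-3, 2), 'log_gamma': (-4, -1)})
bo.maximize(init_points=5, n_iter=10)
print(bo.res['max'])
```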

bayes_opt/bayesian_optimization.py

Lines changed: 2 additions & 6 deletions

@@ -121,11 +121,9 @@ def init(self, init_points):
         self.initialized = True
 
     def explore(self, points_dict):
-        """
-        Method to explore user defined points
+        """Method to explore user defined points
 
         :param points_dict:
-        :return:
         """
 
         # Consistency check
@@ -150,11 +148,9 @@ def explore(self, points_dict):
 
     def initialize(self, points_dict):
         """
-        Method to introduce point for which the target function
-        value is known
+        Method to introduce points for which the target function value is known
 
         :param points_dict:
-        :return:
         """
 
         for points in zip(*(points_dict[k] for k in sorted(points_dict))):
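
Both methods touched here take a dictionary mapping parameter names to lists of equal length. A short, hedged sketch of how a caller typically uses them (the parameter names and values are illustrative and mirror the quadratic example in examples/usage.py below):

```python
from bayes_opt import BayesianOptimization

# Same toy objective as in examples/usage.py below.
bo = BayesianOptimization(lambda x, y: -x ** 2 - (y - 1) ** 2 + 1,
                          {'x': (-4, 4), 'y': (-3, 3)})

# explore(): user defined points the optimizer should probe; the target
# function is evaluated at each (x, y) pair.
bo.explore({'x': [-1.0, 3.0], 'y': [-2.0, 2.0]})

# initialize(): points whose target value is already known (even approximately),
# passed as a dict keyed by 'target' and the parameter names.
bo.initialize({'target': [1.0, -12.0], 'x': [0.0, 3.0], 'y': [1.0, -1.0]})

bo.maximize(init_points=5, n_iter=10)
```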

examples/usage.py

Lines changed: 5 additions & 5 deletions

@@ -1,12 +1,13 @@
+"""Example of how to use this bayesian optimization package."""
+
 import sys
 sys.path.append("./")
 from bayes_opt import BayesianOptimization
-# Example of how to use this bayesian optimization package.
 
 # Lets find the maximum of a simple quadratic function of two variables
 # We create the bayes_opt object and pass the function to be maximized
 # together with the parameters names and their bounds.
-bo = BayesianOptimization(lambda x, y: -x**2 - (y - 1)**2 + 1,
+bo = BayesianOptimization(lambda x, y: -x ** 2 - (y - 1) ** 2 + 1,
                           {'x': (-4, 4), 'y': (-3, 3)})
 
 # One of the things we can do with this object is pass points
@@ -18,8 +19,8 @@
 # Additionally, if we have any prior knowledge of the behaviour of
 # the target function (even if not totally accurate) we can also
 # tell that to the optimizer.
-# Here we pass a dictionary with target values as keys of another
-# dictionary with parameters names and their corresponding value.
+# Here we pass a dictionary with 'target' and parameter names as keys and a
+# list of corresponding values
 bo.initialize(
     {
         'target': [-1, -1],
@@ -36,7 +37,6 @@
 # The output values can be accessed with self.res
 print(bo.res['max'])
 
-
 # If we are not satisfied with the current results we can pickup from
 # where we left, maybe pass some more exploration points to the algorithm
 # change any parameters we may choose, and the let it run again.
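
The closing comments describe resuming an optimization. Assuming the same pre-1.0 API as in the script above, a hedged sketch of that continuation could look like this (the extra points and the kappa value are illustrative; kappa is the usual knob for leaning the UCB acquisition towards exploration):

```python
# Hypothetical continuation of the script above: feed a few more exploration
# points and run additional iterations.
bo.explore({'x': [0.5, -2.0], 'y': [0.5, 2.5]})

# A larger kappa favours exploration over exploitation; passing it here
# assumes the pre-1.0 maximize signature.
bo.maximize(n_iter=5, kappa=5)
print(bo.res['max'])
```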
