|
The main tools in this Python optimization toolbox are:

* **function `sambo.minimize()`**, a near drop-in replacement for [`scipy.optimize.minimize()`][sp_opt_min],
* **class `Optimizer`** with an ask-and-tell user interface,
  supporting arbitrary scikit-learn-like surrogate models,
  with Bayesian optimization estimators like [gaussian process] and [extra trees] built in,
* **`SamboSearchCV`**, a much faster drop-in replacement for
  scikit-learn's [`GridSearchCV`][skl_gridsearchcv] and similar exhaustive
  machine-learning hyper-parameter tuning methods,
  but compared to unpredictable stochastic methods, _informed_.
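The ask-and-tell interface mentioned above inverts control: instead of handing the optimizer a callback, you repeatedly *ask* it for a candidate point and *tell* it the observed objective value, keeping the evaluation loop in your own code. The sketch below illustrates the pattern with a toy random-search optimizer — the class and names here are illustrative only, not `sambo`'s actual API (a surrogate-model optimizer would propose points far more cleverly than this):

```python
import random


class ToyAskTellOptimizer:
    """Toy random-search optimizer illustrating the ask-and-tell pattern."""

    def __init__(self, bounds, seed=0):
        self.bounds = bounds              # [(low, high), ...] per dimension
        self.rng = random.Random(seed)
        self.best_x, self.best_y = None, float('inf')

    def ask(self):
        # Propose a candidate point. A Bayesian optimizer would instead
        # query its surrogate model for the most promising point.
        return [self.rng.uniform(lo, hi) for lo, hi in self.bounds]

    def tell(self, x, y):
        # Feed the observed objective value back into the optimizer's state.
        if y < self.best_y:
            self.best_x, self.best_y = x, y


def objective(x):
    return (x[0] - 1) ** 2 + (x[1] + 2) ** 2


opt = ToyAskTellOptimizer(bounds=[(-5, 5), (-5, 5)])
for _ in range(200):
    x = opt.ask()              # optimizer proposes a candidate ...
    opt.tell(x, objective(x))  # ... caller reports the objective value

print(opt.best_x, opt.best_y)  # best point found, near (1, -2)
```

The benefit of the pattern is that evaluations can happen anywhere — in a batch job, behind a network call, or interleaved with other work — since the optimizer never calls the objective itself.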

The algorithms and methods implemented by or used in this package are:
|
|
The project is one of the better optimizers around, according to [benchmark].
|
† The contained algorithms seek to _minimize_ your objective `f(x)`.
If you instead need the _maximum_, simply minimize `-f(x)`. 💡
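The negation trick works with any minimizer unchanged: wrap the objective so it returns `-f(x)`, minimize that, and the minimizer's answer is the maximizer of `f`. A self-contained sketch, using a brute-force grid minimizer as a stand-in for any real optimizer:

```python
def minimize_grid(fun, lo, hi, steps=1000):
    """Brute-force 1-D minimizer, standing in for any real optimizer."""
    xs = [lo + (hi - lo) * i / steps for i in range(steps + 1)]
    return min(xs, key=fun)


def f(x):
    # Objective with its maximum f(3) = 5
    return -(x - 3) ** 2 + 5


# Maximize f by minimizing its negation:
x_best = minimize_grid(lambda x: -f(x), lo=0, hi=10)
print(x_best, f(x_best))  # prints: 3.0 5.0
```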
|
[gaussian process]: https://www.gaussianprocess.org/gpml/chapters/RW.pdf
[extra trees]: https://doi.org/10.1007/s10994-006-6226-1
|