* **function `sambo.minimize()`**, a near drop-in replacement for [`scipy.optimize.minimize()`][sp_opt_min] (usage sketched after this list),
* **class `Optimizer`** with an ask-and-tell user interface,
  supporting arbitrary scikit-learn-like surrogate models,
  with **Bayesian optimization estimators like [Gaussian processes] and [Extra Trees]**
  built in,
* **`SamboSearchCV`**, a much faster **drop-in replacement** for
  scikit-learn's **[`GridSearchCV`][skl_gridsearchcv]** and similar exhaustive
  machine-learning hyper-parameter tuning methods,
  but, compared to unpredictable stochastic methods, _informed_ (also sketched below).
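
Below is a minimal usage sketch of the first two interfaces. The parameter names and calling conventions are assumptions based on the "near drop-in" claim and on common ask-and-tell designs (e.g. scikit-optimize's), not a verbatim copy of this package's API; consult the API reference for the exact signatures.

```python
# Sketch only: assumes a scipy.optimize.minimize-like signature for
# sambo.minimize() and a conventional ask()/tell() loop for Optimizer.
from sambo import Optimizer, minimize

def objective(x):
    # Simple quadratic bowl with its minimum at (1, 2)
    return (x[0] - 1) ** 2 + (x[1] - 2) ** 2

# 1. One-shot interface, called like scipy.optimize.minimize()
result = minimize(objective, x0=[0, 0], bounds=[(-5, 5), (-5, 5)])
print(result.x, result.fun)      # expected to land near (1, 2)

# 2. Ask-and-tell interface: you drive the evaluation loop yourself,
#    useful when the objective is slow or evaluated elsewhere
opt = Optimizer(bounds=[(-5, 5), (-5, 5)])   # surrogate model is configurable
for _ in range(30):
    x = opt.ask()                # propose the next point to evaluate
    y = objective(x)             # evaluate it however/wherever you like
    opt.tell(x, y)               # feed the observation back to the surrogate
```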
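
Likewise, a sketch of `SamboSearchCV` standing in for `GridSearchCV`. Because it is described as a drop-in replacement, the constructor and `fit()` usage below simply mirror scikit-learn's; any additional, package-specific arguments are omitted here.

```python
# Sketch only: assumes SamboSearchCV accepts the same estimator/param_grid
# arguments as sklearn.model_selection.GridSearchCV, per the "drop-in" claim.
from sklearn.datasets import load_iris
from sklearn.svm import SVC

from sambo import SamboSearchCV

X, y = load_iris(return_X_y=True)
param_grid = {
    'C': [0.1, 1, 10, 100],
    'gamma': [0.001, 0.01, 0.1, 1],
}
search = SamboSearchCV(SVC(), param_grid)   # where you'd otherwise use GridSearchCV
search.fit(X, y)
print(search.best_params_, search.best_score_)
```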

The algorithms and methods implemented by or used in this package are:

* **[simplicial homology global optimization] (SHGO)**, customizing the [implementation from SciPy]
  (SciPy's own call is sketched after this list),
* **surrogate** machine learning **model**-based optimization,
* [shuffled complex evolution] (**SCE-UA** with improvements).
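
For orientation, this is how the upstream SciPy SHGO implementation referenced in the first item above is invoked on its own; this shows SciPy's API, not this package's.

```python
# SciPy's reference SHGO implementation, which the item above says this
# package customizes; shown here only as the upstream baseline.
from scipy.optimize import rosen, shgo

bounds = [(-2, 2), (-2, 2)]
result = shgo(rosen, bounds)
print(result.x, result.fun)   # the Rosenbrock function's global minimum is at (1, 1)
```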

[simplicial homology global optimization]: http://doi.org/10.1007/s10898-018-0645-y
[implementation from SciPy]: https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.shgo.html
[shuffled complex evolution]: https://doi.org/10.1007/BF00939380

This open-source project was **inspired by _scikit-optimize_**.
It is one of the better-performing optimizers available, according to
[benchmark results](https://sambo-optimization.github.io/#benchmark).

† The contained algorithms seek to _minimize_ your objective `f(x)`.
If you instead need the _maximum_, simply minimize `-f(x)`, as sketched below. 💡
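
For example, a tiny sketch of the negation trick, again assuming the scipy-like call signature used earlier; the exact `minimize()` parameters may differ.

```python
# Sketch: maximize f by minimizing its negation.
from sambo import minimize

def f(x):
    return -(x[0] - 1) ** 2       # concave, with its maximum at x = 1

result = minimize(lambda x: -f(x), x0=[0], bounds=[(-5, 5)])
print(result.x)                   # expected to be close to [1]
```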

[Gaussian processes]: https://www.gaussianprocess.org/gpml/chapters/RW.pdf
[Extra Trees]: https://doi.org/10.1007/s10994-006-6226-1
[kernel ridge regression]: https://scikit-learn.org/stable/modules/kernel_ridge.html
[sp_opt_min]: https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.minimize.html
[skl_gridsearchcv]: https://scikit-learn.org/stable/modules/generated/sklearn.model_selection.GridSearchCV.html