## Get Started
The best entry point to get familiar with `mlr3mbo` is the [Bayesian Optimization](https://mlr3book.mlr-org.com/chapters/chapter5/advanced_tuning_methods_and_black_box_optimization.html#sec-bayesian-optimization) chapter in the `mlr3book`.
## Design
* `AcqFunction`: Acquisition Function
* `AcqOptimizer`: Acquisition Function Optimizer
Based on these, Bayesian Optimization (BO) loops can be written; see, e.g., `bayesopt_ego` for sequential single-objective BO.
`mlr3mbo` also provides an `OptimizerMbo` class behaving like any other `Optimizer` from the [bbotk](https://cran.r-project.org/package=bbotk) package as well as
a `TunerMbo` class behaving like any other `Tuner` from the [mlr3tuning](https://cran.r-project.org/package=mlr3tuning) package.
See `?mbo_defaults` for more details.
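For instance, `TunerMbo` can be dropped into a standard `mlr3tuning` workflow. A minimal sketch (task, learner, resampling, and budget are chosen here purely for illustration and are not part of the original example):

```{r, message = FALSE}
library(mlr3)
library(mlr3tuning)
library(mlr3mbo)
library(mlr3learners)

# Tune the complexity parameter of a classification tree with MBO,
# exactly as one would with any other Tuner
instance = tune(
  tuner = tnr("mbo"),
  task = tsk("sonar"),
  learner = lrn("classif.rpart", cp = to_tune(1e-4, 1e-1, logscale = TRUE)),
  resampling = rsmp("cv", folds = 3),
  measure = msr("classif.ce"),
  term_evals = 10
)

instance$result
```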
## Simple Optimization Example
Minimize the two-dimensional Branin function via sequential BO, using a GP as surrogate and EI as acquisition function optimized via local search:
```{r, message = FALSE}
library(bbotk)
library(data.table)
library(mlr3mbo)
library(mlr3learners)
set.seed(1)

fun = function(xdt) {
  y = branin(xdt[["x1"]], xdt[["x2"]])
  data.table(y = y)
}

domain = ps(
  x1 = p_dbl(-5, 10),
  x2 = p_dbl(0, 15)
)

codomain = ps(
  y = p_dbl(tags = "minimize")
)

objective = ObjectiveRFunDt$new(
  fun = fun,
  domain = domain,
  codomain = codomain
)

instance = oi(
  objective = objective,
  terminator = trm("evals", n_evals = 25)
)

surrogate = srlrn(lrn("regr.km", control = list(trace = FALSE)))

# Remaining setup, a sketch using mlr3mbo's sugar functions:
# EI as acquisition function, optimized via local search
acq_function = acqf("ei")

acq_optimizer = acqo(
  opt("local_search"),
  terminator = trm("evals", n_evals = 2500)
)

optimizer = opt("mbo",
  loop_function = bayesopt_ego,
  surrogate = surrogate,
  acq_function = acq_function,
  acq_optimizer = acq_optimizer
)

optimizer$optimize(instance)
```
We can quickly visualize the contours of the objective function (on log scale) as well as the sampling behavior of our BO run (lighter blue colours indicating points that were evaluated in later stages of the optimization process; the first batch is given by the initial design).
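A sketch of such a plot using `ggplot2` (this code is our reconstruction, not the original example; it assumes `instance` holds the finished BO run from above and uses bbotk's `branin` test function):

```{r, message = FALSE}
library(ggplot2)
library(data.table)

# Grid over the domain for the contour lines of the Branin function (log scale)
grid = CJ(x1 = seq(-5, 10, length.out = 100), x2 = seq(0, 15, length.out = 100))
grid[, y := branin(x1, x2)]

# Evaluated points from the archive, coloured by the batch they were proposed in
archive = as.data.table(instance$archive)

ggplot() +
  geom_contour(aes(x = x1, y = x2, z = log(y)), data = grid, colour = "grey50", bins = 20) +
  geom_point(aes(x = x1, y = x2, colour = batch_nr), data = archive) +
  scale_colour_gradient(low = "darkblue", high = "lightblue") +
  labs(x = "x1", y = "x2", colour = "Batch") +
  theme_minimal()
```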