README.md: 18 additions & 18 deletions
@@ -28,10 +28,10 @@ Ideally, we would use the information from prior model evaluations to guide us in
 4. New parameter-score pairs are found
 5. Repeat steps 2-4 until some stopping criterion is met
 
-Graphical Intuition
--------------------
+Bayesian Optimization Intuition
+-------------------------------
 
-As an example, let's say we are only tuning 1 hyperparameter in an xgboost model, min\_child weight in (0,1). We have initialized the process by randomly sampling the scoring function 6 times, and get the following results:
+As an example, let's say we are tuning only 1 hyperparameter in an xgboost model, min\_child\_weight, within the bounds \[0,1\]. We have initialized the process by randomly sampling the scoring function 6 times, and get the following results:
 
 | min\_child\_weight| Score|
 |-------------------:|----------:|
@@ -155,24 +155,24 @@ The console informs us that the process initialized by running `scoringFunction`
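The numbered steps in the diff above (random initialization, surrogate fit, acquisition, new parameter-score pairs, repeat until a stopping criterion) can be sketched as a toy loop. This is a hedged illustration only: `scoring_function` is a made-up stand-in for a cross-validated xgboost score, and the inverse-distance surrogate is a simplification of the Gaussian process a real Bayesian optimization package would fit.

```python
import math
import random

def scoring_function(min_child_weight):
    # Hypothetical stand-in for a cross-validated model score; in practice
    # this would train e.g. an xgboost model and return its CV metric.
    return math.exp(-((min_child_weight - 0.4) ** 2) / 0.05)

def bayes_opt_sketch(bounds=(0.0, 1.0), n_init=6, n_iter=4, seed=0):
    rng = random.Random(seed)
    lo, hi = bounds
    # Step 1: initialize by randomly sampling the scoring function n_init times.
    samples = [(x := rng.uniform(lo, hi), scoring_function(x)) for _ in range(n_init)]
    for _ in range(n_iter):
        # Steps 2-3: toy surrogate + acquisition. A real implementation fits a
        # Gaussian process; here the mean is an inverse-distance-weighted average
        # of observed scores and the uncertainty is the distance to the nearest sample.
        def acquisition(x, kappa=1.0):
            dists = [abs(x - xs) + 1e-9 for xs, _ in samples]
            weights = [1.0 / d for d in dists]
            mean = sum(w * s for w, (_, s) in zip(weights, samples)) / sum(weights)
            return mean + kappa * min(dists)
        candidates = [rng.uniform(lo, hi) for _ in range(200)]
        x_next = max(candidates, key=acquisition)
        # Step 4: evaluate the chosen point to obtain a new parameter-score pair.
        samples.append((x_next, scoring_function(x_next)))
    # Step 5: the stopping criterion here is simply a fixed iteration budget.
    return max(samples, key=lambda p: p[1])

best_x, best_score = bayes_opt_sketch()
```

The acquisition's `kappa` term trades off exploitation (high surrogate mean) against exploration (distance from existing samples), mirroring the role an upper-confidence-bound acquisition plays over a real Gaussian process posterior.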