Commit 7392d23

Improved value fields and examples of exported functions
1 parent 0c7bef7 commit 7392d23

20 files changed: +342 -97 lines changed

R/addIterations.R

Lines changed: 33 additions & 15 deletions
@@ -15,21 +15,39 @@
 #' at each Epoch (optimization step). If running in parallel, good practice
 #' is to set \code{iters.k} to some multiple of the number of cores you have designated
 #' for this process. Must be lower than, and preferably some multiple of \code{iters.n}.
-#' @param otherHalting Same as bayesOpt()
-#' @param bounds Same as bayesOpt()
-#' @param acq Same as bayesOpt()
-#' @param kappa Same as bayesOpt()
-#' @param eps Same as bayesOpt()
-#' @param gsPoints Same as bayesOpt()
-#' @param convThresh Same as bayesOpt()
-#' @param acqThresh Same as bayesOpt()
-#' @param errorHandling Same as bayesOpt()
-#' @param saveFile Same as bayesOpt()
-#' @param parallel Same as bayesOpt()
-#' @param plotProgress Same as bayesOpt()
-#' @param verbose Same as bayesOpt()
-#' @param ... Same as bayesOpt()
-#' @return A \code{bayesOpt} object.
+#' @param otherHalting Same as \code{bayesOpt()}
+#' @param bounds Same as \code{bayesOpt()}
+#' @param acq Same as \code{bayesOpt()}
+#' @param kappa Same as \code{bayesOpt()}
+#' @param eps Same as \code{bayesOpt()}
+#' @param gsPoints Same as \code{bayesOpt()}
+#' @param convThresh Same as \code{bayesOpt()}
+#' @param acqThresh Same as \code{bayesOpt()}
+#' @param errorHandling Same as \code{bayesOpt()}
+#' @param saveFile Same as \code{bayesOpt()}
+#' @param parallel Same as \code{bayesOpt()}
+#' @param plotProgress Same as \code{bayesOpt()}
+#' @param verbose Same as \code{bayesOpt()}
+#' @param ... Same as \code{bayesOpt()}
+#' @return An object of class \code{bayesOpt} having run additional iterations.
+#' @examples
+#' scoringFunction <- function(x) {
+#'   a <- exp(-(2-x)^2)*1.5
+#'   b <- exp(-(4-x)^2)*2
+#'   c <- exp(-(6-x)^2)*1
+#'   return(list(Score = a+b+c))
+#' }
+#'
+#' bounds <- list(x = c(0,8))
+#'
+#' Results <- bayesOpt(
+#'     FUN = scoringFunction
+#'   , bounds = bounds
+#'   , initPoints = 3
+#'   , iters.n = 1
+#'   , gsPoints = 10
+#' )
+#' Results <- addIterations(Results,iters.n=1)
 #' @export
 addIterations <- function(
   optObj
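
The context lines above advise setting iters.k to a multiple of the designated core count when running in parallel. A minimal sketch of that usage, continuing from the Results object built in the example added above; the doParallel backend and the two-core cluster are assumptions, not something this commit sets up.

# Hypothetical parallel continuation of the example above (assumes doParallel).
library(doParallel)

cl <- makeCluster(2)    # two worker cores
registerDoParallel(cl)

# iters.k = 2 evaluates 2 points per Epoch (one per core); iters.n = 4 adds 4 iterations total.
Results <- addIterations(
    Results
  , iters.n = 4
  , iters.k = 2
  , parallel = TRUE
)

stopCluster(cl)
registerDoSEQ()         # return foreach to sequential execution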

R/bayesOpt.R

Lines changed: 16 additions & 1 deletion
@@ -85,7 +85,22 @@
 #' @param ... Other parameters passed to \code{DiceKriging::km()}. All FUN inputs and scores
 #' are scaled from 0-1 before being passed to km. FUN inputs are scaled within \code{bounds},
 #' and scores are scaled by 0 = min(scores), 1 = max(scores).
-#' @return A \code{bayesOpt} object, containing information about the process.
+#' @return An object of class \code{bayesOpt} containing information about the process.
+#' \itemize{
+#'   \item \code{FUN} The scoring function.
+#'   \item \code{bounds} The bounds originally supplied.
+#'   \item \code{iters} The total iterations that have been run.
+#'   \item \code{initPars} The initialization parameters.
+#'   \item \code{optPars} The optimization parameters.
+#'   \item \code{GauProList} A list containing information on the Gaussian Processes used in optimization.
+#'   \item \code{scoreSummary} A \code{data.table} with results from the execution of \code{FUN}
+#'     at different inputs. Includes information on the epoch, iteration, function inputs, score, and any other
+#'     information returned by \code{FUN}.
+#'   \item \code{stopStatus} Information on what caused the function to stop running. Possible explanations are
+#'     time limit, minimum utility not met, errors in \code{FUN}, iters.n was reached, or the Gaussian Process encountered
+#'     an error.
+#'   \item \code{elapsedTime} The total time in seconds the function was executing.
+#' }
 #' @references Jasper Snoek, Hugo Larochelle, Ryan P. Adams (2012) \emph{Practical Bayesian Optimization of Machine Learning Algorithms}
 #'
 #' @section Vignettes:
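
The new @return block above documents the components of the returned object. As a reader's aid (not part of the commit), here is a minimal sketch of inspecting those fields after a run, reusing the toy scoringFunction from the examples in this commit; the package name passed to library() is an assumption.

library(ParBayesianOptimization)  # assumed package name

scoringFunction <- function(x) {
  a <- exp(-(2 - x)^2) * 1.5
  b <- exp(-(4 - x)^2) * 2
  c <- exp(-(6 - x)^2) * 1
  list(Score = a + b + c)
}

Results <- bayesOpt(
    FUN = scoringFunction
  , bounds = list(x = c(0, 8))
  , initPoints = 3
  , iters.n = 2
  , gsPoints = 10
)

# Components described in the new @return documentation:
Results$scoreSummary  # data.table of every FUN evaluation (epoch, iteration, inputs, Score)
Results$stopStatus    # why the run stopped, e.g. iters.n was reached
Results$elapsedTime   # total execution time in seconds
Results$GauProList    # information on the fitted Gaussian Processes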

R/calcAcq.R

Lines changed: 0 additions & 13 deletions
@@ -1,17 +1,4 @@
-#' @title Calculate Acquisition Function
-#'
-#' @description
-#' Function to be Maximized
-#'
-#' @param par Parameter set to predict
-#' @param GPs an object of class gp
-#' @param GPe an object of class gp
-#' @param acq Acquisition function type to be used
-#' @param y_max The current maximum known value of the target utility function
-#' @param kappa tunable parameter kappa to balance exploitation against exploration
-#' @param eps tunable parameter epsilon to balance exploitation against exploration
 #' @importFrom stats dnorm pnorm predict
-#' @return The acquisition function value.
 #' @keywords internal
 calcAcq <- function(par, scoreGP, timeGP, acq, y_max, kappa, eps) {
 

R/changeSaveFile.R

Lines changed: 22 additions & 1 deletion
@@ -4,7 +4,28 @@
 #' @param optObj An object of class bayesOpt
 #' @param saveFile A filepath stored as a character. Must include the
 #' filename and extension as a .RDS.
-#' @return The same optObj with the updated saveFile.
+#' @return The same \code{optObj} with the updated saveFile.
+#' @examples
+#' \dontrun{
+#' scoringFunction <- function(x) {
+#'   a <- exp(-(2-x)^2)*1.5
+#'   b <- exp(-(4-x)^2)*2
+#'   c <- exp(-(6-x)^2)*1
+#'   return(list(Score = a+b+c))
+#' }
+#'
+#' bounds <- list(x = c(0,8))
+#'
+#' Results <- bayesOpt(
+#'     FUN = scoringFunction
+#'   , bounds = bounds
+#'   , initPoints = 3
+#'   , iters.n = 2
+#'   , gsPoints = 10
+#'   , saveFile = "filepath.RDS"
+#' )
+#' Results <- changeSaveFile(Results,saveFile = "DifferentFile.RDS")
+#' }
 #' @export
 changeSaveFile <- function(optObj,saveFile = NULL) {
 

R/getBestPars.R

Lines changed: 23 additions & 5 deletions
@@ -1,12 +1,30 @@
 #' Get the Best Parameter Set
 #'
-#' Returns the parameter set which resulted in the maximum score from \code{FUN}.
+#' Returns the N parameter sets which resulted in the maximum scores from \code{FUN}.
 #'
-#' If N > 1, a data.table with N rows is returned, order by score decreasing.
-#' If N = 1, a list of parameters is returned.
-#'
-#' @param optObj An object of class bayesOpt
+#' @param optObj An object of class \code{bayesOpt}
 #' @param N The number of parameter sets to return
+#' @return A list containing the \code{FUN} inputs which resulted in the highest returned Score.
+#' If N > 1, a \code{data.table} is returned. Each row is a result from \code{FUN}, with results ordered by
+#' descending Score.
+#' @examples
+#' scoringFunction <- function(x) {
+#'   a <- exp(-(2-x)^2)*1.5
+#'   b <- exp(-(4-x)^2)*2
+#'   c <- exp(-(6-x)^2)*1
+#'   return(list(Score = a+b+c))
+#' }
+#'
+#' bounds <- list(x = c(0,8))
+#'
+#' Results <- bayesOpt(
+#'     FUN = scoringFunction
+#'   , bounds = bounds
+#'   , initPoints = 3
+#'   , iters.n = 2
+#'   , gsPoints = 10
+#' )
+#' print(getBestPars(Results))
 #' @export
 getBestPars <- function(
   optObj
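
The new @return text distinguishes the single-result and N > 1 cases. A short hypothetical continuation of the example above (not part of this commit) illustrates both; N is the argument documented in the hunk above.

# Continuing from the Results object built in the example above.
getBestPars(Results)          # without N: a named list of the best FUN inputs
getBestPars(Results, N = 3)   # N > 1: a data.table, one row per result, ordered by descending Score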

R/getLocalOptimums.R

Lines changed: 19 additions & 1 deletion
@@ -7,7 +7,7 @@
 #' dbscan is then used to cluster points together which converged to the same
 #' optimum - only unique optimums are returned.
 #'
-#' @param optObj an object of class bayesOpt. The following parameters are all defaulted to
+#' @param optObj an object of class \code{bayesOpt}. The following parameters are all defaulted to
 #' the options provided in this object, but can be manually specified.
 #' @param bounds Same as in \code{bayesOpt()}
 #' @param acq Same as in \code{bayesOpt()}
@@ -20,6 +20,24 @@
 #' @return A data table of local optimums, including the utility (gpUtility), the
 #' utility relative to the max utility (relUtility), and the steps taken in the
 #' L-BFGS-B method (gradCount).
+#' @examples
+#' scoringFunction <- function(x) {
+#'   a <- exp(-(2-x)^2)*1.5
+#'   b <- exp(-(4-x)^2)*2
+#'   c <- exp(-(6-x)^2)*1
+#'   return(list(Score = a+b+c))
+#' }
+#'
+#' bounds <- list(x = c(0,8))
+#'
+#' Results <- bayesOpt(
+#'     FUN = scoringFunction
+#'   , bounds = bounds
+#'   , initPoints = 3
+#'   , iters.n = 2
+#'   , gsPoints = 10
+#' )
+#' print(getLocalOptimums(Results))
 #' @importFrom stats optim
 #' @importFrom data.table as.data.table
 #' @import foreach

R/plot.R

Lines changed: 20 additions & 1 deletion
@@ -8,7 +8,26 @@
 #' @importFrom ggplot2 ggplot aes_string xlab scale_color_discrete geom_point theme guides guide_legend margin element_text unit xlim ylab
 #' @importFrom ggpubr ggarrange annotate_figure text_grob
 #' @importFrom graphics plot
-#' @return an object of class ggarrange
+#' @return an object of class \code{ggarrange} from the \code{ggpubr} package.
+#' @examples
+#' scoringFunction <- function(x) {
+#'   a <- exp(-(2-x)^2)*1.5
+#'   b <- exp(-(4-x)^2)*2
+#'   c <- exp(-(6-x)^2)*1
+#'   return(list(Score = a+b+c))
+#' }
+#'
+#' bounds <- list(x = c(0,8))
+#'
+#' Results <- bayesOpt(
+#'     FUN = scoringFunction
+#'   , bounds = bounds
+#'   , initPoints = 3
+#'   , iters.n = 2
+#'   , gsPoints = 10
+#' )
+#' # This plot will also show in real time with parameter plotProgress = TRUE in bayesOpt()
+#' plot(Results)
 #' @export
 plot.bayesOpt <- function(x,...) {
 

R/updateGP.R

Lines changed: 24 additions & 2 deletions
@@ -2,13 +2,35 @@
 #'
 #' To save time, Gaussian processes are not updated after the last iteration
 #' in \code{addIterations()}. The user can do this manually, using this function
-#' if they wish.
+#' if they wish. This is not necessary to continue optimization using \code{addIterations}.
 #' @param optObj an object of class bayesOpt
 #' @param bounds The bounds to scale the parameters within.
 #' @param verbose Should the user be warned if the GP is already up to date?
 #' @param ... passed to \code{DiceKriging::km()}
 #' @importFrom DiceKriging km
-#' @return a \code{bayesOpt} object with updated Gaussian Processes.
+#' @return An object of class \code{bayesOpt} with updated Gaussian processes.
+#' @examples
+#' # Create initial object
+#' scoringFunction <- function(x) {
+#'   a <- exp(-(2-x)^2)*1.5
+#'   b <- exp(-(4-x)^2)*2
+#'   c <- exp(-(6-x)^2)*1
+#'   return(list(Score = a+b+c))
+#' }
+#'
+#' bounds <- list(x = c(0,8))
+#'
+#' Results <- bayesOpt(
+#'     FUN = scoringFunction
+#'   , bounds = bounds
+#'   , initPoints = 3
+#'   , iters.n = 2
+#'   , gsPoints = 10
+#' )
+#'
+#' # At this point, the Gaussian Process has not been updated
+#' # with the most recent results. We can update it manually:
+#' Results <- updateGP(Results)
 #' @export
 updateGP <- function(optObj,bounds = optObj$bounds,verbose = 1, ...) {
 

cran-comments.md

Lines changed: 7 additions & 1 deletion
@@ -14,4 +14,10 @@ There were no errors or notes. Only warnings explained that I am the maintainer
 There are no downstream dependencies.
 
 ## Changes
-Removed Plotly from dependencies. Some suggested packages are now used conditionally in vignettes, reade, tests and examples since they might not be available on all checking machines.
+#### Meta
+The package was removed from CRAN because a suggested package was not available on a checking machine, which threw a warning when the vignettes were built. Vignette and example execution is now conditional on the availability of the suggested packages. This doesn't affect the readability or educational value of the vignettes or examples.
+
+#### Documentation
+* Added missing value fields to the .Rd files of exported functions, and improved the documentation of existing value fields.
+* Added testable examples to all exported functions that were missing any.
+* Reset any options that were changed by vignettes.

man/addIterations.Rd

Lines changed: 34 additions & 15 deletions
This generated file is not rendered by default.
