#' [`Graph`] that is being wrapped. This [`Graph`] contains a trained state after `$train()`. Read-only.
#' * `pipeops` :: named `list` of [`PipeOp`] \cr
#' Contains all [`PipeOp`]s in the underlying [`Graph`], named by their `$id`s. Shortcut for `$graph_model$pipeops`. See [`Graph`] for details.
#' * `param_set` :: [`ParamSet`][paradox::ParamSet]\cr
#' Parameters of the underlying [`Graph`]. Shortcut for `$graph$param_set`. See [`Graph`] for details.
#' * `pipeops_param_set` :: named `list()`\cr
#' Named list containing the [`ParamSet`][paradox::ParamSet]s of all [`PipeOp`]s in the [`Graph`]. See there for details.
#' * `pipeops_param_set_values` :: named `list()`\cr
#' Named list containing the set parameter values of all [`PipeOp`]s in the [`Graph`]. See there for details.
#' * `internal_tuned_values` :: named `list()` or `NULL`\cr
#' The internal tuned parameter values collected from all [`PipeOp`]s.
#' `NULL` is returned if the learner is not trained or none of the wrapped learners supports internal tuning.
#' * `internal_valid_scores` :: named `list()` or `NULL`\cr
#' The internal validation scores as retrieved from the [`PipeOp`]s.
#' The names are prefixed with the respective IDs of the [`PipeOp`]s.
#' `NULL` is returned if the learner is not trained or none of the wrapped learners supports internal validation.
#' * `validate` :: `numeric(1)`, `"predefined"`, `"test"` or `NULL`\cr
#' How to construct the validation data. This also has to be configured for the individual [`PipeOp`]s such as
#' `PipeOpLearner`, see [`set_validate.GraphLearner`].
#' For more details on the possible values, see [`mlr3::Learner`].
#' * `marshaled` :: `logical(1)`\cr
@@ -75,6 +85,16 @@
#'
#' @section Methods:
#' Methods inherited from [`Learner`][mlr3::Learner], as well as:
#' * `ids(sorted = FALSE)` \cr
#' (`logical(1)`) -> `character` \cr
#' Get the IDs of all [`PipeOp`]s. They are returned in the order in which the [`PipeOp`]s were added if
#' `sorted` is `FALSE`, and topologically sorted if `sorted` is `TRUE`.
#' * `plot(html = FALSE, horizontal = FALSE)` \cr
#' (`logical(1)`, `logical(1)`) -> `NULL` \cr
#' Plot the [`Graph`], using either the \pkg{igraph} package (for `html = FALSE`, default) or
#' the \pkg{visNetwork} package (for `html = TRUE`), producing a [`htmlWidget`][htmlwidgets::htmlwidgets].
#' The [`htmlWidget`][htmlwidgets::htmlwidgets] can be rescaled using [`visOptions`][visNetwork::visOptions].
#' For `html = FALSE`, the orientation of the plotted graph can be controlled through `horizontal`.
#' * `marshal`\cr
#' (any) -> `self`\cr
#' Marshal the model.
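A minimal usage sketch of the accessors and methods documented above; the pipeline, task, and learner chosen here (`po("scale")`, `tsk("iris")`, `lrn("classif.rpart")`) are illustrative assumptions, not part of the documentation:

library(mlr3)
library(mlr3pipelines)

# illustrative pipeline: scale the features, then fit a classification tree
glrn = as_learner(po("scale") %>>% lrn("classif.rpart"))
glrn$train(tsk("iris"))

glrn$ids()                     # PipeOp ids in the order they were added
glrn$ids(sorted = TRUE)        # PipeOp ids in topological order
names(glrn$pipeops)            # the same PipeOps, as a named list
glrn$pipeops_param_set_values  # parameter values, grouped by PipeOp
glrn$param_set$values          # the same values, flattened with "<pipeop id>." prefixes
glrn$plot()                    # igraph plot; glrn$plot(html = TRUE) returns a visNetwork widget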
@@ -104,11 +124,11 @@
#' This works well for simple [`Graph`]s that do not modify features too much, but may give unexpected results for `Graph`s that
#' add new features or move information between features.
#'
#' As an example, consider a feature `A` with missing values, and a feature `B` that is used for imputation, using a [`po("imputelearner")`][PipeOpImputeLearner].
#' In a case where the following [`Learner`][mlr3::Learner] performs embedded feature selection and only selects feature `A`,
#' the `selected_features()` method could return only feature `A`, and `$importance()` may even report 0 for feature `B`.
#' This would not be entirely accurate when considering the entire `GraphLearner`, as feature `B` is used for imputation and would therefore have an impact on predictions.
#' The following should therefore only be used if the [`Graph`] is known to not have an impact on the relevant properties.
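A minimal sketch of the imputation scenario described above, assuming the `pima` task (which contains missing values) and `rpart`-based learners purely for illustration:

library(mlr3)
library(mlr3pipelines)

# impute missing numeric features with a regression tree, then fit a
# classification tree that performs embedded feature selection
glrn = as_learner(po("imputelearner", lrn("regr.rpart")) %>>% lrn("classif.rpart"))
glrn$train(tsk("pima"))

# reports only the features the final rpart model split on; features that were
# merely used upstream for imputation do not appear here, even though they
# still influence predictions
glrn$selected_features()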