miscellaneous functions for building predictive models, including
parameter tuning and variable importance measures.
In a similar spirit, package `r pkg("mlr3")` offers high-level interfaces to
various statistical and machine learning packages. Package
`r pkg("SuperLearner")` implements a similar toolbox.
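The high-level interface idea can be illustrated with a minimal mlr3 sketch; this assumes the `mlr3` package is installed (the task key `"iris"` and learner key `"classif.rpart"` ship with the core package):

```r
# Minimal sketch of mlr3's task/learner workflow (assumes mlr3 is installed)
library(mlr3)

task <- tsk("iris")              # built-in example classification task
learner <- lrn("classif.rpart")  # decision tree learner wrapping rpart
learner$train(task)              # fit the model on the task's data
pred <- learner$predict(task)    # predict on the same data
pred$score(msr("classif.acc"))   # in-sample accuracy
```

The same `train()`/`predict()`/`score()` pattern applies unchanged to any of the learners registered in the mlr3 ecosystem.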
[...]
package `r pkg("rminer")` interfaces several learning
algorithms implemented in other packages and computes several
performance measures. Package `r pkg("qeML")` provides wrappers to
numerous machine learning R packages with a simple, convenient, and
uniform interface, for both beginner and advanced operations, such as
`r pkg("FOCI")` and `r pkg("ncvreg")`.
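As a hedged sketch of qeML's uniform "quick-and-easy" call style: the function name `qeRF` and its `(data, yName)` signature are taken from qeML's documentation as recalled here, so treat the details as assumptions to verify:

```r
# Sketch of qeML's one-call interface (assumes qeML is installed;
# qeRF and its fields are assumptions based on the package docs)
library(qeML)

z <- qeRF(iris, "Species")   # random forest; holdout set chosen automatically
z$testAcc                    # holdout misclassification rate
predict(z, iris[1, -5])      # predict a single new case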
273
271
- *Visualisation (initially contributed by Brandon Greenwell)* The
`stats::termplot()` function can be used to plot the terms
[...]
constructed with the `partial()` function from the
`r pkg("pdp")` package.
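A short partial-dependence sketch, assuming the `pdp` and `randomForest` packages are installed:

```r
# Partial dependence of Ozone on Temp (sketch; assumes pdp + randomForest)
library(randomForest)
library(pdp)

fit <- randomForest(Ozone ~ ., data = airquality, na.action = na.omit)
pd <- partial(fit, pred.var = "Temp")  # average prediction over the Temp grid
plotPartial(pd)                        # lattice-based partial dependence plot
```

Equivalently, `partial(fit, pred.var = "Temp", plot = TRUE)` computes and plots in one call.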
303
301
- *XAI*: Most packages and functions from the last section "Visualisation"
belong to the field of explainable artificial intelligence (XAI).
The meta packages `r pkg("DALEX")` and `r pkg("iml")` offer different
methods to interpret any model, including partial dependence,
accumulated local effects, and permutation importance. Accumulated local
effects plots are also directly available in `r pkg("ALEPlot")`.
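A model-agnostic interpretation sketch with DALEX, assuming the `DALEX` and `ranger` packages are installed:

```r
# Model-agnostic interpretation via DALEX (sketch; assumes DALEX + ranger)
library(DALEX)
library(ranger)

fit <- ranger(Sepal.Length ~ ., data = iris)
expl <- explain(fit,
                data = iris[, -1],
                y = iris$Sepal.Length)           # wrap any fitted model
model_parts(expl)                                # permutation importance
model_profile(expl, variables = "Petal.Length")  # partial dependence profile
```

Because the explainer only needs a prediction function, data, and the target, the same calls work for models from essentially any package.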
309
307
SHAP (from *SH*apley *A*dditive ex*P*lanations) is one of the most
frequently used techniques to interpret ML models.
It decomposes predictions, in a fair way, into additive contributions
of the predictors. For tree-based models, the very fast TreeSHAP algorithm
exists. It is shipped directly with `r pkg("h2o")`, `r pkg("xgboost")`,
and `r pkg("lightgbm")`. Model-agnostic implementations of SHAP
are available in additional packages: `r pkg("fastshap")` mainly uses
Monte-Carlo sampling to approximate SHAP values, while `r pkg("shapr")` and
`r pkg("kernelshap")` provide implementations of KernelSHAP.
SHAP values from any of these packages can be plotted with the package `r pkg("shapviz")`.
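Combining the two steps, here is a hedged sketch that computes KernelSHAP values and plots them with shapviz; it assumes the `kernelshap`, `shapviz`, and `ranger` packages are installed:

```r
# KernelSHAP values plotted via shapviz (sketch; assumes kernelshap,
# shapviz, and ranger are installed)
library(kernelshap)
library(shapviz)
library(ranger)

fit <- ranger(Sepal.Length ~ ., data = iris)
X <- iris[, -1]                                 # feature matrix to explain
ks <- kernelshap(fit, X, bg_X = X[1:50, ],      # background data for Shapley values
                 pred_fun = function(m, x) predict(m, x)$predictions)
sv <- shapviz(ks)                               # wrap for plotting
sv_importance(sv)                               # SHAP-based importance plot
sv_dependence(sv, "Petal.Length")               # SHAP dependence plot
```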
A port to Python's "shap" package is provided in `r pkg("shapper")`.
Alternative decompositions of predictions are implemented in