# Example Usage

Let's build a supervised recursive feature eliminator with `RandomForestRegressor`
from DecisionTree.jl as our base model.

But first we need a dataset to train on. We shall create a synthetic dataset, popularly
known in the R community as the Friedman #1 dataset. Notice how the target vector for this
dataset depends on only the first five columns of the feature table. So we expect that our
recursive feature elimination should return the first five columns as important features.

```julia
using MLJ # or, minimally, `using FeatureSelection, MLJModels, MLJBase`
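# What follows is a minimal sketch of the workflow described above. The
# `RecursiveFeatureElimination` constructor and its `n_features` keyword are written
# as we understand the FeatureSelection.jl API; consult the package docstrings if
# they differ.
import Random

# Friedman #1 synthetic dataset: ten feature columns, but the target depends only
# on the first five.
rng = Random.MersenneTwister(10)
A = rand(rng, 50, 10)
X = MLJ.table(A)   # feature table with columns :x1, ..., :x10
y = 10 .* sin.(pi .* A[:, 1] .* A[:, 2]) .+
    20 .* (A[:, 3] .- 0.5).^2 .+
    10 .* A[:, 4] .+ 5 .* A[:, 5]

# Load the random forest base model from DecisionTree.jl via its MLJ interface
# (MLJDecisionTreeInterface must be in your environment) and wrap it in a
# recursive feature eliminator that keeps five features.
RandomForestRegressor = @load RandomForestRegressor pkg=DecisionTree
forest = RandomForestRegressor(rng=rng)
rfe = RecursiveFeatureElimination(forest, n_features=5)

# Fit as an ordinary MLJ machine; the report should list the retained features,
# which we expect to be :x1 through :x5.
mach = machine(rfe, X, y)
fit!(mach)
report(mach)
```

Any other supervised MLJ regressor should be usable as the base model in place of the
random forest, provided it reports feature importances, since the elimination ranks
features by those importances at each step.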