
Commit d2e41a5

OkonSamuel and ablaom authored
Apply suggestions from code review
Co-authored-by: Anthony Blaom, PhD <[email protected]>
1 parent 9351dcd commit d2e41a5

File tree: 1 file changed (+4 −2 lines)


README.md

Lines changed: 4 additions & 2 deletions
@@ -18,11 +18,12 @@ Pkg.add("FeatureSelection")
 
 # Example Usage
 Lets build a supervised recursive feature eliminator with `RandomForestRegressor` from `MLJDecisionTreeInterface` as our base model.
-But first we need a dataset to train on. We shall create a synthetic dataset popularly known in the R community as the friedman dataset#1. Notice how the tart vector for this dataset depends on only the first
+But first we need a dataset to train on. We shall create a synthetic dataset popularly known in the R community as the friedman dataset#1. Notice how the target vector for this dataset depends on only the first
 five columns of feature table. So we expect that our recursive feature elimination should return the first
 columns as important features.
 ```julia
-using FeatureSelection, MLJ, StableRNGs, MLJDecisionTreeInterface
+using MLJ # or, minimally, `using FeatureSelection, MLJModels, MLJBase`
+using StableRNGs
 rng = StableRNG(123)
 A = rand(rng, 50, 10)
 X = MLJ.table(A) # features
@@ -32,6 +33,7 @@ y = @views(
 ```
 Now we that we have our data we can create our recursive feature elimination model and train it on our dataset
 ```julia
+RandomForestRegressor = @load RandomForestRegressor pkg=DecisionTree
 forest = RandomForestRegressor()
 rfe = RecursiveFeatureElimination(
     model = forest, n_features_to_select=5, step=1
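
For readers who want to run the post-commit example end to end, here is a minimal sketch. The target definition is truncated in the second hunk header (`y = @views(`), so the standard noiseless Friedman #1 formula is assumed in its place, and the concluding `machine`/`fit!` lines follow the usual MLJ workflow rather than anything shown in this diff.

```julia
using MLJ # or, minimally, `using FeatureSelection, MLJModels, MLJBase`
using StableRNGs

rng = StableRNG(123)
A = rand(rng, 50, 10)
X = MLJ.table(A) # features

# Assumed: the standard Friedman #1 target, which depends only on the
# first five columns (the diff truncates this definition at `y = @views(`)
y = @views(
    10 .* sin.(pi .* A[:, 1] .* A[:, 2]) .+
    20 .* (A[:, 3] .- 0.5) .^ 2 .+
    10 .* A[:, 4] .+ 5 .* A[:, 5]
) # target

RandomForestRegressor = @load RandomForestRegressor pkg=DecisionTree
forest = RandomForestRegressor()
rfe = RecursiveFeatureElimination(
    model = forest, n_features_to_select=5, step=1
)

# Standard MLJ workflow (not part of this diff): bind the model and data
# in a machine, fit it, and inspect the result
mach = machine(rfe, X, y)
fit!(mach)
report(mach)
```

If the elimination behaves as the README predicts, the features reported as retained should correspond to the first five columns of the feature table.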
