
Commit eed3af1

OkonSamuel and ablaom authored
Apply suggestions from code review
Co-authored-by: Anthony Blaom, PhD <[email protected]>
1 parent 81b90de commit eed3af1

1 file changed: +4 -3 lines changed


README.md

Lines changed: 4 additions & 3 deletions
@@ -46,8 +46,7 @@ rfe = RecursiveFeatureElimination(
 mach = machine(rfe, X, y)
 fit!(mach)
 ```
-If we wish, we can get the feature importance scores, either by inspecting `report(mach)`
-or calling the `feature_importances` function on the fitted machine as shown below
+We can inspect the feature importances in two ways:
 ```julia
 report(mach).ranking # returns [1.0, 1.0, 1.0, 1.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
 feature_importances(mach) # returns dict of feature => rank pairs
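
For readers landing on this commit without the surrounding README, here is a minimal sketch of the workflow the hunk above is editing. It assumes FeatureSelection.jl's `RecursiveFeatureElimination` wrapped around a random-forest regressor; the synthetic data and the `model=` keyword are illustrative assumptions, not the README's exact setup:

```julia
using MLJ  # RecursiveFeatureElimination may also require `using FeatureSelection`, depending on the MLJ version

# Illustrative synthetic data (an assumption; the README uses its own dataset):
X = MLJ.table(rand(100, 10))
y = rand(100)

# Wrap a supervised model in the recursive feature elimination wrapper:
RandomForestRegressor = @load RandomForestRegressor pkg=DecisionTree
rfe = RecursiveFeatureElimination(model=RandomForestRegressor())  # `model=` keyword assumed

mach = machine(rfe, X, y)
fit!(mach)

# The two ways of inspecting feature importances mentioned in the hunk above:
report(mach).ranking       # elimination rank of each feature
feature_importances(mach)  # feature => rank pairs
```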
@@ -93,5 +92,7 @@ predict(self_tuning_rfe_mach, Xnew)
 In this case, prediction is done using the best recursive feature elimination model gotten
 from the tuning process above.
 
-For more information various cross-validation strategies and `TunedModel` see
+For resampling methods different from cross-validation, and for other
+`TunedModel` options, such as parallelization, see the
+[Tuning Models](https://alan-turing-institute.github.io/MLJ.jl/dev/tuning_models/) section of the MLJ manual.
 [MLJ Documentation](https://alan-turing-institute.github.io/MLJ.jl/dev/)
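
Continuing the sketch above, one plausible way to obtain the `self_tuning_rfe_mach` referred to in this hunk is to wrap `rfe` in a `TunedModel` and tune `n_features` by cross-validation. The grid, range, and measure below are assumptions for illustration, not the README's exact choices:

```julia
using MLJ

# Assumed hyperparameter range over the number of features to retain:
r = range(rfe, :n_features, values=1:10)

self_tuning_rfe = TunedModel(
    model=rfe,
    tuning=Grid(),
    resampling=CV(nfolds=5),       # any MLJ resampling strategy can be used here, e.g. Holdout()
    range=r,
    measure=rms,
    acceleration=CPUThreads(),     # one of the parallelization options the new README text alludes to
)

self_tuning_rfe_mach = machine(self_tuning_rfe, X, y)
fit!(self_tuning_rfe_mach)

Xnew = MLJ.table(rand(3, 10))        # new observations with the same schema as `X`
predict(self_tuning_rfe_mach, Xnew)  # predicts using the best model found during tuning
```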
