
Commit b83b18f

orthogonal doc string updates (while I'm here)
1 parent a80f7a3 commit b83b18f

File tree

1 file changed: +48 −10 lines changed


src/MLJDecisionTreeInterface.jl

Lines changed: 48 additions & 10 deletions
@@ -585,7 +585,7 @@ where
 Train the machine using `fit!(mach, rows=...)`.


-# Hyper-parameters
+# Hyperparameters


 - `max_depth=-1`: max depth of the decision tree (-1=any)
@@ -650,6 +650,11 @@ The fields of `report(mach)` are:
 - `features`: the names of the features encountered in training, in an
   order consistent with the output of `print_tree` (see below)

+# Accessor functions
+
+- `feature_importances(mach)` returns a vector of `(feature::Symbol => importance)` pairs;
+  the type of importance is determined by the hyperparameter `feature_importance` (see
+  above)

 # Examples
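For reference, a minimal sketch of the accessor this new docstring section advertises, applied to the classifier (assumes MLJ and MLJDecisionTreeInterface are installed; not part of this commit):

```
using MLJ
Tree = @load DecisionTreeClassifier pkg=DecisionTree
X, y = @load_iris                  # feature table and class labels, shipped with MLJ
mach = machine(Tree(), X, y) |> fit!
feature_importances(mach)          # vector of `feature::Symbol => importance` pairs
```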
@@ -718,7 +723,7 @@ where
 Train the machine with `fit!(mach, rows=...)`.


-# Hyper-parameters
+# Hyperparameters


 - `max_depth=-1`: max depth of the decision tree (-1=any)
@@ -763,6 +768,13 @@ The fields of `fitted_params(mach)` are:
 - `features`: the names of the features encountered in training


+# Accessor functions
+
+- `feature_importances(mach)` returns a vector of `(feature::Symbol => importance)` pairs;
+  the type of importance is determined by the hyperparameter `feature_importance` (see
+  above)
+
+
 # Examples

 ```
@@ -819,7 +831,7 @@ where:
 Train the machine with `fit!(mach, rows=...)`.


-# Hyper-parameters
+# Hyperparameters


 - `n_iter=10`: number of iterations of AdaBoost
@@ -853,6 +865,15 @@ The fields of `fitted_params(mach)` are:
 - `features`: the names of the features encountered in training


+# Accessor functions
+
+- `feature_importances(mach)` returns a vector of `(feature::Symbol => importance)` pairs;
+  the type of importance is determined by the hyperparameter `feature_importance` (see
+  above)
+
+
+# Examples
+
 ```
 using MLJ
 Booster = @load AdaBoostStumpClassifier pkg=DecisionTree
@@ -871,6 +892,7 @@ pdf.(yhat, "virginica") # probabilities for the "verginica" class

 fitted_params(mach).stumps # raw `Ensemble` object from DecisionTree.jl
 fitted_params(mach).coefs # coefficient associated with each stump
+feature_importances(mach)
 ```

 See also
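A hypothetical follow-on (variable names invented, not part of the diff) showing one way the returned pairs might be post-processed:

```
imps = feature_importances(mach)    # Vector of Pair{Symbol, <:Real}
sort!(imps, by=last, rev=true)      # most important feature first
top2 = first.(imps[1:min(2, end)])  # Symbols of the two strongest features
```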
@@ -905,7 +927,7 @@ where
 Train the machine with `fit!(mach, rows=...)`.


-# Hyper-parameters
+# Hyperparameters


 - `max_depth=-1`: max depth of the decision tree (-1=any)
@@ -949,6 +971,13 @@ The fields of `fitted_params(mach)` are:
 - `features`: the names of the features encountered in training


+# Accessor functions
+
+- `feature_importances(mach)` returns a vector of `(feature::Symbol => importance)` pairs;
+  the type of importance is determined by the hyperparameter `feature_importance` (see
+  above)
+
+
 # Examples

 ```
@@ -1012,24 +1041,25 @@ where
 Train the machine with `fit!(mach, rows=...)`.


-# Hyper-parameters
+# Hyperparameters

-- `max_depth=-1`: max depth of the decision tree (-1=any)
+- `max_depth=-1`: max depth of the decision tree (-1=any)

-- `min_samples_leaf=1`: min number of samples each leaf needs to have
+- `min_samples_leaf=1`: min number of samples each leaf needs to have

-- `min_samples_split=2`: min number of samples needed for a split
+- `min_samples_split=2`: min number of samples needed for a split

 - `min_purity_increase=0`: min purity needed for a split

 - `n_subfeatures=-1`: number of features to select at random (0 for all,
   -1 for square root of number of features)

-- `n_trees=10`: number of trees to train
+- `n_trees=10`: number of trees to train

 - `sampling_fraction=0.7` fraction of samples to train each tree on

-- `feature_importance`: method to use for computing feature importances. One of `(:impurity, :split)`
+- `feature_importance`: method to use for computing feature importances. One of
+  `(:impurity, :split)`

 - `rng=Random.GLOBAL_RNG`: random number generator or seed
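To illustrate the `feature_importance` hyperparameter documented above (a hedged sketch, not from the commit; data sizes arbitrary):

```
using MLJ
Forest = @load RandomForestRegressor pkg=DecisionTree
forest = Forest(n_trees=50, feature_importance=:split)  # or :impurity
X, y = make_regression(100, 4)       # synthetic regression data
mach = machine(forest, X, y) |> fit!
feature_importances(mach)            # importances computed by the :split method
```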
@@ -1052,6 +1082,13 @@ The fields of `fitted_params(mach)` are:
 - `features`: the names of the features encountered in training


+# Accessor functions
+
+- `feature_importances(mach)` returns a vector of `(feature::Symbol => importance)` pairs;
+  the type of importance is determined by the hyperparameter `feature_importance` (see
+  above)
+
+
 # Examples

 ```
@@ -1066,6 +1103,7 @@ Xnew, _ = make_regression(3, 2)
 yhat = predict(mach, Xnew) # new predictions

 fitted_params(mach).forest # raw `Ensemble` object from DecisionTree.jl
+feature_importances(mach)
 ```

 See also
