@@ -456,7 +456,8 @@ Train the machine using `fit!(mach, rows=...)`.
 
 - `display_depth=5`: max depth to show when displaying the tree
 
-- `feature_importance`: method to use for computing feature importances. One of `(:impurity, :split)`
+- `feature_importance`: method to use for computing feature importances. One of `(:impurity,
+  :split)`
 
 - `rng=Random.GLOBAL_RNG`: random number generator or seed
 
@@ -591,7 +592,8 @@ Train the machine with `fit!(mach, rows=...)`.
 
 - `sampling_fraction=0.7` fraction of samples to train each tree on
 
-- `feature_importance`: method to use for computing feature importances. One of `(:impurity, :split)`
+- `feature_importance`: method to use for computing feature importances. One of `(:impurity,
+  :split)`
 
 - `rng=Random.GLOBAL_RNG`: random number generator or seed
 
@@ -613,6 +615,11 @@ The fields of `fitted_params(mach)` are:
 - `forest`: the `Ensemble` object returned by the core DecisionTree.jl algorithm
 
 
+# Report
+
+- `features`: the names of the features encountered in training
+
+
 # Examples
 
 ```
@@ -632,6 +639,11 @@ predict_mode(mach, Xnew) # point predictions
 pdf.(yhat, "virginica") # probabilities for the "virginica" class
 
 fitted_params(mach).forest # raw `Ensemble` object from DecisionTree.jl
+
+feature_importances(mach) # `:impurity` feature importances
+forest.feature_importance = :split
+fit!(mach); feature_importances(mach) # `:split` feature importances
+
 ```
 See also
 [DecisionTree.jl](https://github.com/bensadeghi/DecisionTree.jl) and
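For reference, the feature-importance workflow shown in the hunk above can be run end to end. A minimal sketch, assuming MLJ and MLJDecisionTreeInterface are installed and reusing the iris setup from the surrounding docstring example:

```julia
using MLJ

# Load the model type and fit a forest on the iris data, as in the docstring example.
Forest = @load RandomForestClassifier pkg=DecisionTree
forest = Forest()
X, y = @load_iris
mach = machine(forest, X, y) |> fit!

feature_importances(mach)  # `:impurity` importances (the default)

# Switching the method requires a refit, since importances
# are computed at fit time and stored in the machine's report.
forest.feature_importance = :split
fit!(mach)
feature_importances(mach)  # `:split` importances
```

Note that mutating a hyperparameter of `forest` does not by itself update `mach`; the second `fit!` call is what recomputes the importances.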
@@ -692,6 +704,12 @@ The fields of `fitted_params(mach)` are:
 
 - `coefficients`: the stump coefficients (one per stump)
 
+
+# Report
+
+- `features`: the names of the features encountered in training
+
+
 ```
 using MLJ
 Booster = @load AdaBoostStumpClassifier pkg=DecisionTree
@@ -781,6 +799,11 @@ The fields of `fitted_params(mach)` are:
   DecisionTree.jl algorithm
 
 
+# Report
+
+- `features`: the names of the features encountered in training
+
+
 # Examples
 
 ```
@@ -864,6 +887,11 @@ The fields of `fitted_params(mach)` are:
 - `forest`: the `Ensemble` object returned by the core DecisionTree.jl algorithm
 
 
+# Report
+
+- `features`: the names of the features encountered in training
+
+
 # Examples
 
 ```
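The `Report` entry added across these hunks is accessed with MLJ's `report` function on a trained machine. A minimal sketch, assuming the same packages as above; the choice of `DecisionTreeClassifier` here is illustrative, since the entry is documented for several of the models touched by this diff:

```julia
using MLJ

# Fit any of the models whose docstrings gain the `Report` section.
Tree = @load DecisionTreeClassifier pkg=DecisionTree
X, y = @load_iris
mach = machine(Tree(), X, y) |> fit!

report(mach).features  # names of the features encountered in training
```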