
Commit 8c48764: Move Stacking documentation
1 parent c5cf826

File tree: 5 files changed (+41, -0 lines)


doc/visual-programming/source/index.rst

Lines changed: 1 addition & 0 deletions

@@ -96,6 +96,7 @@ Model
       widgets/model/adaboost
       widgets/model/neuralnetwork
       widgets/model/stochasticgradient
+      widgets/model/stacking
       widgets/model/loadmodel
       widgets/model/savemodel

3 binary image files added (2.24 KB, 146 KB, 7.71 KB)
Lines changed: 40 additions & 0 deletions

@@ -0,0 +1,40 @@
Stacking
========

Stack multiple models.

Inputs
    Data
        input dataset
    Preprocessor
        preprocessing method(s)
    Learners
        learning algorithm(s)
    Aggregate
        model aggregation method

Outputs
    Learner
        aggregated (stacked) learning algorithm
    Model
        trained model


**Stacking** is an ensemble method that computes a meta model from several base models. The **Stacking** widget has an **Aggregate** input, which provides the method for aggregating the input models. If no aggregation input is given, the default methods are used: **Logistic Regression** for classification and **Ridge Regression** for regression problems.

.. figure:: images/Stacking-stamped.png
   :scale: 50%

1. The meta learner can be given a name under which it will appear in other widgets. The default name is "Stack".
2. Click *Apply* to commit the aggregated model. This puts the new learner on the output and, if training examples are given, also constructs a new model and outputs it as well. To communicate changes automatically, tick *Apply Automatically*.
3. Access help and produce a report.

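As a rough illustration of the aggregation idea, here is a minimal pure-Python sketch. This is not Orange's implementation and all function names are hypothetical; base k-NN models are weighted by their accuracy on a held-out split, which is a simplified stand-in for fitting a logistic-regression meta model on the base predictions::

    import math
    from collections import Counter

    def knn_predict(train, query, k):
        """Majority label among the k nearest training points (Euclidean)."""
        nearest = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
        return Counter(label for _, label in nearest).most_common(1)[0][0]

    def fit_meta_weights(train, holdout, ks):
        """The 'meta' step: weight each base model by its held-out accuracy.

        A real stacker would instead fit a meta model (e.g. logistic
        regression) on the base models' predictions; accuracy weighting
        keeps the sketch short.
        """
        return [sum(knn_predict(train, x, k) == y for x, y in holdout)
                / len(holdout) for k in ks]

    def stack_predict(train, weights, query, ks):
        """Aggregate the base predictions with the learned weights."""
        votes = {}
        for w, k in zip(weights, ks):
            pred = knn_predict(train, query, k)
            votes[pred] = votes.get(pred, 0.0) + w
        return max(votes, key=votes.get)

Each training point is a ``(features, label)`` pair; ``ks`` lists the neighbor counts of the base models, mirroring the three-kNN example below.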
Example
-------

We will use **Paint Data** to demonstrate how the widget is used. We painted a complex dataset with 4 class labels and sent it to **Test & Score**. We also provided three **kNN** learners, each with different parameters (number of neighbors set to 5, 10, or 15). Evaluation results are good, but can we do better?

Let's use **Stacking**. **Stacking** requires several learners on its input and an aggregation method; in our case, this is **Logistic Regression**. The constructed meta learner is then sent to **Test & Score**. The results have improved, even if only marginally. **Stacking** normally works well on complex datasets.

.. figure:: images/Stacking-Example.png
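A similar workflow can be reproduced in code outside the widget canvas. Below is a hedged sketch, assuming scikit-learn is available, with a synthetic 4-class blob dataset standing in for the painted data: three k-NN base learners with k = 5, 10, 15 are stacked under a logistic-regression aggregator and cross-validated::

    from sklearn.datasets import make_blobs
    from sklearn.ensemble import StackingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier

    # Synthetic 4-class dataset standing in for the painted data.
    X, y = make_blobs(n_samples=300, centers=4, cluster_std=2.0,
                      random_state=0)

    # Three k-NN base learners, as in the example above.
    base = [(f"knn{k}", KNeighborsClassifier(n_neighbors=k))
            for k in (5, 10, 15)]

    # Logistic regression as the aggregation (meta) model.
    stack = StackingClassifier(
        estimators=base,
        final_estimator=LogisticRegression(max_iter=1000))

    scores = cross_val_score(stack, X, y, cv=5)
    print(round(scores.mean(), 3))

Comparing this cross-validated score against each base k-NN alone plays the role of **Test & Score** in the workflow.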
