Commit d828f20

Merge pull request #16 from ngoix/references
links SkopeRules to existing literature
2 parents: b78ef9d + 9bb24e1


README.md

Lines changed: 21 additions & 4 deletions
@@ -29,12 +29,29 @@ class, i.e. detecting with high precision instances of this class.
 
 See the `AUTHORS.rst <AUTHORS.rst>`_ file for a list of contributors.
 
+Links with existing literature
+------------------------------
 
-Installation
-------------
+The main advantage of decision rules is that they offer interpretable models. The problem of generating such rules has been widely considered in machine learning, see e.g. RuleFit [1], Slipper [2], LRI [3], MLRules [4].
+
+A decision rule is a logical expression of the form "IF conditions THEN response". In a binary classification setting, if an instance satisfies the conditions of the rule, it is assigned to one of the two classes. If it does not satisfy the conditions, it remains unassigned.
+
+1) In [2, 3, 4], rule induction is done by considering each decision rule as a base classifier in an ensemble, which is built by greedily minimizing a loss function.
+
+2) In [1], rules are extracted from an ensemble of trees; a weighted combination of these rules is then built by solving an L1-regularized optimization problem over the weights, as described in [5].
+
+In this package, we use approach 2). Rules are extracted from a tree ensemble, which allows us to take advantage of existing fast algorithms for building such ensembles. Rules that are too similar or duplicated are then removed.
+The main goal of this package is to provide rules satisfying precision and recall conditions. It still implements a scoring (`decision_function`) method, but one that does not solve the L1-regularized optimization problem of [1]. Instead, weights are simply proportional to the out-of-bag (OOB) precision associated with each rule.
+
+
+[1] Friedman and Popescu, Predictive learning via rule ensembles, Technical Report, 2005.
+[2] Cohen and Singer, A simple, fast, and effective rule learner, National Conference on Artificial Intelligence, 1999.
+[3] Weiss and Indurkhya, Lightweight rule induction, ICML, 2000.
+[4] Dembczyński, Kotłowski and Słowiński, Maximum Likelihood Rule Ensembles, ICML, 2008.
+[5] Friedman and Popescu, Gradient directed regularization, Technical Report, 2004.
 
 Dependencies
-~~~~~~~~~~~~
+------------
 
 skope-rules requires:
 
@@ -47,7 +64,7 @@ skope-rules requires:
 For running the examples Matplotlib >= 1.1.1 is required.
 
 Installation
-~~~~~~~~~~~~~~~~~
+------------
 
 You can get the latest sources with the command::
 
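The README text added in this commit describes the workflow: candidate rules are extracted from a tree ensemble, near-duplicates are removed, only rules meeting the precision and recall thresholds are kept, and `decision_function` scores samples with weights proportional to each rule's OOB precision. The following minimal sketch illustrates that workflow; it assumes the package is importable as `skrules` and that the constructor parameters shown (`feature_names`, `precision_min`, `recall_min`, `n_estimators`, `random_state`) match the current SkopeRules API. The toy data and threshold values are purely illustrative::

    # Minimal usage sketch (assumed API; parameter names may differ slightly).
    import numpy as np
    from skrules import SkopeRules

    rng = np.random.RandomState(42)

    # Toy binary data: the positive class sits in one corner of the unit square,
    # so short "IF ... THEN ..." rules can isolate it with high precision.
    X = rng.uniform(size=(1000, 2))
    y = ((X[:, 0] > 0.7) & (X[:, 1] > 0.6)).astype(int)

    clf = SkopeRules(
        feature_names=["f0", "f1"],  # names used when rendering rule strings
        precision_min=0.5,           # keep rules whose OOB precision is >= 0.5
        recall_min=0.01,             # ... and whose OOB recall is >= 0.01
        n_estimators=30,             # size of the underlying tree ensemble
        random_state=42,
    )
    clf.fit(X, y)

    # Selected rules are human-readable conditions such as "f0 > 0.69 and f1 > 0.59",
    # stored together with their estimated performance statistics.
    print(clf.rules_)

    # decision_function aggregates the rules firing on each sample, weighting each
    # rule by its OOB precision rather than by L1-regularized weights as in [1].
    print(clf.decision_function(X[:5]))

Because the score is just a precision-weighted vote over the selected rules, every prediction can be traced back to the explicit conditions that fired, which is the interpretability argument made in the section above.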