
Commit fe223d5

Update

1 parent 3000075 commit fe223d5

File tree: 2 files changed, +2 -2 lines changed


docs/_build/html/_sources/guides/explain/shap.rst.txt

Lines changed: 1 addition & 1 deletion
@@ -34,7 +34,7 @@ The exact solution is obtained using the Shapley value formula, which requires e

 .. math::

     \begin{align}
-    \phi_{i}= \sum_{S \subseteq \{1, \ldots, p\} \{ i \}} \frac{|S|!(p-|S|-1)!}{p!}(val(S \cup \{i\}) - val(S)). \tag{2}
+    \phi_{i}= \sum_{S \subseteq \{1, \ldots, p\} \backslash \{ i \}} \frac{|S|!(p-|S|-1)!}{p!}(val(S \cup \{i\}) - val(S)). \tag{2}
     \end{align}

 where :math:`val` is the value function that returns the prediction of each coalition. The marginal contribution of feature :math:`i` to the coalition :math:`S` is calculated as the difference between the value of the coalition with the addition of feature :math:`i` and the value of the original coalition, i.e., :math:`val(S \cup \{i\}) - val(S)`. The term :math:`\frac{|S|!(p-|S|-1)!}{p!}` is a normalization factor. When the number of features is small, this exact estimation approach is acceptable. However, as the number of features increases, the exact solution may become problematic.
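The corrected formula above (note the set difference `\backslash \{i\}`: the sum runs over all coalitions that exclude feature :math:`i`) can be verified with a small brute-force computation. The sketch below is illustrative only: `exact_shapley` and the toy additive value function are assumptions for this example, not part of the documented library. For an additive value function, each Shapley value should recover that feature's individual effect, and the cost is exponential in `p`, which is exactly why the docs call the exact solution problematic for many features.

```python
# Illustrative brute-force Shapley computation following Eq. (2) in the diff.
# All names here (exact_shapley, effects, val) are hypothetical.
from itertools import combinations
from math import factorial

def exact_shapley(val, p):
    """Return phi_i for every feature i in {0, ..., p-1}.

    val: callable mapping a frozenset coalition S to a real number (val(S)).
    p:   total number of features.
    """
    phis = []
    for i in range(p):
        others = [j for j in range(p) if j != i]  # S ranges over subsets excluding i
        phi = 0.0
        for size in range(len(others) + 1):
            for S in combinations(others, size):
                S = frozenset(S)
                # Normalization factor |S|!(p-|S|-1)!/p! from Eq. (2)
                weight = factorial(len(S)) * factorial(p - len(S) - 1) / factorial(p)
                # Marginal contribution val(S ∪ {i}) - val(S)
                phi += weight * (val(S | {i}) - val(S))
        phis.append(phi)
    return phis

# Toy additive value function: val(S) is the sum of fixed per-feature effects.
effects = [1.0, 2.0, 3.0]
val = lambda S: sum(effects[j] for j in S)
print(exact_shapley(val, 3))  # approximately [1.0, 2.0, 3.0]
```

Because the inner loop enumerates all :math:`2^{p-1}` coalitions per feature, this approach is only practical for small :math:`p`; sampling-based approximations are the usual remedy.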

docs/_build/html/guides/explain/shap.html

Lines changed: 1 addition & 1 deletion
@@ -230,7 +230,7 @@ <h3><span class="section-number">4.2.2.1.1. </span>Exact Solution<a class="heade

 <p>The exact solution is obtained using the Shapley value formula, which requires evaluating all possible coalitions of features with and without the <span class="math notranslate nohighlight">\(i\)</span>-th feature.</p>
 <div class="math notranslate nohighlight">
 \[\begin{align}
-\phi_{i}= \sum_{S \subseteq \{1, \ldots, p\} \{ i \}} \frac{|S|!(p-|S|-1)!}{p!}(val(S \cup \{i\}) - val(S)). \tag{2}
+\phi_{i}= \sum_{S \subseteq \{1, \ldots, p\} \backslash \{ i \}} \frac{|S|!(p-|S|-1)!}{p!}(val(S \cup \{i\}) - val(S)). \tag{2}
 \end{align}\]</div>
 <p>where <span class="math notranslate nohighlight">\(val\)</span> is the value function that returns the prediction of each coalition. The marginal contribution of feature <span class="math notranslate nohighlight">\(i\)</span> to the coalition <span class="math notranslate nohighlight">\(S\)</span> is calculated as the difference between the value of the coalition with the addition of feature <span class="math notranslate nohighlight">\(i\)</span> and the value of the original coalition, i.e., <span class="math notranslate nohighlight">\(val(S \cup \{i\}) - val(S)\)</span>. The term <span class="math notranslate nohighlight">\(\frac{|S|!(p-|S|-1)!}{p!}\)</span> is a normalization factor. When the number of features is small, this exact estimation approach is acceptable. However, as the number of features increases, the exact solution may become problematic.</p>
 <p>It’s worth noting that the value function <span class="math notranslate nohighlight">\(val\)</span> takes the feature coalition <span class="math notranslate nohighlight">\(S\)</span> as input. However, in machine learning models, the prediction is not solely based on the feature coalition but on the entire feature vector. Therefore, we need to specify how removing a feature from the feature vector affects the prediction. Two common approaches are available, both of which depend on a pre-defined background distribution instead of merely replacing the “missing” features with a fixed value.</p>
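The last paragraph of the hunk notes that a model predicts from a full feature vector, so "removing" a feature must be defined via a background distribution. The sketch below illustrates one common choice, a marginal expectation over background rows; `coalition_value`, the toy linear `model`, and the `background` array are all hypothetical names for this example, not the library's API.

```python
# Illustrative value function val(S) using a background distribution:
# features in the coalition S are fixed to the instance's values, features
# outside S are taken from background rows, and predictions are averaged.
# All names here are hypothetical.
import numpy as np

def coalition_value(model, x, S, background):
    """Estimate val(S) for instance x by averaging model outputs with
    features outside S drawn from the background dataset."""
    samples = background.copy()
    for j in S:                     # fix coalition features to x's values
        samples[:, j] = x[j]
    return model(samples).mean()    # expectation over the background rows

# Toy linear "model" and a small background sample.
rng = np.random.default_rng(0)
background = rng.normal(size=(100, 3))
model = lambda X: X @ np.array([1.0, -2.0, 0.5])
x = np.array([1.0, 1.0, 1.0])
print(coalition_value(model, x, {0, 2}, background))
```

With all features in the coalition, this reduces to the plain model prediction for `x`; with an empty coalition, it reduces to the average prediction over the background, which is the usual baseline term in SHAP-style explanations.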
