**Description**: Apply global effect methods on a linear model
[Open in Colab](https://colab.research.google.com/github/givasile/effector/blob/main/synthetic-examples/01_linear_model.ipynb)
[View on GitHub](https://github.com/givasile/effector/blob/main/notebooks/synthetic-examples/01_linear_model.ipynb)
Hopefully, there is a quantity called **heterogeneity** that can be used to check how representative the average effect is.

`Effector` provides five different feature effect methods, which are summarized in the table below. For all methods, setting `heterogeneity=True` shows the level of heterogeneity along with the average effect.
| Method | Description | API in `Effector` | Paper |
|--------|-------------|-------------------|-------|
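To make the shared calling pattern concrete, here is a minimal sketch using `effector.PDP`; the toy dataset and the linear model `f(x) = 7*x_1 - 3*x_2` are assumptions made up for this illustration, not part of the original notebook:

```python
import numpy as np
import effector

# Assumed toy setup: N instances, D=2 features, drawn uniformly in [-1, 1]
np.random.seed(21)
X = np.random.uniform(-1, 1, size=(1000, 2))  # X: (N, D)

def model(x):
    """Hypothetical linear black-box model: f(x) = 7*x_1 - 3*x_2."""
    return 7 * x[:, 0] - 3 * x[:, 1]  # model: (N, D) -> (N)

# All five methods follow the same pattern; PDP is shown here
pdp = effector.PDP(data=X, model=model)
pdp.plot(feature=0, heterogeneity=True)  # average effect plus heterogeneity
```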
The `centering` argument controls how the effect plot is centered:

| `centering` value | Description | Centering constant |
|-------------------|-------------|--------------------|
| `False` or `zero_start` | Don't enforce any additional centering | $c = 0$ |
| `True` or `zero_integral` | Center around the $y$ axis | $c = \mathbb{E}_{x_s \sim \mathcal{U}(x_{s,min}, x_{s,max})}[ALE(x_s)]$ |
Let's see how this works for $x_1$:
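Reusing the toy `X` and `model` from the sketch above (both assumptions), a comparison of the two centering options for $x_1$ could look like this:

```python
# Sketch (assumed setup from above): ALE effect of x_1 under the two centering modes
ale = effector.ALE(data=X, model=model)
ale.plot(feature=0, centering="zero_start")  # c = 0: no additional centering
ale.plot(feature=0, centering=True)          # zero-integral: center around the y axis
```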
In summary, given a dataset `X: (N, D)` and a black-box model `model: (N, D) -> (N)`, the feature effect plot of the $s$-th feature (`feature=s`) is given by the table below.
The argument `confidence_interval=True|False` indicates whether to plot the standard deviation of the instance-level effects as a $\pm$ interval around the feature effect plot. Some methods also require the gradient of the model, `model_jac: (N, D) -> (N, D)`.
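RHALE is one such gradient-based method; a sketch of passing the Jacobian, with `model_jac` written for the toy linear model assumed earlier, might look like:

```python
# Sketch: gradient-based methods also take model_jac: (N, D) -> (N, D)
def model_jac(x):
    """Jacobian of the assumed linear model: constant gradient [7, -3] per row."""
    return np.tile(np.array([7.0, -3.0]), (x.shape[0], 1))

rhale = effector.RHALE(data=X, model=model, model_jac=model_jac)
rhale.plot(feature=0, heterogeneity=True)
```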