In another [example](dp_mix.ipynb), we showed how to use Dirichlet processes to perform Bayesian nonparametric density estimation.
Just as Dirichlet process mixtures can be thought of as infinite mixture models that select the number of active components as part of inference, dependent density regression can be thought of as infinite [mixtures of experts](https://en.wikipedia.org/wiki/Committee_machine) that select the active experts as part of inference. Their flexibility and modularity make them powerful tools for performing nonparametric Bayesian data analysis.
```{code-cell} ipython3
---
colab:
  base_uri: https://localhost:8080/
id: wSEx-eTag8LD
outputId: a962b5ff-d107-47f8-b413-5dc0480648bf
---
from io import StringIO

import arviz as az
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
import pymc as pm
import pytensor.tensor as pt
import seaborn as sns

print(f"Running on PyMC v{pm.__version__}")
```
```{code-cell} ipython3
:id: 0iVlIVjig8LE
%config InlineBackend.figure_format = 'retina'
plt.rc("animation", writer="ffmpeg")
blue, *_ = sns.color_palette()
az.style.use("arviz-darkgrid")
SEED = 1972917 # from random.org; for reproducibility
np.random.seed(SEED)
```
+++ {"id": "3VHUk32Mg8LE"}
We will use the LIDAR data set from Larry Wasserman's excellent book, [_All of Nonparametric Statistics_](http://www.stat.cmu.edu/~larry/all-of-nonpar/). We standardize the data set to improve the rate of convergence of our samples.
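The data-loading cell is elided from this excerpt. A minimal sketch, assuming the `lidar.dat` file from the book's website and columns named `range` and `logratio` (the URI and column names are assumptions):

```python
# Assumed location of the LIDAR data on the book's website.
DATA_URI = "http://www.stat.cmu.edu/~larry/all-of-nonpar/=data/lidar.dat"


def standardize(x):
    # Center and scale to zero mean and unit variance.
    return (x - x.mean()) / x.std()


df = pd.read_csv(DATA_URI, sep=r"\s+")
std_range = standardize(df["range"]).values
std_logratio = standardize(df["logratio"]).values
N = std_range.size  # number of observations
```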
This data set has two interesting properties that make it useful for illustrating dependent density regression.

1. The relationship between range and log ratio is nonlinear, but has locally linear components.
2. The observations are heteroskedastic: the variance of the log ratio changes with the range.
The intuitive idea behind dependent density regression is to reduce the problem to many (related) density estimates, conditioned on fixed values of the predictors. The following animation illustrates this intuition.
As we slide a window along the x-axis in the left plot, the empirical distribution of the y-values of the points inside the window varies in the right plot. An important aspect of this approach is that density estimates corresponding to nearby values of the predictor are similar.
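A toy sketch of this windowing idea (illustrative only; the window width is an arbitrary choice):

```python
# Collect the y-values of points whose x falls inside a window around `center`.
def window_ys(x_vals, y_vals, center, width=0.2):
    in_window = np.abs(x_vals - center) < width / 2
    return y_vals[in_window]


# The empirical conditional density near x = 0 can then be eyeballed with
# plt.hist(window_ys(std_range, std_logratio, 0.0)).
```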
In the previous example, we saw that a Dirichlet process estimates a probability density as a mixture model with infinitely many components. In the case of normal component distributions,

$$f(y\ |\ \theta) = \sum_{i = 1}^{\infty} w_i \cdot N(y\ |\ \mu_i, \tau_i^{-1}).$$

To make the density estimate depend on the predictor, we allow the mixture weights to vary with $x$ through a probit stick-breaking process: we let $v_i\ |\ x = \Phi(\alpha_i + \beta_i x)$, where $\Phi$ is the cumulative distribution function of the standard normal distribution, and define

$$w_i\ |\ x = v_i\ |\ x \cdot \prod_{j = 1}^{i - 1} (1 - v_j\ |\ x).$$

For the LIDAR data set, we use independent normal priors $\alpha_i \sim N(0, 5^2)$ and $\beta_i \sim N(0, 5^2)$. We now express this model for the conditional mixture weights using `PyMC`.
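The model below also relies on a truncation level `K` and two helper functions, `norm_cdf` and `stick_breaking`, none of which appear in this excerpt. A minimal sketch consistent with the formulas above (the notebook's own implementations may differ):

```python
K = 20  # truncation level; see the discussion of truncation below


def norm_cdf(z):
    # Phi(z), the standard normal CDF, written via the error function.
    return 0.5 * (1.0 + pt.erf(z / np.sqrt(2.0)))


def stick_breaking(v):
    # w_i = v_i * prod_{j < i} (1 - v_j), computed row-wise for an N x K array.
    return v * pt.concatenate(
        [pt.ones_like(v[:, :1]), pt.cumprod(1.0 - v, axis=1)[:, :-1]],
        axis=1,
    )
```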
```{code-cell} ipython3
with pm.Model(coords={"N": np.arange(N), "K": np.arange(K) + 1}) as model:
    alpha = pm.Normal("alpha", 0, 5, dims="K")
    beta = pm.Normal("beta", 0, 5, dims="K")
    x = pm.Data("x", std_range, dims="N")
    v = norm_cdf(alpha + pt.outer(x, beta))
    w = pm.Deterministic("w", stick_breaking(v), dims=["N", "K"])
```
+++ {"id": "TKt9RzIVg8LF"}
We have defined `x` as a `pm.Data` container in order to use `PyMC`'s posterior prediction capabilities later.
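For instance, predictions at a grid of new ranges would only require swapping in new data (a sketch; the grid size is arbitrary, and since the length of `x` changes, the `N` coordinate must be updated along with it):

```python
x_grid = np.linspace(std_range.min(), std_range.max(), 100)

with model:
    # Replace the predictor values and resize the "N" coordinate to match.
    pm.set_data({"x": x_grid}, coords={"N": np.arange(x_grid.size)})
```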
While the dependent density regression model theoretically has infinitely many components, we must truncate the model to finitely many components (in this case, twenty) in order to express it using `PyMC`. After sampling from the model, we will verify that truncation did not unduly influence our results.
Since the LIDAR data seems to have several linear components, we use the linear models

$$\mu_i\ |\ x = \gamma_i + \delta_i x$$

for the conditional component means.
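The cell that completes the model is elided here; a minimal sketch, with illustrative priors on the component means and precisions (the notebook's actual priors may differ):

```python
with model:
    gamma = pm.Normal("gamma", 0, 10, dims="K")  # component intercepts
    delta = pm.Normal("delta", 0, 10, dims="K")  # component slopes
    mu = pm.Deterministic("mu", gamma + pt.outer(x, delta), dims=["N", "K"])
    tau = pm.Gamma("tau", 1, 1, dims="K")  # component precisions
    obs = pm.NormalMixture("obs", w, mu, tau=tau, observed=std_logratio, dims="N")
```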
+++ {"id": "gUPThEEEg8LF"}
We now sample from the dependent density regression model using a Metropolis sampler. The default NUTS sampler has a difficult time sampling from this model, and the traceplots show poor convergence.
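A sketch of the corresponding sampling call (the tuning and draw counts are illustrative):

```python
with model:
    trace = pm.sample(
        draws=1000,
        tune=1000,
        step=pm.Metropolis(),  # NUTS struggles with this posterior, per above
        random_seed=SEED,
    )
```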
To verify that truncation did not unduly influence our results, we plot the largest posterior expected mixture weight for each component. (In this model, each point has a mixture weight for each component, so we plot the maximum mixture weight for each component across all data points in order to judge if the component exerts any influence on the posterior.)
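These maxima can be computed directly from the trace, for example (a sketch that relies on the `N` and `K` dims declared in the model):

```python
# Posterior expected weight of each component at each data point, then the
# maximum over data points for each component.
w_post = trace.posterior["w"].mean(dim=("chain", "draw"))  # dims: N, K
max_w = w_post.max(dim="N")
```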
Since only three mixture components have appreciable posterior expected weight for any data point, we can be fairly certain that truncation did not unduly influence our results. (If most components had appreciable posterior expected weight, truncation may have influenced the results, and we would have increased the number of components and sampled again.)
Visually, it is reasonable that the LIDAR data has three linear components, so these posterior expected weights seem to have identified the structure of the data well. We now sample from the posterior predictive distribution to better understand the model's performance.
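A sketch of the posterior predictive step:

```python
with model:
    pp = pm.sample_posterior_predictive(trace, random_seed=SEED)

# pp.posterior_predictive["obs"] holds replicated observations whose quantiles
# can be compared with the data.
```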
The model has fit the linear components of the data well, and also accommodated its heteroskedasticity. This flexibility, along with the ability to modularly specify the conditional mixture weights and conditional component densities, makes dependent density regression an extremely useful nonparametric Bayesian model.
To learn more about dependent density regression and related models, consult [_Bayesian Data Analysis_](http://www.stat.columbia.edu/~gelman/book/), [_Bayesian Nonparametric Data Analysis_](http://www.springer.com/us/book/9783319189673), or [_Bayesian Nonparametrics_](https://www.google.com/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=bayesian+nonparametrics+book).
This example first appeared [here](http://austinrochford.com/posts/2017-01-18-ddp-pymc3.html).
+++ {"id": "CxDFNZDtg8LF"}
## Authors
* authored by Austin Rochford in 2017
* updated to PyMC v5 by Christopher Fonnesbeck in September 2024
+++ {"id": "e41HT-6Og8LF"}
## References