
Commit 3f8d852

minor spacing updates
1 parent b178f53

1 file changed: +5 −15 lines


lectures/ar1_bayes.md

@@ -37,15 +37,12 @@ import matplotlib.pyplot as plt
 
 This lecture uses Bayesian methods offered by [`numpyro`](https://num.pyro.ai/en/stable/) to make statistical inferences about two parameters of a univariate first-order autoregression.
 
-
 The model is a good laboratory for illustrating the
 consequences of alternative ways of modeling the distribution of the initial $y_0$:
 
 - As a fixed number
-
 - As a random variable drawn from the stationary distribution of the $\{y_t\}$ stochastic process
 
-
 The first component of the statistical model is
 
 $$
@@ -62,8 +59,6 @@ $$
 y_0 \sim {\mathcal{N}}(\mu_0, \sigma_0^2)
 $$ (eq:themodel_2)
 
-
-
 Consider a sample $\{y_t\}_{t=0}^T$ governed by this statistical model.
 
 The model implies that the likelihood function of $\{y_t\}_{t=0}^T$ can be *factored*:
@@ -88,7 +83,6 @@ We want to study how inferences about the unknown parameters $(\rho, \sigma_x)$
 Below, we study two widely used alternative assumptions:
 
 - $(\mu_0,\sigma_0) = (y_0, 0)$ which means that $y_0$ is drawn from the distribution ${\mathcal N}(y_0, 0)$; in effect, we are *conditioning on an observed initial value*.
-
 - $\mu_0,\sigma_0$ are functions of $\rho, \sigma_x$ because $y_0$ is drawn from the stationary distribution implied by $\rho, \sigma_x$.
 
 
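The factored likelihood and the two initial-condition assumptions can be sketched in plain numpy/scipy (an editor's illustration, not the lecture's numpyro code; the function name and interface are hypothetical):

```python
import numpy as np
from scipy.stats import norm

def ar1_log_likelihood(y, rho, sigma_x, condition_on_y0=True):
    """Factored log likelihood of an AR(1) sample y_0, ..., y_T.

    Transition terms use f(y_{t+1} | y_t) = N(rho * y_t, sigma_x^2).
    With condition_on_y0=True we set f(y_0) = 1 (conditioning on the
    observed initial value); otherwise y_0 is treated as a draw from
    the stationary distribution N(0, sigma_x^2 / (1 - rho^2)).
    """
    ll = norm.logpdf(y[1:], loc=rho * y[:-1], scale=sigma_x).sum()
    if not condition_on_y0:
        sigma_0 = sigma_x / np.sqrt(1 - rho**2)
        ll += norm.logpdf(y[0], loc=0.0, scale=sigma_0)
    return ll
```

When $y_0$ sits far in the tail of the stationary distribution, the extra $f(y_0)$ term heavily penalizes parameter values for which that tail draw is implausible, which is the mechanism the lecture explores.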
@@ -125,17 +119,13 @@ Basically, when $y_0$ happens to be in the tail of the stationary distribution a
 
 An example below shows how not conditioning on $y_0$ adversely shifts the posterior probability distribution of $\rho$ toward larger values.
 
-
 We begin by solving a *direct problem* that simulates an AR(1) process.
 
 How we select the initial value $y_0$ matters:
 
-* If we think $y_0$ is drawn from the stationary distribution ${\mathcal N}(0, \frac{\sigma_x^{2}}{1-\rho^2})$, then it is a good idea to use this distribution as $f(y_0)$.
-
-  - Why? Because $y_0$ contains information about $\rho, \sigma_x$.
-
-* If we suspect that $y_0$ is far in the tail of the stationary distribution -- so that variation in early observations in the sample has a significant *transient component* -- it is better to condition on $y_0$ by setting $f(y_0) = 1$.
-
+* If we think $y_0$ is drawn from the stationary distribution ${\mathcal N}(0, \frac{\sigma_x^{2}}{1-\rho^2})$, then it is a good idea to use this distribution as $f(y_0)$.
+  - Why? Because $y_0$ contains information about $\rho, \sigma_x$.
+* If we suspect that $y_0$ is far in the tail of the stationary distribution -- so that variation in early observations in the sample has a significant *transient component* -- it is better to condition on $y_0$ by setting $f(y_0) = 1$.
 
 To illustrate the issue, we'll begin by choosing an initial $y_0$ that is far out in the tail of the stationary distribution.
 
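The direct problem this passage describes can be sketched as follows (an illustrative simulator, not the lecture's code; the function name and signature are the editor's):

```python
import numpy as np

def simulate_ar1(rho, sigma_x, T, y0=None, seed=0):
    """Simulate y_{t+1} = rho * y_t + sigma_x * eps_{t+1}, eps ~ N(0, 1).

    If y0 is None, draw y_0 from the stationary distribution
    N(0, sigma_x^2 / (1 - rho^2)); otherwise start from the given y0,
    e.g. a value far out in the tail of the stationary distribution.
    """
    rng = np.random.default_rng(seed)
    y = np.empty(T + 1)
    if y0 is None:
        y[0] = rng.normal(0.0, sigma_x / np.sqrt(1 - rho**2))
    else:
        y[0] = y0
    for t in range(T):
        y[t + 1] = rho * y[t] + sigma_x * rng.normal()
    return y
```

Passing, say, `y0=10.0` with `rho=0.5, sigma_x=1.0` (stationary standard deviation about 1.15) produces exactly the tail-initialized sample the lecture uses to expose the issue.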
@@ -238,7 +228,7 @@ plot_posterior(mcmc.get_samples())
 
 Evidently, the posteriors aren't centered on the true values of $.5, 1$ that we used to generate the data.
 
-This is a symptom of the classic **Hurwicz bias** for first order autoregressive processes (see {cite}`hurwicz1950least`.)
+This is a symptom of the classic **Hurwicz bias** for first-order autoregressive processes (see {cite}`hurwicz1950least`).
 
 The Hurwicz bias is worse the smaller is the sample (see {cite}`Orcutt_Winokur_69`).
 
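The small-sample Hurwicz bias can be demonstrated with a quick Monte Carlo (an editor's sketch in plain numpy, not the lecture's numpyro code): the OLS / conditional-ML estimate of $\rho$ is biased toward zero, and the bias grows as the sample shrinks.

```python
import numpy as np

def ols_rho(y):
    # OLS / conditional-ML estimate: rho_hat = sum(y_t y_{t+1}) / sum(y_t^2)
    return (y[:-1] @ y[1:]) / (y[:-1] @ y[:-1])

rng = np.random.default_rng(1)
rho, sigma_x, T, n_reps = 0.5, 1.0, 20, 2000

estimates = []
for _ in range(n_reps):
    y = np.empty(T + 1)
    # draw y_0 from the stationary distribution N(0, sigma_x^2/(1-rho^2))
    y[0] = rng.normal(0.0, sigma_x / np.sqrt(1 - rho**2))
    for t in range(T):
        y[t + 1] = rho * y[t] + sigma_x * rng.normal()
    estimates.append(ols_rho(y))

# With T = 20 the average estimate falls noticeably below the true rho = 0.5
mean_rho_hat = np.mean(estimates)
```

Rerunning with a larger `T` pulls `mean_rho_hat` back toward 0.5, consistent with the bias shrinking in larger samples.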
@@ -315,4 +305,4 @@ is telling `numpyro` to explain what it interprets as "explosive" observations e
 
 Bayes' Law is able to generate a plausible likelihood for the first observation by driving $\rho \rightarrow 1$ and $\sigma \uparrow$ in order to raise the variance of the stationary distribution.
 
-Our example illustrates the importance of what you assume about the distribution of initial conditions.
+Our example illustrates the importance of what you assume about the distribution of initial conditions.
