
Commit 996aeb6

Tom's July 21 edits of a lecture
1 parent 19b57f8 commit 996aeb6


lectures/time_series_with_matrices.md

Lines changed: 178 additions & 2 deletions
@@ -249,7 +249,7 @@ plt.ylabel('y')
plt.show()
```

-## Adding a random term
+## Adding a Random Term

To generate some excitement, we'll follow in the spirit of the great economists
Eugen Slutsky and Ragnar Frisch and replace our original second-order difference
@@ -349,7 +349,183 @@ plt.ylabel('y')
plt.show()
```

-## A forward looking model
+## Computing Population Moments

We can apply standard formulas for multivariate normal distributions to compute the mean vector and covariance matrix
for our time series model

$$
y = A^{-1} (b + u) .
$$

You can read about multivariate normal distributions in this lecture {doc}`Multivariate Normal Distribution <multivariate_normal>`.

Let's write our model as

$$
y = \tilde A (b + u)
$$

where $\tilde A = A^{-1}$.

Because linear combinations of normal random variables are normal, we know that

$$
y \sim {\mathcal N}(\mu_y, \Sigma_y)
$$

where

$$
\mu_y = \tilde A b
$$

and

$$
\Sigma_y = \tilde A (\sigma_u^2 I_{T \times T}) \tilde A^T .
$$

The covariance formula follows because $b$ is nonrandom and the covariance matrix of $u$ is $\sigma_u^2 I_{T \times T}$.

Let's write a Python class that computes the mean vector $\mu_y$ and covariance matrix $\Sigma_y$.

```{code-cell} ipython3
class population_moments:
    """
    Compute population moments mu_y, Sigma_y.

    Parameters:
    -----------
    alpha0, alpha1, alpha2 : coefficients of the difference equation
    T : length of the time series
    y_1, y0 : initial conditions y_{-1} and y_0
    sigma_u : standard deviation of the shock u_t
    """

    def __init__(self, alpha0, alpha1, alpha2, T, y_1, y0, sigma_u):

        # compute A
        A = np.identity(T)

        for i in range(T):
            if i-1 >= 0:
                A[i, i-1] = -alpha1

            if i-2 >= 0:
                A[i, i-2] = -alpha2

        # compute b
        b = np.full(T, alpha0)
        b[0] = alpha0 + alpha1 * y0 + alpha2 * y_1
        b[1] = alpha0 + alpha2 * y0

        # compute A inverse
        A_inv = np.linalg.inv(A)

        self.A, self.b, self.A_inv, self.sigma_u, self.T = A, b, A_inv, sigma_u, T

    def sample_y(self, n):
        """
        Return a sample of n draws of the time series y.
        """
        A_inv, sigma_u, b, T = self.A_inv, self.sigma_u, self.b, self.T
        us = np.random.normal(0, sigma_u, size=[n, T])
        ys = np.vstack([A_inv @ (b + u) for u in us])

        return ys

    def get_moments(self):
        """
        Compute the population moments of y.
        """
        A_inv, sigma_u, b = self.A_inv, self.sigma_u, self.b

        # compute mu_y = A^{-1} b and Sigma_y = sigma_u^2 A^{-1} (A^{-1})'
        self.mu_y = A_inv @ b
        self.Sigma_y = sigma_u**2 * (A_inv @ A_inv.T)

        return self.mu_y, self.Sigma_y


my_process = population_moments(
    alpha0=10.0, alpha1=1.53, alpha2=-.9, T=80, y_1=28., y0=24., sigma_u=1)

mu_y, Sigma_y = my_process.get_moments()
```
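
As a quick sanity check on the class, we can compare sample moments computed from `sample_y` with the population moments. The sketch below is illustrative rather than part of the main development; the sample size of 10,000 is chosen arbitrarily.

```{code-cell} ipython3
# Illustrative check: with a large sample, the sample mean and sample
# covariance from sample_y should be close to mu_y and Sigma_y.
ys = my_process.sample_y(10_000)

print("max |sample mean - mu_y| =",
      np.max(np.abs(ys.mean(axis=0) - mu_y)))
print("max |sample cov - Sigma_y| =",
      np.max(np.abs(np.cov(ys, rowvar=False) - Sigma_y)))
```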

It is enlightening to study the $\mu_y, \Sigma_y$ pairs implied by various parameter values.

Among other things, we can use the class to exhibit how **statistical stationarity** of $y$ prevails only for very special initial conditions.

Let's begin by generating $N$ realizations of $y$ and plotting them together with the population mean $\mu_y$.

```{code-cell} ipython3
from matplotlib import cm  # colormap used to color the sample paths

# plot mean and sample paths
N = 100

ys = my_process.sample_y(N)

for i in range(N):
    col = cm.viridis(np.random.rand())  # choose a random color from viridis
    plt.plot(ys[i, :], lw=0.5, color=col)

plt.plot(mu_y, color='red')

plt.xlabel('t')
plt.ylabel('y')

plt.show()
```

Visually, notice how the variance across realizations of $y_t$ increases as $t$ increases.

Let's plot the population variance of $y_t$ against $t$.

```{code-cell} ipython3
# plot variance
plt.plot(Sigma_y.diagonal())
plt.show()
```

Notice how the population variance increases and asymptotes.
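
Numerically, the levelling off shows up in the last few diagonal entries of $\Sigma_y$ being nearly equal (a small illustrative check):

```{code-cell} ipython3
# The last few population variances are nearly equal, consistent with
# the variance levelling off at its asymptote.
print(Sigma_y.diagonal()[-5:])
```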

+++

Let's print out the covariance matrix $\Sigma_y$ for a time series $y$.

```{code-cell} ipython3
my_process = population_moments(alpha0=0, alpha1=.8, alpha2=0, T=6, y_1=0., y0=0., sigma_u=1)

mu_y, Sigma_y = my_process.get_moments()
print("mu_y = ", mu_y)
print("Sigma_y = ", Sigma_y)
```

Notice that the covariances between $y_t$ and $y_{t-1}$ -- the elements on the superdiagonal -- are **not** identical.

This is an indication that the time series represented by our $y$ vector is not **stationary**.
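
We can extract the superdiagonal directly with NumPy's `np.diag` (a small illustrative check):

```{code-cell} ipython3
# The first superdiagonal of Sigma_y collects the covariances of adjacent
# entries of y; for a stationary series these would all be equal.
print(np.diag(Sigma_y, k=1))
```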

To make it stationary, we'd have to alter our system so that the **initial conditions** $(y_{-1}, y_0)$ are not fixed numbers but instead a jointly normally distributed random vector with a particular mean and covariance matrix.

We describe how to do that in another lecture, {doc}`Linear State Space Models <linear_models>`.
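
To preview the idea, here is a minimal sketch for the first-order special case $y_t = \alpha_1 y_{t-1} + u_t$ (the $\alpha_0 = \alpha_2 = 0$ process used just above), assuming $|\alpha_1| < 1$: drawing $y_0$ from the stationary distribution ${\mathcal N}(0, \sigma_u^2/(1 - \alpha_1^2))$ rather than fixing it makes the covariances (approximately) constant along each diagonal.

```{code-cell} ipython3
# Illustrative sketch, not the lecture's implementation: for an AR(1)
# y_t = alpha1 * y_{t-1} + u_t, draw y_0 from the stationary distribution
# N(0, sigma_u^2 / (1 - alpha1^2)) instead of fixing it.
alpha1, sigma_u, T, n = 0.8, 1.0, 6, 100_000

sigma_y_st = sigma_u / np.sqrt(1 - alpha1**2)   # stationary standard deviation
ys = np.empty((n, T))
ys[:, 0] = sigma_y_st * np.random.randn(n)      # stationary initial condition
for t in range(1, T):
    ys[:, t] = alpha1 * ys[:, t-1] + sigma_u * np.random.randn(n)

# Each diagonal of the sample covariance matrix is now (approximately) constant.
print(np.cov(ys, rowvar=False).round(2))
```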

But just to set the stage for that analysis, let's increase $T$ to 100 and print out the bottom right corner of $\Sigma_y$.

```{code-cell} ipython3
my_process = population_moments(alpha0=0, alpha1=.8, alpha2=0, T=100, y_1=0., y0=0., sigma_u=1)

mu_y, Sigma_y = my_process.get_moments()
print("bottom right corner of Sigma_y = \n", Sigma_y[95:, 95:])
```

Please notice how the subdiagonal and superdiagonal elements seem to have converged.

This is an indication that our process is asymptotically stationary.
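
In this first-order case we can go further, since the stationary autocovariances have the closed form $\gamma_k = \sigma_u^2 \alpha_1^k / (1 - \alpha_1^2)$; the comparison below is an illustrative check.

```{code-cell} ipython3
# Compare the bottom-right entries of Sigma_y with the stationary
# autocovariances gamma_k = sigma_u^2 * alpha1^k / (1 - alpha1^2).
alpha1, sigma_u = 0.8, 1.0
gamma = sigma_u**2 * alpha1**np.arange(3) / (1 - alpha1**2)

print("gamma_0, gamma_1, gamma_2 =", gamma)
print("Sigma_y[99, 99], Sigma_y[99, 98], Sigma_y[99, 97] =",
      Sigma_y[99, 99], Sigma_y[99, 98], Sigma_y[99, 97])
```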

You can read about stationarity of more general linear time series models in this lecture {doc}`Linear State Space Models <linear_models>`.

There is a lot to be learned about the process by staring at the off-diagonal elements of $\Sigma_y$ corresponding to different time periods $t$, but we resist the temptation to do so here.

+++

+## A Forward Looking Model

Samuelson’s model is **backwards looking** in the sense that we give it **initial conditions** and let it
run.
