The graphs above portray posterior distributions.

## Calculating Sample Path Statistics

Our next step is to prepare Python code to compute our sample path statistics.

These statistics were originally defined as random variables with respect to $\omega$; here we write them as functions of the realized path $\{Y_t\}$, leaving $\omega$ implicit.

The two definitions are equivalent because $\omega$ determines the path statistics only through $\{Y_t\}$.

```{code-cell} ipython3
# Define statistics as functions of y, the realized path values.
def next_recession(y):
    # A recession pattern: y rises (or stays flat) once, then falls for
    # two consecutive periods
    n = y.shape[0] - 3
    z = jnp.zeros(n, dtype=int)

    for i in range(n):
        z = z.at[i].set(int(y[i] <= y[i+1] and y[i+1] > y[i+2] and y[i+2] > y[i+3]))
    # Date of the first recession pattern; 500 if none occurs in the path
    if jnp.any(z == 1):
        return jnp.where(z == 1)[0][0] + 1
    else:
        return 500


def next_turning_point(y):
    """
    Suppose that y is of length 6,

    y_{t+1}, y_{t+2}, ..., y_{t+6},

    that is sufficient for determining the value of P/N
    """
    n = jnp.asarray(y).shape[0] - 4
    T = jnp.zeros(n, dtype=int)

    for i in range(n):
        # two consecutive declines followed by two consecutive rises: an upturn
        if ((y[i] > y[i+1]) and (y[i+1] > y[i+2]) and
            (y[i+2] < y[i+3]) and (y[i+3] < y[i+4])):
            T = T.at[i].set(1)
        # two consecutive rises followed by two consecutive declines: a downturn
        elif ((y[i] < y[i+1]) and (y[i+1] < y[i+2]) and
              (y[i+2] > y[i+3]) and (y[i+3] > y[i+4])):
            T = T.at[i].set(-1)

    # dates of the first upturn and downturn; default to T1 if none occurs
    up_turn = jnp.where(T == 1)[0][0] + 1 if jnp.any(T == 1) else T1
    down_turn = jnp.where(T == -1)[0][0] + 1 if jnp.any(T == -1) else T1

    return up_turn, down_turn
```
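As a quick, illustrative check (not part of the original lecture), the cell below applies the two statistics to a single artificial path. The values `rho_demo` and `sigma_demo`, the path length, and the seed are placeholders, and `T1` is assumed to have been defined earlier in the lecture.

```{code-cell} ipython3
# Illustrative only: evaluate the statistics on one artificial path
rng = np.random.default_rng(0)
rho_demo, sigma_demo = 0.9, 0.05    # placeholder parameter values

y_demo = np.empty(20)
y_demo[0] = 0.0
for t in range(19):
    y_demo[t+1] = rho_demo * y_demo[t] + sigma_demo * rng.standard_normal()

print("first recession date:     ", next_recession(y_demo))
print("first upturn and downturn:", next_turning_point(y_demo))
```

Each statistic reports the number of periods ahead at which the corresponding pattern first appears in the path (or a default value when the pattern never occurs).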
## Original Wecker Method
Now we apply Wecker's original method: we simulate future paths and compute predictive distributions, conditioning on the true parameters associated with the data-generating model.
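As a schematic sketch of that idea (this is not the lecture's `plot_Wecker` routine; the AR(1) law of motion, the names `predictive_draws`, `rho_true`, and `sigma_true`, and the use of `T1` as the forecast horizon are assumptions for illustration), the predictive distribution of any path statistic can be approximated by simulating many continuation paths from the last observed value and evaluating the statistic on each:

```{code-cell} ipython3
# Sketch only: empirical predictive distribution of a path statistic,
# conditioning on assumed "true" parameters rho_true and sigma_true
def predictive_draws(initial_path, N, statistic, rho_true, sigma_true, seed=0):
    rng = np.random.default_rng(seed)
    draws = np.empty(N)

    for j in range(N):
        # simulate one continuation path of length T1 starting from the
        # last observed value
        y = np.empty(T1)
        y_prev = initial_path[-1]
        for t in range(T1):
            y_prev = rho_true * y_prev + sigma_true * rng.standard_normal()
            y[t] = y_prev
        draws[j] = statistic(y)

    # a histogram of draws approximates the predictive distribution
    return draws
```

For example, `predictive_draws(initial_path, 1000, next_recession, 0.9, 0.05)` would produce 1,000 simulated first-recession dates, and their histogram approximates the kind of predictive distribution that `plot_Wecker` plots below.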
```{code-cell} ipython3
def plot_Wecker(initial_path, N, ax):
    ax[0, 0].plot(np.arange(T1), center, color='red', alpha=.7)