Commit c0621b6

fix indentation in bvar calc_ar (pymc-devs#525)

* [fix BVAR pymc-devs#523] fix indentation in bvar calc_ar
* [fix BVAR pymc-devs#523] remove az.extract_dataset calls

1 parent 214f1ba commit c0621b6

File tree

2 files changed: +6215 −6675 lines

examples/time_series/bayesian_var_model.ipynb

Lines changed: 6207 additions & 6668 deletions
Large diffs are not rendered by default.

examples/time_series/bayesian_var_model.myst.md

Lines changed: 8 additions & 7 deletions
@@ -5,9 +5,9 @@ jupytext:
     format_name: myst
     format_version: 0.13
 kernelspec:
-  display_name: Python 3.9.0 ('pymc_ar_ex')
+  display_name: myjlabenv
   language: python
-  name: python3
+  name: myjlabenv
 ---
 
 (Bayesian Vector Autoregressive Models)=
@@ -145,7 +145,7 @@ def calc_ar_step(lag_coefs, n_eqs, n_lags, df):
             ],
             axis=0,
         )
-    ars.append(ar)
+        ars.append(ar)
     beta = pm.math.stack(ars, axis=-1)
 
     return beta
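Why the indentation matters: if `ars.append(ar)` sits outside the per-equation loop rather than inside it, only the final equation's AR term survives into the stacked `beta`. A minimal stand-alone sketch of the two behaviours (illustrative helpers and values, not the notebook's actual `calc_ar_step`):

```python
# Illustrative helpers (not the notebook's calc_ar_step): the indentation
# of an append decides whether it runs once per iteration or once in total.
def collect_per_equation(n_eqs):
    ars = []
    for j in range(n_eqs):
        ar = j * 10  # stand-in for the AR term of equation j
        ars.append(ar)  # inside the loop: one term per equation
    return ars


def collect_last_only(n_eqs):
    ars = []
    for j in range(n_eqs):
        ar = j * 10
    ars.append(ar)  # outside the loop: only the last equation's term is kept
    return ars


print(collect_per_equation(3))  # [0, 10, 20]
print(collect_last_only(3))  # [20]
```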
@@ -275,7 +275,7 @@ az.summary(idata_fake_data, var_names=["alpha", "lag_coefs", "noise_chol_corr"])
 ```
 
 ```{code-cell} ipython3
-az.plot_posterior(idata_fake_data, var_names=["alpha"], ref_val=[18, 8]);
+az.plot_posterior(idata_fake_data, var_names=["alpha"], ref_val=[8, 18]);
 ```
 
 Next we'll plot the posterior predictive distribution to check that the fitted model can capture the patterns in the observed data. This is the primary test of goodness of fit.
@@ -311,7 +311,7 @@ def plot_ppc(idata, df, group="posterior_predictive"):
     fig, axs = plt.subplots(2, 1, figsize=(25, 15))
     df = pd.DataFrame(idata_fake_data["observed_data"]["obs"].data, columns=["x", "y"])
     axs = axs.flatten()
-    ppc = az.extract_dataset(idata, group=group, num_samples=100)["obs"]
+    ppc = az.extract(idata, group=group, num_samples=100)["obs"]
     # Minus the lagged terms and the constant
     shade_background(ppc, axs, 0, "inferno")
     axs[0].plot(np.arange(ppc.shape[0]), ppc[:, 0, :].mean(axis=1), color="cyan", label="Mean")
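The `az.extract_dataset` calls are replaced because ArviZ deprecated that function in favour of `az.extract`, which has the same role: stack the `(chain, draw)` dimensions into a single `sample` dimension and optionally subsample it. A minimal migration sketch, assuming ArviZ ≥ 0.14; the toy `InferenceData` below is an illustration, not the notebook's model:

```python
import numpy as np
import arviz as az

# Toy InferenceData: 4 chains x 250 draws of a length-10 variable "obs"
# (purely illustrative stand-in for the notebook's posterior predictive).
rng = np.random.default_rng(0)
idata = az.from_dict(posterior={"obs": rng.normal(size=(4, 250, 10))})

# Old call (deprecated, later removed):
#   ppc = az.extract_dataset(idata, group="posterior", num_samples=100)["obs"]
# New call: chain and draw are stacked into one "sample" dimension,
# from which 100 samples are drawn at random.
ppc = az.extract(idata, group="posterior", var_names="obs", num_samples=100)

print(ppc.sizes)  # the variable's own dims plus a "sample" dim of size 100
```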
@@ -409,7 +409,7 @@ def plot_ppc_macro(idata, df, group="posterior_predictive"):
     df = pd.DataFrame(idata["observed_data"]["obs"].data, columns=["dl_gdp", "dl_cons"])
     fig, axs = plt.subplots(2, 1, figsize=(20, 10))
     axs = axs.flatten()
-    ppc = az.extract_dataset(idata, group=group, num_samples=100)["obs"]
+    ppc = az.extract(idata, group=group, num_samples=100)["obs"]
 
     shade_background(ppc, axs, 0, "inferno")
     axs[0].plot(np.arange(ppc.shape[0]), ppc[:, 0, :].mean(axis=1), color="cyan", label="Mean")
@@ -704,7 +704,7 @@ for ax, country in zip(axs, countries):
         idata_full_test["observed_data"][f"obs_{country}"].data,
         columns=["dl_gdp", "dl_cons", "dl_gfcf"],
     )
-    ppc = az.extract_dataset(idata_full_test, group="posterior_predictive", num_samples=100)[
+    ppc = az.extract(idata_full_test, group="posterior_predictive", num_samples=100)[
         f"obs_{country}"
     ]
     if country == "Ireland":
@@ -754,6 +754,7 @@ In the next post in this series we will spend some time digging into the implied
 
 ## Authors
 * Adapted from the PYMC labs [Blog post](https://www.pymc-labs.io/blog-posts/bayesian-vector-autoregression/) and Jim Savage's discussion [here](https://rpubs.com/jimsavage/hierarchical_var) by [Nathaniel Forde](https://nathanielf.github.io/) in November 2022 ([pymc-examples#456](https://github.com/pymc-devs/pymc-examples/pull/456))
+* Reexecuted by Nathaniel Forde on Feb, 2023 ([pymc_examples#523](https://github.com/pymc-devs/pymc-examples/issues/523))
 
 +++
 