  name: python3
---

(time_series_generative_graph)=
# Time Series Models Derived From a Generative Graph

:::{post} March, 2024
:::

+++

In this notebook, we show how to model and fit a time series model starting from a generative graph. In particular, we explain how to use {func}`~pytensor.scan` to loop efficiently inside a PyMC model.
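Readers unfamiliar with `scan` can think of it as a loop whose carried state becomes part of the computation graph. A rough pure-Python analogue of a two-tap scan (a sketch of the looping pattern only, not PyTensor's actual API) looks like this:

```{code-cell} ipython3
def scan_like(step, initial, n_steps):
    """Minimal Python analogue of a two-tap scan: `step` receives the two
    previous outputs (x_{t-2}, x_{t-1}) and returns the next one."""
    outputs = list(initial)  # seed with the initial taps
    for _ in range(n_steps):
        outputs.append(step(outputs[-2], outputs[-1]))
    return outputs[len(initial) :]  # keep only the newly generated values


# Deterministic AR(2)-style recursion: x_t = 0.5 * x_{t-1} + 0.3 * x_{t-2}
demo_vals = scan_like(lambda x_tm2, x_tm1: 0.5 * x_tm1 + 0.3 * x_tm2, [1.0, 1.0], n_steps=3)
demo_vals
```

`pytensor.scan` expresses the same pattern symbolically, with `outputs_info` declaring the initial taps and `non_sequences` passing fixed parameters.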
For this example, we consider an autoregressive model AR(2). Recall that an AR(2) model is defined as:

$$
x_t = \rho_1 x_{t-1} + \rho_2 x_{t-2} + \epsilon_t, \quad \epsilon_t \sim \mathcal{N}(0, \sigma^2)
$$
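To build intuition before writing the PyTensor version, the AR(2) recursion can be simulated directly with NumPy. This is a standalone sketch; the parameter values here are arbitrary illustrations, not the ones used in the model below:

```{code-cell} ipython3
import numpy as np

rng_np = np.random.default_rng(42)

rho_demo = np.array([0.7, -0.3])  # rho_demo[0] multiplies x_{t-1}, rho_demo[1] multiplies x_{t-2}
sigma_demo = 0.5
n_demo = 100

x_demo = np.zeros(n_demo)
for t in range(2, n_demo):
    x_demo[t] = rho_demo[0] * x_demo[t - 1] + rho_demo[1] * x_demo[t - 2] + rng_np.normal(scale=sigma_demo)
```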
We start by encoding the generative graph of the AR(2) model as a function `ar_dist`. The strategy is to pass this function as a custom distribution via {class}`~pm.CustomDist` inside a PyMC model.

We need to specify the initial state (`ar_init`), the autoregressive coefficients (`rho`), and the standard deviation of the noise (`sigma`). Given such parameters, we can define the generative graph of the AR(2) model using the {func}`~pytensor.scan` operation.
```{code-cell} ipython3
lags = 2  # Number of lags
trials = 100  # Time series length


def ar_dist(ar_init, rho, sigma, size):
    def ar_step(x_tm2, x_tm1, rho, sigma):
        mu = x_tm1 * rho[0] + x_tm2 * rho[1]
        x = mu + pm.Normal.dist(sigma=sigma)
        return x, collect_default_updates([x])

    ar_innov, _ = pytensor.scan(
        fn=ar_step,
        outputs_info=[{"initial": ar_init, "taps": range(-lags, 0)}],
        non_sequences=[rho, sigma],
        n_steps=trials - lags,
        strict=True,
    )

    return ar_innov
```
We now run the MCMC algorithm to sample from the posterior distribution.
```{code-cell} ipython3
with model:
    trace = pm.sample(random_seed=rng)
```
Let's plot the trace and the posterior distribution of the parameters.
```{code-cell} ipython3
axes = az.plot_trace(
    data=trace,
)
```
```{code-cell} ipython3
axes = az.plot_posterior(
    data=trace,
)
plt.gcf().suptitle("AR(2) Model Parameters Posterior", fontsize=18, fontweight="bold")
```
We see that we have successfully recovered the true parameters of the model.

+++
## Posterior Predictive
Finally, we can use the posterior samples to generate new data from the AR(2) model. We can then compare the generated data with the observed data to check the goodness of fit of the model.
Overall, the model captures the global dynamics of the time series. To gain better insight into the model, we can plot a subset of the posterior samples and compare them with the observed data.
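As a sketch of this kind of check (using hypothetical posterior draws of `rho` and `sigma` rather than the actual trace, so the numbers below are illustrative only), one can simulate one series per draw and summarize them pointwise against the observed data:

```{code-cell} ipython3
import numpy as np

rng_pp = np.random.default_rng(0)

# Hypothetical posterior draws; in practice these would be taken from `trace`.
rho_draws = rng_pp.normal(loc=[0.7, -0.3], scale=0.05, size=(200, 2))
sigma_draws = np.abs(rng_pp.normal(loc=0.5, scale=0.05, size=200))


def simulate_ar2(rho, sigma, n, rng):
    """Simulate one AR(2) path of length n, starting from zeros."""
    x = np.zeros(n)
    for t in range(2, n):
        x[t] = rho[0] * x[t - 1] + rho[1] * x[t - 2] + rng.normal(scale=sigma)
    return x


sims = np.stack([simulate_ar2(r, s, 100, rng_pp) for r, s in zip(rho_draws, sigma_draws)])

# Pointwise posterior-predictive mean and 94% interval to overlay on the data
pp_mean = sims.mean(axis=0)
pp_low, pp_high = np.percentile(sims, [3, 97], axis=0)
```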