
Commit b1629e8

refactor imports and fix typos
1 parent 5be7a1f commit b1629e8

File tree

1 file changed: +13 -12 lines


lectures/kalman.md

Lines changed: 13 additions & 12 deletions
@@ -57,15 +57,15 @@ Required knowledge: Familiarity with matrix manipulations, multivariate normal d
 We'll need the following imports:
 
 ```{code-cell} ipython3
-import matplotlib.pyplot as plt
-from scipy import linalg
 import jax
 import jax.numpy as jnp
+import matplotlib.pyplot as plt
 import matplotlib.cm as cm
-from quantecon import Kalman, LinearStateSpace
+from scipy import linalg
 from scipy.stats import norm
 from scipy.integrate import quad
 from scipy.linalg import eigvals
+from quantecon import Kalman, LinearStateSpace
 ```
 
 ## The Basic Idea
@@ -100,7 +100,7 @@ p = N(\hat{x}, \Sigma)
 ```
 
 where $\hat{x}$ is the mean of the distribution and $\Sigma$ is a
-$2 \times 2$ covariance matrix. In our simulations, we will suppose that
+$2 \times 2$ covariance matrix. In our simulations, we will suppose that
 
 ```{math}
 :label: kalman_dhxs
@@ -201,7 +201,7 @@ plt.show()
 
 We are now presented with some good news and some bad news.
 
-The good news is that the missile has been located by our sensors, which report that the current location is $y = (2.3, -1.9)^top$.
+The good news is that the missile has been located by our sensors, which report that the current location is $y = (2.3, -1.9)^\top$.
 
 The next figure shows the original prior $p(x)$ and the new reported
 location $y$.
@@ -285,17 +285,18 @@ where
 Here $\Sigma G' (G \Sigma G' + R)^{-1}$ is the matrix of population regression coefficients of the hidden object $x - \hat{x}$ on the surprise $y - G \hat{x}$.
 
 We can verify it by computing
+
 $$
 \begin{aligned}
-\mathrm{Cov}(x - \hat{x}, y - G \hat{x})\mathrm{Var}(y - G \hat{x})^{-1}
+&\mathrm{Cov}(x - \hat{x}, y - G \hat{x})\mathrm{Var}(y - G \hat{x})^{-1} \\
 &= \mathrm{Cov}(x - \hat{x}, G x + v - G \hat{x})\mathrm{Var}(G x + v - G \hat{x})^{-1}\\
 &= \Sigma G'(G \Sigma G' + R)^{-1}
 \end{aligned}
 $$
 
 This new density $p(x \,|\, y) = N(\hat{x}^F, \Sigma^F)$ is shown in the next figure via contour lines and the color map.
 
-The original density is left in as contour lines for comparison
+The original density is left in as contour lines for comparison.
 
 ```{code-cell} ipython3
 ---
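As a quick side check, the regression-coefficient identity above can also be verified by simulation. The sketch below is only an illustration: the values chosen for $\hat{x}$, $\Sigma$, $G$ and $R$ are placeholders, not values taken from this commit.

```python
import jax
import jax.numpy as jnp

# Placeholder prior and measurement parameters (illustrative values only)
x_hat = jnp.array([0.2, -0.2])
Σ = jnp.array([[0.4, 0.3],
               [0.3, 0.45]])
G = jnp.eye(2)
R = 0.5 * Σ

# Draw x ~ N(x_hat, Σ) and v ~ N(0, R), then form y = G x + v
key1, key2 = jax.random.split(jax.random.PRNGKey(0))
n = 200_000
x = x_hat + jax.random.normal(key1, (n, 2)) @ jnp.linalg.cholesky(Σ).T
v = jax.random.normal(key2, (n, 2)) @ jnp.linalg.cholesky(R).T
y = x @ G.T + v

# Sample regression coefficients of x - x_hat on the surprise y - G x_hat
u, s = x - x_hat, y - G @ x_hat
coef_mc = (u.T @ s / n) @ jnp.linalg.inv(s.T @ s / n)

# Population counterpart Σ G'(G Σ G' + R)^{-1}
coef = Σ @ G.T @ jnp.linalg.inv(G @ Σ @ G.T + R)

print(jnp.round(coef_mc, 3))   # close to coef, up to Monte Carlo error
print(jnp.round(coef, 3))
```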
@@ -409,7 +410,7 @@ Our updated prediction is the density $N(\hat x_{new}, \Sigma_{new})$ where
 * The density $p_{new}(x) = N(\hat x_{new}, \Sigma_{new})$ is called the **predictive distribution**
 
 The predictive distribution is the new density shown in the following figure, where
-the update has used parameters.
+the update has used the following parameters:
 
 $$
 A
@@ -487,7 +488,7 @@ Swapping notation $p_t(x)$ for $p(x)$ and $p_{t+1}(x)$ for $p_{new}(x)$, the ful
 
 1. Start the current period with prior $p_t(x) = N(\hat x_t, \Sigma_t)$.
 1. Observe current measurement $y_t$.
-1. Compute the filtering distribution $p_t(x \,|\, y) = N(\hat x_t^F, \Sigma_t^F)$ from $p_t(x)$ and $y_t$, applying Bayes rule and the conditional distribution {eq}`kl_measurement_model`.
+1. Compute the filtering distribution $p_t(x \,|\, y) = N(\hat x_t^F, \Sigma_t^F)$ from $p_t(x)$ and $y_t$, applying Bayes' rule and the conditional distribution {eq}`kl_measurement_model`.
 1. Compute the predictive distribution $p_{t+1}(x) = N(\hat x_{t+1}, \Sigma_{t+1})$ from the filtering distribution and {eq}`kl_xdynam`.
 1. Increment $t$ by one and go to step 1.
 
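As an aside, one pass through steps 2-4 of this recursion can be sketched in a few lines; this is a bare-bones illustration, not the lecture's implementation. Here `Q` denotes the covariance of the state shock and `R` the covariance of the measurement noise, names assumed for the sketch.

```python
import jax.numpy as jnp

def kalman_step(x_hat, Σ, y, A, G, Q, R):
    # Filtering: Bayes' rule applied to the prior N(x_hat, Σ) and observation y
    K = Σ @ G.T @ jnp.linalg.inv(G @ Σ @ G.T + R)   # regression coefficients
    x_hat_F = x_hat + K @ (y - G @ x_hat)
    Σ_F = Σ - K @ G @ Σ

    # Prediction: push the filtered density through x_{t+1} = A x_t + w_{t+1}
    x_hat_new = A @ x_hat_F
    Σ_new = A @ Σ_F @ A.T + Q
    return x_hat_new, Σ_new
```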
@@ -637,7 +638,7 @@ kalman = Kalman(ss, x_hat_0, Σ_0)
 
 # Draw observations of y from state space model
 N = 5
-seed = 1234 # Set random seed
+seed = 1234 # Set random seed
 x, y = ss.simulate(N, seed)
 y = y.flatten()
 
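For readers unfamiliar with the objects in this hunk: `LinearStateSpace` and `Kalman` come from the `quantecon` package. The following is a self-contained usage sketch with placeholder parameter values; the actual `A`, `C`, `G`, `H`, prior and seed used by the lecture are not shown in this diff.

```python
import numpy as np
from quantecon import Kalman, LinearStateSpace

# Placeholder scalar model: x_{t+1} = A x_t + C w_{t+1},  y_t = G x_t + H v_t
A, C, G, H = 0.9, 0.1, 1.0, 0.3
ss = LinearStateSpace(A, C, G, H)

# Prior N(x_hat_0, Σ_0) over the initial hidden state
x_hat_0, Σ_0 = np.array([[0.5]]), np.array([[1.0]])
kalman = Kalman(ss, x_hat_0, Σ_0)

# Simulate a short sample path and run the filter through it
N = 5
x, y = ss.simulate(N, random_state=1234)
for t in range(N):
    kalman.update(y[:, t])   # filtering step followed by one-step-ahead forecast
    print(kalman.x_hat.item(), kalman.Sigma.item())
```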
@@ -707,7 +708,7 @@ x, y = ss.simulate(T, seed)
 y = y.flatten()
 
 for t in range(T):
-    # Record the current predicted mean and variance and plot their densities
+    # Record the current predicted mean and variance
     m, v = kalman.x_hat.item(), kalman.Sigma.item()
 
     # Wrap parameters
@@ -836,7 +837,7 @@ e1 = jnp.empty(T-1)
 e2 = jnp.empty(T-1)
 
 for t in range(1, T):
-    kn.update(y[:,t])
+    kn.update(y[:, t])
     diff1 = x[:, t] - kn.x_hat.flatten()
     diff2 = x[:, t] - A @ x[:, t-1]
     e1 = e1.at[t-1].set(diff1 @ diff1)
