36 changes: 25 additions & 11 deletions lectures/prob_matrix.md
@@ -35,7 +35,7 @@ Among concepts that we'll be studying include

We'll use a matrix to represent a bivariate or multivariate probability distribution and a vector to represent a univariate probability distribution

-This {doc}`companion lecture <stats_examples>` describes some popular probability distributions and uses Python to sample from them.
+This {doc}`companion lecture <stats_examples>` describes some popular probability distributions and shows how to use Python to sample from them.


In addition to what's in Anaconda, this lecture will need the following libraries:
@@ -430,10 +430,11 @@ $$

$$
\textrm{Prob}\{X=i|Y=j\} =\frac{\textrm{Prob}\{X=i,Y=j\}}{\textrm{Prob}\{Y=j\}}=\frac{\textrm{Prob}\{Y=j|X=i\}\textrm{Prob}\{X=i\}}{\textrm{Prob}\{Y=j\}}
-$$
+$$ (eq:condprobbayes)

```{note}
-This can be interpreted as a version of what a Bayesian calls **Bayes' Law**.
+Formula {eq}`eq:condprobbayes` is also what a Bayesian calls **Bayes' Law**. A Bayesian statistician regards the marginal probability distribution $\textrm{Prob}\{X=i\}, i = 1, \ldots, I$ as a **prior** distribution that describes his personal subjective beliefs about $X$.
+He then interprets formula {eq}`eq:condprobbayes` as a procedure for constructing a **posterior** distribution that describes how he would revise his subjective beliefs after observing that $Y$ equals $j$.
```
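
Here is a minimal sketch of this prior-to-posterior calculation for a discrete joint distribution; the matrix `F` below is hypothetical, chosen only for illustration.

```{code-cell} ipython3
import numpy as np

# hypothetical joint distribution: F[i, j] = Prob(X=i, Y=j)
F = np.array([[0.3, 0.2],
              [0.1, 0.4]])

# prior: marginal distribution of X
prior_x = F.sum(axis=1)

# suppose we observe Y = j; Bayes' Law rescales column j of F
j = 0
posterior_x = F[:, j] / F[:, j].sum()

print("prior:    ", prior_x)
print("posterior:", posterior_x)
```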


@@ -588,9 +589,15 @@ Marginal distributions are
$$ \textrm{Prob}(X=i)=\sum_j{f_{ij}}=u_i $$
$$ \textrm{Prob}(Y=j)=\sum_i{f_{ij}}=v_j $$

-Below we draw some samples confirm that the "sampling" distribution agrees well with the "population" distribution.

-**Sample results:**
+**Sampling:**

+Let's write some Python code that lets us draw some long samples and compute relative frequencies.

+The code will let us check whether the "sampling" distribution agrees with the "population" distribution, confirming that
+the population distribution correctly tells us the relative frequencies that we should expect in a large sample.



```{code-cell} ipython3
# specify parameters
@@ -615,7 +622,9 @@ x[1, p < f_cum[0]] = ys[0]
print(x)
```

-Here, we use exactly the inverse CDF technique to generate sample from the joint distribution $F$.
+```{note}
+To generate random draws from the joint distribution $F$, we use the inverse CDF technique described in {doc}`this companion lecture <stats_examples>`.
+```
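
As a rough sketch of that technique for a discrete distribution, we can invert the cumulative distribution at uniform draws; the probabilities below are hypothetical.

```{code-cell} ipython3
import numpy as np

# hypothetical flattened joint probabilities and their cumulative sums
probs = np.array([0.3, 0.2, 0.1, 0.4])
cum = np.cumsum(probs)

# inverse CDF step: locate the interval that each uniform draw falls into
u = np.random.rand(100_000)
draws = np.searchsorted(cum, u)

# relative frequencies should be close to probs
print(np.bincount(draws, minlength=len(probs)) / len(u))
```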

```{code-cell} ipython3
# marginal distribution
@@ -715,9 +724,10 @@ x=x_2 & \vdots & \frac{0.1}{0.5}=0.2 & \frac{0.4}{0.5}=0.8 \\
\end{array}\right]
$$

-These population objects closely resemble sample counterparts computed above.
+These population objects closely resemble the sample counterparts computed above.

-Let's wrap some of the functions we have used in a Python class for a general discrete bivariate joint distribution.
+Let's wrap some of the functions we have used in a Python class that will let us generate and sample from a
+discrete bivariate joint distribution.

```{code-cell} ipython3
class discrete_bijoint:
@@ -951,7 +961,7 @@ ax.set_xticks([])
plt.show()
```

-Next we can simulate from a built-in `numpy` function and calculate a **sample** marginal distribution from the sample mean and variance.
+Next we can use a built-in `numpy` function to draw random samples, then calculate a **sample** marginal distribution from the sample mean and variance.

```{code-cell} ipython3
μ= np.array([0, 5])
@@ -984,7 +994,7 @@ plt.show()

**Conditional distribution**

-The population conditional distribution is
+For a bivariate normal population distribution, the conditional distributions are also normal:

$$
\begin{aligned} \\
@@ -993,6 +1003,10 @@ $$
\end{aligned}
$$

+```{note}
+Please see this {doc}`quantecon lecture <multivariate_normal>` for more details.
+```
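
As a quick numerical illustration of these conditional formulas, here is a sketch with hypothetical parameter values; for a bivariate normal, the conditional mean is linear in the conditioning value and the conditional variance is constant.

```{code-cell} ipython3
import numpy as np

# hypothetical bivariate normal parameters
μ_x, μ_y = 0.0, 5.0
σ_x, σ_y = 1.0, 2.0
ρ = 0.5

# conditional distribution of X given Y = y is normal with
#   mean      μ_x + ρ (σ_x / σ_y) (y - μ_y)
#   variance  σ_x² (1 - ρ²)
y = 6.0
cond_mean = μ_x + ρ * (σ_x / σ_y) * (y - μ_y)
cond_var = σ_x**2 * (1 - ρ**2)
print(cond_mean, cond_var)
```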

Let's approximate the joint density by discretizing and mapping the approximating joint density into a matrix.

We can compute the discretized marginal density by just using matrix algebra and noting that
@@ -1221,7 +1235,7 @@ But the joint distributions differ.
Thus, multiple joint distributions $[f_{ij}]$ can have the same marginals.
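
For instance, here is a minimal sketch, with hypothetical numbers, of two different joint distributions that share the same marginals.

```{code-cell} ipython3
import numpy as np

# two hypothetical couplings of the same pair of uniform marginals
f1 = np.array([[0.25, 0.25],
               [0.25, 0.25]])   # independent coupling
f2 = np.array([[0.5, 0.0],
               [0.0, 0.5]])     # perfectly correlated coupling

for f in (f1, f2):
    # row sums and column sums: identical marginals (0.5, 0.5)
    print(f.sum(axis=1), f.sum(axis=0))
```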

**Remark:**
-- Couplings are important in optimal transport problems and in Markov processes.
+- Couplings are important in optimal transport problems and in Markov processes. Please see this {doc}`lecture about optimal transport <opt_transport>`.

## Copula Functions
