lectures/mle.md (8 additions, 8 deletions)
@@ -27,7 +27,7 @@ kernelspec:
 
 ## Overview
 
-In a {doc}`previous lecture <ols>`, we estimated the relationship between
+In {doc}`ols`, we estimated the relationship between
 dependent and explanatory variables using linear regression.
 
 But what if a linear relationship is not an appropriate assumption for our model?
@@ -64,11 +64,11 @@ from mpl_toolkits.mplot3d import Axes3D
 
 We assume familiarity with basic probability and multivariate calculus.
 
-## Set Up and Assumptions
+## Set up and assumptions
 
 Let's consider the steps we need to go through in maximum likelihood estimation and how they pertain to this study.
 
-### Flow of Ideas
+### Flow of ideas
 
 The first step with maximum likelihood estimation is to choose the probability distribution believed to be generating the data.
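
As the later hunks note, the distribution the lecture works with is the Poisson. As a quick reminder (standard material, not part of the diff), the assumed probability mass function for a count $y$ is

$$
f(y \mid \mu) = \frac{\mu^y e^{-\mu}}{y!},
\qquad y = 0, 1, 2, \ldots, \quad \mu > 0,
$$

and the estimation problem that follows is to pin down $\mu$ (and later the coefficients that determine each $\mu_i$) from the observed counts.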
@@ -85,7 +85,7 @@ We'll let the data pick out a particular element of the class by pinning down th
 
 The parameter estimates so produced will be called **maximum likelihood estimates**.
 
-### Counting Billionaires
+### Counting billionaires
 
 Treisman {cite}`Treisman2016` is interested in estimating the number of billionaires in different countries.
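
For reference, the estimates referred to in this hunk are defined in the usual way (a standard restatement, not text from the diff): with independent observations $y_1, \ldots, y_n$ drawn from a density or mass function $f(y; \boldsymbol{\theta})$, the maximum likelihood estimate is

$$
\hat{\boldsymbol{\theta}}
= \underset{\boldsymbol{\theta}}{\arg\max} \; \log \mathcal{L}(\boldsymbol{\theta})
= \underset{\boldsymbol{\theta}}{\arg\max} \sum_{i=1}^{n} \log f(y_i; \boldsymbol{\theta}).
$$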
@@ -170,7 +170,7 @@ plt.show()
 
 From the histogram, it appears that the Poisson assumption is not unreasonable (albeit with a very low $\mu$ and some outliers).
 
-## Conditional Distributions
+## Conditional distributions
 
 In Treisman's paper, the dependent variable --- the number of billionaires $y_i$ in country $i$ --- is modeled as a function of GDP per capita, population size, and years of membership in GATT and WTO.
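
A minimal sketch of the conditional specification this hunk introduces, assuming the exponential link $\mu_i = \exp(\mathbf{x}_i' \boldsymbol{\beta})$ used in the lecture; the design matrix and coefficient values below are purely illustrative, not taken from Treisman's data:

```python
import jax.numpy as jnp

def poisson_mean(X, β):
    # Conditional mean μ_i = exp(x_i'β); the exponential link keeps every μ_i positive
    return jnp.exp(X @ β)

# Hypothetical rows: constant, log GDP per capita, log population, years in GATT/WTO
X = jnp.array([[1.0,  9.5, 17.2, 20.0],
               [1.0, 10.8, 18.1, 25.0]])
β = jnp.array([-20.0, 0.5, 0.8, 0.01])  # illustrative coefficient values only

print(poisson_mean(X, β))  # μ_i now varies with each country's characteristics
```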
@@ -238,7 +238,7 @@ plt.show()
 We can see that the distribution of $y_i$ is conditional on
 $\mathbf{x}_i$ ($\mu_i$ is no longer constant).
 
-## Maximum Likelihood Estimation
+## Maximum likelihood estimation
 
 In our model for the number of billionaires, the conditional distribution
 contains 4 ($k = 4$) parameters that we need to estimate.
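
Since the hunk says there are $k = 4$ coefficients to estimate, here is a hedged sketch of the corresponding Poisson log-likelihood; it uses the standard form $\log \mathcal{L}(\boldsymbol{\beta}) = \sum_i \left[ y_i \mathbf{x}_i'\boldsymbol{\beta} - \exp(\mathbf{x}_i'\boldsymbol{\beta}) - \log y_i! \right]$ rather than reproducing the lecture's own code:

```python
import jax.numpy as jnp
from jax.scipy.special import gammaln

def poisson_loglik(β, X, y):
    # log L(β) = Σ_i [ y_i * x_i'β - exp(x_i'β) - log(y_i!) ]
    Xb = X @ β
    return jnp.sum(y * Xb - jnp.exp(Xb) - gammaln(y + 1))  # gammaln(y + 1) = log(y!)
```

Because the lecture is written with JAX, `jax.grad` and `jax.hessian` applied to a function like this are one way to obtain the derivatives a Newton-Raphson routine needs, though the lecture may derive them analytically instead.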
@@ -845,7 +845,7 @@ Probit model.
 To begin, find the log-likelihood function and derive the gradient and
 Hessian.
 
-The `scipy` module `stats.norm` contains the functions needed to
+The `jax.scipy.stats` module `norm` contains the functions needed to
 compute the cdf and pdf of the normal distribution.
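
To show what the replacement line points at, here is a minimal sketch of a Probit log-likelihood written with `jax.scipy.stats.norm`; this is one standard formulation, not necessarily the exercise's reference solution:

```python
import jax.numpy as jnp
from jax.scipy.stats import norm

def probit_loglik(β, X, y):
    # Φ(x_i'β) is the conditional probability that y_i = 1
    q = X @ β
    # log L(β) = Σ_i [ y_i log Φ(q_i) + (1 - y_i) log(1 - Φ(q_i)) ],
    # and log(1 - Φ(q)) = log Φ(-q) by symmetry of the standard normal
    return jnp.sum(y * norm.logcdf(q) + (1 - y) * norm.logcdf(-q))
```

`norm.pdf` (or `norm.logpdf`) supplies the density terms that appear once the gradient and Hessian are derived.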
@@ -990,7 +990,7 @@ newton_raphson(prob, β)
 
 ```{code-cell} ipython3
 # Use statsmodels to verify results
-# Note that use __array__() method to convert jax to numpy arrays
+# Note: use __array__() method to convert jax to numpy arrays
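
A small sketch of the conversion mentioned in the changed comment; the variable names are illustrative, and `np.asarray` triggers the same `__array__()` protocol:

```python
import numpy as np
import jax.numpy as jnp

β_hat = jnp.array([0.5, -0.2, 1.3])  # e.g. estimates returned by a JAX routine
β_np = β_hat.__array__()             # explicit conversion to a NumPy ndarray
assert isinstance(np.asarray(β_hat), np.ndarray)  # np.asarray() calls __array__() too
print(type(β_np))                    # <class 'numpy.ndarray'>
```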