
Commit 92a885d

Merge branch 'main' into fix-deprecation+future

2 parents 3bca43b + 919f9b4 commit 92a885d

6 files changed: +19 -18 lines changed

.github/workflows/cache.yml

Lines changed: 1 addition & 1 deletion

@@ -7,7 +7,7 @@ jobs:
   cache:
     runs-on: quantecon-gpu
     container:
-      image: ghcr.io/quantecon/lecture-python-container:cuda-12.6.0-anaconda-2024-10-py312-b
+      image: ghcr.io/quantecon/lecture-python-container:cuda-12.8.1-anaconda-2024-10-py312
       options: --gpus all
     steps:
       - uses: actions/checkout@v4

.github/workflows/ci.yml

Lines changed: 1 addition & 1 deletion

@@ -4,7 +4,7 @@ jobs:
   preview:
     runs-on: quantecon-gpu
     container:
-      image: ghcr.io/quantecon/lecture-python-container:cuda-12.6.0-anaconda-2024-10-py312-b
+      image: ghcr.io/quantecon/lecture-python-container:cuda-12.8.1-anaconda-2024-10-py312
       options: --gpus all
     steps:
       - uses: actions/checkout@v4

.github/workflows/publish.yml

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@ jobs:
     if: github.event_name == 'push' && startsWith(github.event.ref, 'refs/tags')
     runs-on: quantecon-gpu
     container:
-      image: ghcr.io/quantecon/lecture-python-container:cuda-12.6.0-anaconda-2024-10-py312-b
+      image: ghcr.io/quantecon/lecture-python-container:cuda-12.8.1-anaconda-2024-10-py312
       options: --gpus all
     steps:
       - name: Checkout

lectures/bayes_nonconj.md

Lines changed: 3 additions & 2 deletions

@@ -395,8 +395,9 @@ We use two sets of variational distributions: Beta and TruncatedNormal with sup
 - Learnable parameters for the Beta distribution are (alpha, beta), both of which are positive.
 - Learnable parameters for the Truncated Normal distribution are (loc, scale).
 
-<u> We restrict the truncated Normal paramter 'loc' to be in the interval $[0,1]$</u>.
-
+```{note}
+We restrict the truncated Normal parameter 'loc' to be in the interval $[0,1]$
+```
 
 ## Implementation
 
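Since the hunk above documents the parameter constraints for the two variational families, here is a minimal numpyro-style sketch of guides that enforce them; the guide names, initial values, and the use of numpyro itself are illustrative assumptions, not code from this commit:

```python
import numpyro
import numpyro.distributions as dist
from numpyro.distributions import constraints


def beta_guide(data):
    # Learnable Beta parameters (alpha, beta), both constrained to be positive
    alpha = numpyro.param("alpha", 1.0, constraint=constraints.positive)
    beta = numpyro.param("beta", 1.0, constraint=constraints.positive)
    numpyro.sample("theta", dist.Beta(alpha, beta))


def truncnorm_guide(data):
    # Learnable TruncatedNormal parameters (loc, scale); loc is kept inside
    # [0, 1], matching the note added by this commit, and scale stays positive
    loc = numpyro.param("loc", 0.5, constraint=constraints.interval(0.0, 1.0))
    scale = numpyro.param("scale", 0.1, constraint=constraints.positive)
    numpyro.sample("theta", dist.TruncatedNormal(loc, scale, low=0.0, high=1.0))
```

Either guide would typically be passed as the `guide` argument to `numpyro.infer.SVI` alongside a model for the data.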

lectures/exchangeable.md

Lines changed: 12 additions & 12 deletions

@@ -31,8 +31,8 @@ via Bayes' Law.
 
 We touch foundations of Bayesian statistical inference invented by Bruno DeFinetti {cite}`definetti`.
 
-The relevance of DeFinetti's work for economists is presented forcefully
-in chapter 11 of {cite}`Kreps88` by David Kreps.
+The relevance of DeFinetti's work for economists is presented forcefully by David Kreps
+in chapter 11 of {cite}`Kreps88`.
 
 An example that we study in this lecture is a key component of {doc}`this lecture <odu>` that augments the
 {doc}`classic <mccall_model>` job search model of McCall
@@ -141,10 +141,10 @@ which states that the **conditional density** on the left side does not equal th
 But in the special IID case,
 
 $$
-p(W_t | W_{t-1}, \ldots, W_0) = p(W_t)
+p(W_t | W_{t-1}, \ldots, W_0) = p(W_t) ,
 $$
 
-and partial history $W_{t-1}, \ldots, W_0$ contains no information about the probability of $W_t$.
+so that the partial history $W_{t-1}, \ldots, W_0$ contains no information about the probability of $W_t$.
 
 So in the IID case, there is **nothing to learn** about the densities of future random variables from past random variables.
 
@@ -176,13 +176,13 @@ $G$.
 We could say that *objectively*, meaning *after* nature has chosen either $F$ or $G$, the probability that the data are generated as draws from $F$ is either $0$
 or $1$.
 
-We now drop into this setting a partially informed decision maker who knows
+We now drop into this setting a partially informed decision maker who
 
-- both $F$ and $G$, but
+- knows both $F$ and $G$, but
 
-- not the $F$ or $G$ that nature drew once-and-for-all at $t = -1$
+- does not know whether at $t = -1$ nature had drawn $F$ or whether nature had drawn $G$ once-and-for-all
 
-So our decision maker does not know which of the two distributions nature selected.
+Thus, although our decision maker knows $F$ and knows $G$, he does not know which of these two known distributions nature had selected to draw from.
 
 The decision maker describes his ignorance with a **subjective probability**
 $\tilde \pi$ and reasons as if nature had selected $F$ with probability
@@ -259,12 +259,11 @@ This means that random variable $W_0$ contains information about random variab
 
 So there is something to learn from the past about the future.
 
-But what and how?
 
 ## Exchangeability
 
 While the sequence $W_0, W_1, \ldots$ is not IID, it can be verified that it is
-**exchangeable**, which means that the ``re-ordered'' joint distributions $h(W_0, W_1)$ and $h(W_1, W_0)$
+**exchangeable**, which means that the joint distributions $h(W_0, W_1)$ and $h(W_1, W_0)$ of the ''re-ordered'' sequences
 satisfy
 
 $$
@@ -280,13 +279,14 @@ appear are altered.
 Equation {eq}`eq_definetti` represents our instance of an exchangeable joint density over a sequence of random
 variables as a **mixture** of two IID joint densities over a sequence of random variables.
 
-For a Bayesian statistician, the mixing parameter $\tilde \pi \in (0,1)$ has a special interpretation
-as a subjective **prior probability** that nature selected probability distribution $F$.
+A Bayesian statistician interprets the mixing parameter $\tilde \pi \in (0,1)$ as a decision maker's subjective belief -- the decision maker's **prior probability** -- that nature had selected probability distribution $F$.
 
+```{note}
 DeFinetti {cite}`definetti` established a related representation of an exchangeable process created by mixing
 sequences of IID Bernoulli random variables with parameter $\theta \in (0,1)$ and mixing probability density $\pi(\theta)$
 that a Bayesian statistician would interpret as a prior over the unknown
 Bernoulli parameter $\theta$.
+```
 
 ## Bayes' Law
 
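For reference, an exchangeable mixture of two IID joint densities of the kind this passage describes has the form below, sketched here with $f$ and $g$ denoting the densities of $F$ and $G$ (the lecture's own statement of {eq}`eq_definetti` may use different notation):

$$
h(w_0, w_1) = \tilde \pi f(w_0) f(w_1) + (1 - \tilde \pi) g(w_0) g(w_1)
$$

The right side is symmetric in $w_0$ and $w_1$, so $h(w_0, w_1) = h(w_1, w_0)$, which is the exchangeability property invoked earlier in this diff.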

lectures/kesten_processes.md

Lines changed: 1 addition & 1 deletion

@@ -108,7 +108,7 @@ Composite Index for the period 1st January 2006 to 1st November 2019.
 ```{code-cell} python3
 import yfinance as yf
 
-s = yf.download('^IXIC', '2006-1-1', '2019-11-1')['Adj Close']
+s = yf.download('^IXIC', '2006-1-1', '2019-11-1', auto_adjust=False)['Adj Close']
 
 r = s.pct_change()
 
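A short usage sketch of the updated download call; the `dropna()` and `describe()` steps below are illustrative additions, not part of the commit:

```python
import yfinance as yf

# Request unadjusted prices; auto_adjust=False keeps the 'Adj Close' column
# that recent yfinance releases no longer return by default
s = yf.download('^IXIC', '2006-1-1', '2019-11-1', auto_adjust=False)['Adj Close']

# Daily percentage returns of the Nasdaq Composite, dropping the leading NaN
r = s.pct_change().dropna()

print(r.describe())
```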
