
Commit f6a52e3

fix build error for bayes_nonconj

1 parent 575b4f1

File tree

1 file changed (+14 −23 lines)

lectures/bayes_nonconj.md

Lines changed: 14 additions & 23 deletions
@@ -1,10 +1,10 @@
 ---
 jupytext:
   text_representation:
-    extension: .myst
+    extension: .md
     format_name: myst
     format_version: 0.13
-    jupytext_version: 1.13.8
+    jupytext_version: 1.16.4
 kernelspec:
   display_name: Python 3 (ipykernel)
   language: python
@@ -43,7 +43,6 @@ The two Python modules are
 
 As usual, we begin by importing some Python code.
 
-
 ```{code-cell} ipython3
 :tags: [hide-output]
 
@@ -80,10 +79,8 @@ from numpyro.infer import SVI as nSVI
 from numpyro.infer import ELBO as nELBO
 from numpyro.infer import Trace_ELBO as nTrace_ELBO
 from numpyro.optim import Adam as nAdam
-
 ```
 
-
 ## Unleashing MCMC on a Binomial Likelihood
 
 This lecture begins with the binomial example in the {doc}`quantecon lecture <prob_meaning>`.
@@ -252,7 +249,6 @@ We will use the following priors:
 
 - The truncated Laplace can be created using `Numpyro`'s `TruncatedDistribution` class.
 
-
 ```{code-cell} ipython3
 # used by Numpyro
 def TruncatedLogNormal_trans(loc, scale):
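As a reader's aid for the bullet touched in this hunk: a minimal sketch of building a truncated Laplace with numpyro's `TruncatedDistribution`. The helper name, location/scale arguments, and bounds below are illustrative assumptions, not code from this file.

```python
# Illustrative sketch only: restrict a Laplace base distribution to
# [0, 1], the support of the probability parameter theta.
import numpyro.distributions as ndist

def TruncatedLaplace(loc, scale):
    """Laplace(loc, scale) truncated to the unit interval (hypothetical helper)."""
    base = ndist.Laplace(loc, scale)
    return ndist.TruncatedDistribution(base, low=0.0, high=1.0)
```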
@@ -595,9 +591,9 @@ class BayesianInference:
             pyro.sample('theta', dist.Beta(alpha_q, beta_q))
 
         else:
-            alpha_q = numpyro.param('alpha_q', 10,
+            alpha_q = numpyro.param('alpha_q', 10.0,
                                     constraint=nconstraints.positive)
-            beta_q = numpyro.param('beta_q', 10,
+            beta_q = numpyro.param('beta_q', 10.0,
                                    constraint=nconstraints.positive)
 
             numpyro.sample('theta', ndist.Beta(alpha_q, beta_q))
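For context on the two edits above: `numpyro.param` values live on the JAX side, and JAX will not differentiate integer-typed inputs, so initializing with the int `10` can raise during SVI while the float `10.0` traces correctly; this is plausibly part of the build error named in the commit message. A minimal sketch of the corrected guide, assuming the lecture's import aliases:

```python
# Sketch of the corrected numpyro guide (aliases assumed as in the lecture).
# Initial values are floats: JAX differentiates real-valued parameters,
# and an integer init such as 10 can fail with a dtype error.
import numpyro
import numpyro.distributions as ndist
import numpyro.distributions.constraints as nconstraints

def beta_guide(data):
    alpha_q = numpyro.param('alpha_q', 10.0,
                            constraint=nconstraints.positive)
    beta_q = numpyro.param('beta_q', 10.0,
                           constraint=nconstraints.positive)
    numpyro.sample('theta', ndist.Beta(alpha_q, beta_q))
```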
@@ -655,32 +651,31 @@ class BayesianInference:
         params : the learned parameters for guide
         losses : a vector of loss at each step
         """
-        # tensorize data
-        if not torch.is_tensor(data):
-            data = torch.tensor(data)
+        # Convert data to float32
+        data = np.asarray(data, dtype=np.float32)
 
         # initiate SVI
         svi = self.SVI_init(guide_dist=guide_dist)
 
         # do gradient steps
-        if self.solver=='pyro':
+        if self.solver == 'pyro':
             # store loss vector
-            losses = np.zeros(n_steps)
+            losses = np.zeros(n_steps, dtype=np.float32)
             for step in range(n_steps):
                 losses[step] = svi.step(data)
 
             # pyro only supports beta VI distribution
             params = {
                 'alpha_q': pyro.param('alpha_q').item(),
                 'beta_q': pyro.param('beta_q').item()
-            }
+                }
 
-        elif self.solver=='numpyro':
+        elif self.solver == 'numpyro':
             result = svi.run(self.rng_key, n_steps, data, progress_bar=False)
-            params = dict(
-                (key, np.asarray(value)) for key, value in result.params.items()
-            )
-            losses = np.asarray(result.losses)
+            params = {
+                key: np.asarray(value, dtype=np.float32) for key, value in result.params.items()
+            }
+            losses = np.asarray(result.losses, dtype=np.float32)
 
         return params, losses
 ```
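For context on the hunk above: the unconditional torch tensorization is replaced by a backend-neutral NumPy conversion, and the data, parameters, and loss vector are all pinned to float32, the default dtype of JAX computations. A small sketch of the pattern, with made-up observations:

```python
import numpy as np

# Made-up observations, for illustration only.
raw_data = [1, 0, 1, 1, 0]

# One up-front conversion keeps both solver branches on a single dtype:
# JAX computes in float32 by default, so the data and the recorded
# losses are allocated to match.
data = np.asarray(raw_data, dtype=np.float32)
losses = np.zeros(100, dtype=np.float32)  # one slot per SVI step
```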
@@ -898,7 +893,6 @@ For the same Beta prior, we shall
 
 Let's start with the analytical method that we described in this quantecon lecture <https://python.quantecon.org/prob_meaning.html>
 
-
 ```{code-cell} ipython3
 # First examine Beta priors
 BETA_pyro = BayesianInference(param=(5,5), name_dist='beta', solver='pyro')
@@ -952,12 +946,10 @@ will be more accurate, as we shall see next.
 
 (Increasing the step size increases computational time though).
 
-
 ```{code-cell} ipython3
 BayesianInferencePlot(true_theta, num_list, BETA_numpyro).SVI_plot(guide_dist='beta', n_steps=100000)
 ```
 
-
 ## Non-conjugate Prior Distributions
 
 Having assured ourselves that our MCMC and VI methods can work well when we have conjugate prior and so can also compute analytically, we
@@ -1052,7 +1044,6 @@ SVI_num_steps = 50000
 example_CLASS = BayesianInference(param=(0,1), name_dist='uniform', solver='numpyro')
 print(f'=======INFO=======\nParameters: {example_CLASS.param}\nPrior Dist: {example_CLASS.name_dist}\nSolver: {example_CLASS.solver}')
 BayesianInferencePlot(true_theta, num_list, example_CLASS).SVI_plot(guide_dist='normal', n_steps=SVI_num_steps)
-
 ```
 
 ```{code-cell} ipython3
