Commit 9bdb5d4 (1 parent: 1043f21)
twiecki and springcoil, authored and committed

Reverse order of tau and sd in Normal-related likelihoods (#1398)

* MAINT Revert order of tau and sd in all Normal likelihoods as well as add cov to multivariate normal.
* DOC Update examples to use tau explicitly where it relied on positional arguments before.
* DOC Add new calling signatures to release notes.

File tree: 12 files changed, +174 / -72 lines changed

RELEASE-NOTES.md
Lines changed: 32 additions & 16 deletions
@@ -3,9 +3,11 @@
 
 We are proud and excited to release the first stable version of PyMC3, the product of more than [5 years](https://github.com/pymc-devs/pymc3/commit/85c7e06b6771c0d99cbc09cb68885cda8f7785cb) of ongoing development and contributions from over 80 individuals. PyMC3 is a Python module for Bayesian modeling which focuses on modern Bayesian computational methods, primarily gradient-based (Hamiltonian) MCMC sampling and variational inference. Models are specified in Python, which allows for great flexibility. The main technological difference in PyMC3 relative to previous versions is the reliance on Theano for the computational backend, rather than on Fortran extensions.
 
+### New features
+
 Since the beta release last year, the following improvements have been implemented:
 
-* Added `variational` submodule, which features the automatic differentiation variational inference (ADVI) fitting method. Much of this work was due to the efforts of Taku Yoshioka, and important guidance was provided by the Stan team (specifically Alp Kucukelbir and Daniel Lee).
+* Added `variational` submodule, which features the automatic differentiation variational inference (ADVI) fitting method. Also supports mini-batch ADVI for large data sets. Much of this work was due to the efforts of Taku Yoshioka, and important guidance was provided by the Stan team (specifically Alp Kucukelbir and Daniel Lee).
 
 * Added model checking utility functions, including leave-one-out (LOO) cross-validation, BPIC, WAIC, and DIC.
 

@@ -21,15 +23,29 @@ Since the beta release last year, the following improvements have been implement
 
 * Refactored test suite for better efficiency.
 
-* Added von Mises, zero-inflated negative binomial, and Lewandowski, Kurowicka and Joe (LKJ) distributions.
+* Added von Mises, zero-inflated negative binomial, and Lewandowski, Kurowicka and Joe (LKJ) distributions.
 
 * Adopted `joblib` for managing parallel computation of chains.
 
 * Added contributor guidelines, contributor code of conduct and governance document.
 
-We on the PyMC3 core team would like to thank everyone for contributing and now feel that this is ready for the big time. We look forward to hearing about all the cool stuff you use PyMC3 for, and look forward to continued development on the package.
+### Deprecations
+
+* Argument order of tau and sd was switched for distributions of the normal family:
+  - `Normal()`
+  - `Lognormal()`
+  - `HalfNormal()`
+
+  Old: `Normal(name, mu, tau)`
+  New: `Normal(name, mu, sd)` (supplying keyword arguments is unaffected).
+
+* `MvNormal` calling signature changed:
+  Old: `MvNormal(name, mu, tau)`
+  New: `MvNormal(name, mu, cov)` (supplying keyword arguments is unaffected).
+
+We on the PyMC3 core team would like to thank everyone for contributing and now feel that this is ready for the big time. We look forward to hearing about all the cool stuff you use PyMC3 for, and look forward to continued development on the package.
 
-## Contributors
+### Contributors
 
 
 A. Flaxman <[email protected]>
@@ -38,48 +54,48 @@ Alexey Goldin <[email protected]>
 Anand Patil <[email protected]>
 Andrea Zonca <[email protected]>
 Andreas Klostermann <[email protected]>
-Andres Asensio Ramos
+Andres Asensio Ramos
 Andrew Clegg <[email protected]>
-Anjum48
+Anjum48
 AustinRochford <[email protected]>
 Benjamin Edwards <[email protected]>
 Boris Avdeev <[email protected]>
 Brian Naughton <[email protected]>
-Byron Smith
+Byron Smith
 Chad Heyne <[email protected]>
 Chris Fonnesbeck <[email protected]>
-Colin
+Colin
 Corey Farwell <[email protected]>
 David Huard <[email protected]>
 David Huard <[email protected]>
 David Stück <[email protected]>
 DeliciousHair <[email protected]>
-Dustin Tran
+Dustin Tran
 Eigenblutwurst <[email protected]>
 Gideon Wulfsohn <[email protected]>
 Gil Raphaelli <[email protected]>
 
-Ilan Man
+Ilan Man
 Imri Sofer <[email protected]>
 Jake Biesinger <[email protected]>
 James Webber <[email protected]>
 John McDonnell <[email protected]>
 John Salvatier <[email protected]>
-Jordi Diaz
+Jordi Diaz
 Jordi Warmenhoven <[email protected]>
 Karlson Pfannschmidt <[email protected]>
 Kyle Bishop <[email protected]>
 Kyle Meyer <[email protected]>
-Lin Xiao
+Lin Xiao
 Mack Sweeney <[email protected]>
 Matthew Emmett <[email protected]>
-Maxim
+Maxim
 Michael Gallaspy <[email protected]>
 
 Osvaldo Martin <[email protected]>
 Patricio Benavente <[email protected]>
 Peadar Coyle (springcoil) <[email protected]>
-Raymond Roberts
+Raymond Roberts
 Rodrigo Benenson <[email protected]>
 Sergei Lebedev <[email protected]>
 Skipper Seabold <[email protected]>
@@ -88,8 +104,8 @@ The Gitter Badger <[email protected]>
 Thomas Kluyver <[email protected]>
 Thomas Wiecki <[email protected]>
 Tobias Knuth <[email protected]>
-Volodymyr
-Volodymyr Kazantsev
+Volodymyr
+Volodymyr Kazantsev
 Wes McKinney <[email protected]>
 Zach Ploskey <[email protected]>
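The deprecation notes above hinge on the fact that precision tau and standard deviation sd are two parametrizations of the same quantity, tau = 1 / sd**2. Below is a minimal pure-Python sketch of that conversion; `get_tau_sd` here is a simplified stand-in for PyMC3's internal helper of the same name (its exact defaults and error handling are assumptions, not the library's code):

```python
def get_tau_sd(tau=None, sd=None):
    """Return a (tau, sd) pair, deriving the missing one.

    Sketch of the tau/sd conversion behind the new signatures:
    tau = 1 / sd**2. Defaults to the standard normal (tau = sd = 1)
    when neither is given; passing both is rejected here to avoid
    an inconsistent parametrization.
    """
    if tau is None and sd is None:
        return 1.0, 1.0
    if tau is not None and sd is not None:
        raise ValueError("specify either tau or sd, not both")
    if tau is None:
        return sd ** -2, sd       # derive precision from sd
    return tau, tau ** -0.5       # derive sd from precision

# Old positional Normal(name, mu, tau) and new Normal(name, mu, sd)
# describe the same distribution exactly when tau == 1 / sd**2:
print(get_tau_sd(sd=2.0))   # precision derived from sd
print(get_tau_sd(tau=4.0))  # sd derived from precision
```

Keyword calls such as `Normal('x', mu=0., tau=0.5)` are unaffected by the reordering, which is why the notebooks below were updated to pass `tau=` explicitly.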

docs/source/notebooks/dp_mix.ipynb
Lines changed: 4 additions & 3 deletions

@@ -580,8 +580,8 @@
     "\n",
     "    tau = pm.Gamma('tau', 1., 1., shape=K)\n",
     "    lambda_ = pm.Uniform('lambda', 0, 5, shape=K)\n",
-    "    mu = pm.Normal('mu', 0, lambda_ * tau, shape=K)\n",
-    "    obs = pm.Normal('obs', mu[component], lambda_[component] * tau[component],\n",
+    "    mu = pm.Normal('mu', 0, tau=lambda_ * tau, shape=K)\n",
+    "    obs = pm.Normal('obs', mu[component], tau=lambda_[component] * tau[component],\n",
     "                   observed=old_faithful_df.std_waiting.values)"
    ]
   },
@@ -1188,8 +1188,9 @@
    }
   ],
   "metadata": {
+   "anaconda-cloud": {},
    "kernelspec": {
-    "display_name": "Python 3",
+    "display_name": "Python [default]",
     "language": "python",
     "name": "python3"
    },
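In the notebook cell above, each mixture component's precision is the product `lambda_[k] * tau[k]`; passing it via the `tau=` keyword preserves that meaning under the new `Normal(name, mu, sd)` positional order. A quick stand-alone check (the numbers are illustrative, not from the notebook) of what such a combined precision means as a standard deviation:

```python
# A combined precision lambda * tau converts to a standard deviation
# via sd = precision ** -0.5, since tau = 1 / sd**2.
lam, tau = 2.0, 8.0
precision = lam * tau        # 16.0
sd = precision ** -0.5       # 1 / sqrt(16)

print(sd)  # 0.25
```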

docs/source/notebooks/pmf-pymc.ipynb
Lines changed: 3 additions & 2 deletions

@@ -1529,8 +1529,9 @@
    }
   ],
   "metadata": {
+   "anaconda-cloud": {},
    "kernelspec": {
-    "display_name": "Python 3",
+    "display_name": "Python [default]",
     "language": "python",
     "name": "python3"
    },
@@ -1544,7 +1545,7 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.5.1"
+   "version": "3.5.2"
   }
  },
 "nbformat": 4,

docs/source/notebooks/rugby_analytics.ipynb
Lines changed: 5 additions & 4 deletions

@@ -243,10 +243,10 @@
    "model = pm.Model()\n",
    "with pm.Model() as model:\n",
    "    # global model parameters\n",
-   "    home = pm.Normal('home', 0, .0001)\n",
+   "    home = pm.Normal('home', 0, tau=.0001)\n",
    "    tau_att = pm.Gamma('tau_att', .1, .1)\n",
    "    tau_def = pm.Gamma('tau_def', .1, .1)\n",
-   "    intercept = pm.Normal('intercept', 0, .0001)\n",
+   "    intercept = pm.Normal('intercept', 0, tau=.0001)\n",
    "    \n",
    "    # team-specific model parameters\n",
    "    atts_star = pm.Normal(\"atts_star\", \n",
@@ -464,8 +464,9 @@
    }
   ],
   "metadata": {
+   "anaconda-cloud": {},
    "kernelspec": {
-    "display_name": "Python 3",
+    "display_name": "Python [default]",
     "language": "python",
     "name": "python3"
    },
@@ -479,7 +480,7 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.5.1"
+   "version": "3.5.2"
   }
  },
 "nbformat": 4,
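The rugby notebook now passes the precision explicitly as `tau=.0001`. This matters because, under the new positional order, a bare third argument would be read as a standard deviation, silently tightening the prior by a factor of a million. A small sketch of the scale involved:

```python
import math

# tau = 1e-4 was intended as a very weak (wide) prior:
tau = 1e-4
sd_equivalent = tau ** -0.5   # sd = 1 / sqrt(tau)

# Interpreted as a precision, tau = 1e-4 means sd = 100;
# interpreted as a standard deviation, .0001 would mean sd = 0.0001.
print(sd_equivalent)
```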

pymc3/distributions/continuous.py
Lines changed: 45 additions & 22 deletions
@@ -174,18 +174,40 @@ class Normal(Continuous):
     ----------
     mu : float
         Mean.
-    tau : float
-        Precision (tau > 0).
     sd : float
         Standard deviation (sd > 0).
+    tau : float
+        Precision (tau > 0).
     """
 
-    def __init__(self, mu=0.0, tau=None, sd=None, *args, **kwargs):
-        super(Normal, self).__init__(*args, **kwargs)
+    def __init__(self, *args, **kwargs):
+        # FIXME In order to catch the case where Normal('x', 0, .1) is
+        # called to display a warning we have to fetch the args and
+        # kwargs manually. After a certain period we should revert
+        # back to the old calling signature.
+
+        if len(args) == 1:
+            mu = args[0]
+            sd = kwargs.pop('sd', None)
+            tau = kwargs.pop('tau', None)
+        elif len(args) == 2:
+            warnings.warn(('The order of positional arguments to Normal() '
+                           'has changed. The new signature is '
+                           'Normal(name, mu, sd) instead of Normal(name, mu, tau).'),
+                          DeprecationWarning)
+            mu, sd = args
+            tau = kwargs.pop('tau', None)
+        else:
+            mu = kwargs.pop('mu', 0.)
+            sd = kwargs.pop('sd', None)
+            tau = kwargs.pop('tau', None)
+
         self.mean = self.median = self.mode = self.mu = mu
         self.tau, self.sd = get_tau_sd(tau=tau, sd=sd)
         self.variance = 1. / self.tau
 
+        super(Normal, self).__init__(**kwargs)
+
     def random(self, point=None, size=None, repeat=None):
         mu, tau, sd = draw_values([self.mu, self.tau, self.sd],
                                   point=point)
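The dispatch in `Normal.__init__` above can be isolated as a standalone function: one positional argument (after the name, which PyMC3 consumes elsewhere) means only `mu` was given; two mean the caller is using positional `(mu, sd)` and is warned so that old `(mu, tau)` call sites get flagged; keyword-only calls pass through untouched. A runnable sketch of that pattern (a simplification, not PyMC3's actual method):

```python
import warnings

def parse_normal_args(*args, **kwargs):
    """Mimic the positional/keyword dispatch in Normal.__init__."""
    if len(args) == 1:
        # Only mu was positional; sd/tau must be keywords.
        mu = args[0]
        sd = kwargs.pop('sd', None)
        tau = kwargs.pop('tau', None)
    elif len(args) == 2:
        # Two positionals now mean (mu, sd); warn in case the caller
        # still expects the old (mu, tau) order.
        warnings.warn('The order of positional arguments to Normal() has '
                      'changed: Normal(name, mu, sd) replaces '
                      'Normal(name, mu, tau).', DeprecationWarning)
        mu, sd = args
        tau = kwargs.pop('tau', None)
    else:
        mu = kwargs.pop('mu', 0.)
        sd = kwargs.pop('sd', None)
        tau = kwargs.pop('tau', None)
    return mu, sd, tau

# Keyword call: unambiguous, no warning is raised.
mu, sd, tau = parse_normal_args(mu=0., tau=0.5)
```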
@@ -219,21 +241,21 @@ class HalfNormal(PositiveContinuous):
 
     Parameters
     ----------
-    tau : float
-        Precision (tau > 0).
     sd : float
         Standard deviation (sd > 0).
+    tau : float
+        Precision (tau > 0).
     """
 
-    def __init__(self, tau=None, sd=None, *args, **kwargs):
+    def __init__(self, sd=None, tau=None, *args, **kwargs):
         super(HalfNormal, self).__init__(*args, **kwargs)
         self.tau, self.sd = get_tau_sd(tau=tau, sd=sd)
         self.mean = tt.sqrt(2 / (np.pi * self.tau))
         self.variance = (1. - 2 / np.pi) / self.tau
 
     def random(self, point=None, size=None, repeat=None):
-        tau = draw_values([self.tau], point=point)
-        return generate_samples(stats.halfnorm.rvs, loc=0., scale=tau**-0.5,
+        sd = draw_values([self.sd], point=point)
+        return generate_samples(stats.halfnorm.rvs, loc=0., scale=sd,
                                 dist_shape=self.shape,
                                 size=size)
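The `random` change above swaps `scale=tau**-0.5` for `scale=sd`; the two are the same number because sd = 1 / sqrt(tau), so drawing with `sd` directly just skips a redundant conversion. As a quick numeric check (example values are illustrative), the half-normal mean `sqrt(2 / (pi * tau))` from the class body equals the textbook form `sd * sqrt(2 / pi)`:

```python
import math

tau = 4.0
sd = tau ** -0.5  # 1 / sqrt(4) = 0.5

mean_from_tau = math.sqrt(2.0 / (math.pi * tau))
mean_from_sd = sd * math.sqrt(2.0 / math.pi)

# Both parametrizations give the same half-normal mean.
print(mean_from_tau, mean_from_sd)
```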

@@ -382,7 +404,7 @@ class Beta(UnitContinuous):
        \alpha &= \mu \kappa \\
        \beta &= (1 - \mu) \kappa
 
-       \text{where } \kappa = \frac{\mu(1-\mu)}{\sigma^2} - 1
+       \text{where } \kappa = \frac{\mu(1-\mu)}{\sigma^2} - 1
 
     Parameters
     ----------
@@ -554,15 +576,16 @@ class Lognormal(PositiveContinuous):
         Scale parameter (tau > 0).
     """
 
-    def __init__(self, mu=0, tau=1, *args, **kwargs):
+    def __init__(self, mu=0, sd=None, tau=None, *args, **kwargs):
         super(Lognormal, self).__init__(*args, **kwargs)
 
         self.mu = mu
-        self.tau = tau
-        self.mean = tt.exp(mu + 1. / (2 * tau))
+        self.tau, self.sd = get_tau_sd(tau=tau, sd=sd)
+
+        self.mean = tt.exp(mu + 1. / (2 * self.tau))
         self.median = tt.exp(mu)
-        self.mode = tt.exp(mu - 1. / tau)
-        self.variance = (tt.exp(1. / tau) - 1) * tt.exp(2 * mu + 1. / tau)
+        self.mode = tt.exp(mu - 1. / self.tau)
+        self.variance = (tt.exp(1. / self.tau) - 1) * tt.exp(2 * mu + 1. / self.tau)
 
     def _random(self, mu, tau, size=None):
         samples = np.random.normal(size=size)
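The Lognormal moments above, now written in terms of `self.tau`, can be checked numerically with plain `math` in place of Theano. For mu = 0 and tau = 1 (the standard lognormal), the formulas reduce to well-known closed forms:

```python
import math

mu, tau = 0.0, 1.0

# Same expressions as in Lognormal.__init__, with math.exp for tt.exp:
mean = math.exp(mu + 1.0 / (2.0 * tau))
median = math.exp(mu)
mode = math.exp(mu - 1.0 / tau)
variance = (math.exp(1.0 / tau) - 1.0) * math.exp(2.0 * mu + 1.0 / tau)

# Standard-lognormal values: mean = e**0.5, median = 1,
# mode = 1/e, variance = (e - 1) * e.
print(mean, median, mode, variance)
```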
@@ -1199,7 +1222,7 @@ class VonMises(Continuous):
     R"""
     Univariate VonMises log-likelihood.
     .. math::
-        f(x \mid \mu, \kappa) =
+        f(x \mid \mu, \kappa) =
            \frac{e^{\kappa\cos(x-\mu)}}{2\pi I_0(\kappa)}
 
     where :math:`I_0` is the modified Bessel function of order 0.
@@ -1244,7 +1267,7 @@ class SkewNormal(Continuous):
     R"""
     Univariate skew-normal log-likelihood.
     .. math::
-       f(x \mid \mu, \tau, \alpha) =
+       f(x \mid \mu, \tau, \alpha) =
            2 \Phi((x-\mu)\sqrt{\tau}\alpha) \phi(x,\mu,\tau)
     ========  ==========================================
     Support   :math:`x \in \mathbb{R}`
@@ -1266,13 +1289,13 @@
         Alternative scale parameter (tau > 0).
     alpha : float
         Skewness parameter.
-
+
     Notes
     -----
-    When alpha=0 we recover the Normal distribution and mu becomes the mean,
-    tau the precision and sd the standard deviation. In the limit of alpha
-    approaching plus/minus infinite we get a half-normal distribution.
-
+    When alpha=0 we recover the Normal distribution and mu becomes the mean,
+    tau the precision and sd the standard deviation. In the limit of alpha
+    approaching plus/minus infinite we get a half-normal distribution.
+
     """
     def __init__(self, mu=0.0, sd=None, tau=None, alpha=1, *args, **kwargs):
         super(SkewNormal, self).__init__(*args, **kwargs)
