TensorFlow Probability 0.15.0
Release notes
This is the 0.15 release of TensorFlow Probability. It is tested and stable against TensorFlow version 2.7.0.
Change notes
- Distributions
  - Add `tfd.StudentTProcessRegressionModel`.
  - Distributions' statistics now all have batch shape matching the Distribution itself.
  - `JointDistributionCoroutine` no longer requires `Root` when `sample_shape == ()` (see the first sketch after this list).
  - Support `sample_distributions` from autobatched joint distributions.
  - Expose `mask` argument to support missing observations in HMM log probs (see the second sketch after this list).
  - `BetaBinomial.log_prob` is more accurate when all trials succeed.
  - Support broadcast batch shapes in `MixtureSameFamily`.
  - Add `cholesky_fn` argument to `GaussianProcess`, `GaussianProcessRegressionModel`, and `SchurComplement`.
  - Add a staticmethod for precomputing the GPRM for more efficient inference in TensorFlow.
  - Add `GaussianProcess.posterior_predictive`.
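
  Below is a minimal, hedged sketch of the `JointDistributionCoroutine` change: with `sample_shape == ()`, root random variables no longer need the `Root` wrapper (the model and parameter values are illustrative only).

  ```python
  import tensorflow_probability as tfp

  tfd = tfp.distributions

  @tfd.JointDistributionCoroutine
  def model():
    # Previously this root variable had to be wrapped as
    # `yield tfd.JointDistributionCoroutine.Root(tfd.Normal(0., 1., name='loc'))`.
    loc = yield tfd.Normal(0., 1., name='loc')
    yield tfd.Normal(loc, 0.5, name='obs')

  draw = model.sample()        # sample_shape == (), so no Root is needed
  lp = model.log_prob(draw)
  ```

  And a sketch of masking missing observations in an HMM log-prob; it assumes `mask` is passed to `log_prob` as shown, with `True` marking a missing (ignored) observation.

  ```python
  import tensorflow as tf
  import tensorflow_probability as tfp

  tfd = tfp.distributions

  hmm = tfd.HiddenMarkovModel(
      initial_distribution=tfd.Categorical(probs=[0.8, 0.2]),
      transition_distribution=tfd.Categorical(probs=[[0.7, 0.3], [0.2, 0.8]]),
      observation_distribution=tfd.Normal(loc=[0., 3.], scale=[0.5, 0.5]),
      num_steps=5)

  observations = tf.constant([0.1, 2.9, 0.0, 0.2, 3.1])
  mask = tf.constant([False, False, True, False, False])  # third observation is missing

  lp = hmm.log_prob(observations, mask=mask)
  ```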
- Bijectors
  - Bijectors parameterized by distinct `tf.Variable`s no longer register as `==`.
  - BREAKING CHANGE: Remove deprecated `AffineScalar` bijector. Please use `tfb.Shift(shift)(tfb.Scale(scale))` instead (see the sketch after this list).
  - BREAKING CHANGE: Remove deprecated `Affine` and `AffineLinearOperator` bijectors.
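
  A short sketch of the suggested `AffineScalar` replacement; `shift` and `scale` here are arbitrary example values.

  ```python
  import tensorflow_probability as tfp

  tfb = tfp.bijectors

  shift, scale = 1.5, 2.0

  # Composing Shift and Scale reproduces the old AffineScalar forward map
  # x -> scale * x + shift (Scale is applied first, then Shift).
  bijector = tfb.Shift(shift)(tfb.Scale(scale))

  y = bijector.forward(3.0)   # 2.0 * 3.0 + 1.5 = 7.5
  x = bijector.inverse(y)     # recovers 3.0
  ```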
- PSD kernels
  - Add `tfp.math.psd_kernels.ChangePoint`.
  - Add slicing support for `PositiveSemidefiniteKernel` (see the sketch after this list).
  - Add `inverse_length_scale` parameter to kernels.
  - Add `parameter_properties` to PSDKernel, along with automated batch shape inference.
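
  A minimal sketch of kernel batch slicing, assuming `__getitem__`-style slicing analogous to slicing batched Distributions; the kernel parameters are illustrative.

  ```python
  import tensorflow as tf
  import tensorflow_probability as tfp

  psd_kernels = tfp.math.psd_kernels

  # A batch of 4 ExponentiatedQuadratic kernels.
  kernel = psd_kernels.ExponentiatedQuadratic(
      amplitude=tf.constant([1., 2., 3., 4.]),
      length_scale=tf.constant([0.5, 1.0, 1.5, 2.0]))

  sub_kernel = kernel[:2]               # keep only the first two kernels

  x = tf.random.normal([5, 3])          # 5 points in R^3
  print(sub_kernel.matrix(x, x).shape)  # (2, 5, 5)
  ```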
- VI
  - Add support for importance-weighted variational objectives.
  - Support arbitrary distribution types in `tfp.experimental.vi.build_factored_surrogate_posterior` (a basic usage sketch follows this list).
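
  A minimal sketch of the factored-surrogate workflow with a toy scalar target, assuming the `event_shape`/`bijector` arguments shown below; it demonstrates the basic API only and does not exercise the new importance-weighting or distribution-type options.

  ```python
  import tensorflow as tf
  import tensorflow_probability as tfp

  tfd = tfp.distributions
  tfb = tfp.bijectors

  # Toy "posterior" over a single positive scalar.
  target = tfd.LogNormal(loc=0., scale=1.)

  # Factored surrogate constrained to the positive reals via Softplus.
  surrogate = tfp.experimental.vi.build_factored_surrogate_posterior(
      event_shape=[], bijector=tfb.Softplus())

  losses = tfp.vi.fit_surrogate_posterior(
      target_log_prob_fn=target.log_prob,
      surrogate_posterior=surrogate,
      optimizer=tf.optimizers.Adam(learning_rate=0.1),
      num_steps=200)
  ```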
- STS
  - Support `+` syntax for summing `StructuralTimeSeries` models (see the sketch after this list).
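
  A minimal sketch of the `+` syntax on synthetic data; the component models are illustrative.

  ```python
  import tensorflow as tf
  import tensorflow_probability as tfp

  sts = tfp.sts

  observed = tf.random.normal([100])   # synthetic observed series

  trend = sts.LocalLinearTrend(observed_time_series=observed)
  seasonal = sts.Seasonal(num_seasons=7, observed_time_series=observed)

  # New in 0.15: components can be summed directly, analogous to wrapping
  # them in `tfp.sts.Sum([trend, seasonal])`.
  model = trend + seasonal
  ```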
- Math
  - Enable JAX/NumPy backends for `tfp.math.ode` (see the sketch after this list).
  - Allow returning auxiliary information from `tfp.math.value_and_gradient`.
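
  A minimal sketch of running `tfp.math.ode` under the JAX substrate; the ODE (exponential decay) is illustrative.

  ```python
  import jax.numpy as jnp
  from tensorflow_probability.substrates import jax as tfp

  ode_fn = lambda t, y: -0.5 * y   # dy/dt = -0.5 * y

  results = tfp.math.ode.DormandPrince().solve(
      ode_fn,
      initial_time=0.,
      initial_state=jnp.array(1.),
      solution_times=jnp.linspace(0., 5., 20))

  print(results.states)   # y evaluated at the requested solution times
  ```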
- Experimental
  - Speedup to `experimental.mcmc` windowed samplers (see the sketch after this list).
  - Support unbiased gradients through particle filtering via stop-gradient resampling.
  - `ensemble_kalman_filter_log_marginal_likelihood` (log evidence) computation added to `tfe.sequential`.
  - Add experimental joint-distribution layers library.
  - Delete `tfp.experimental.distributions.JointDensityCoroutine`.
  - Add experimental special functions for high-precision computation on a TPU.
  - Add custom log-prob ratio for `IncrementLogProb`.
  - Use `foldl` in `no_pivot_ldl` instead of `while_loop`.
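
  A hedged sketch of the windowed samplers; the exact entry point, argument names (`n_chains`, pinning the observed variable by keyword), and return structure are assumptions about `tfp.experimental.mcmc`, and the model is illustrative.

  ```python
  import tensorflow as tf
  import tensorflow_probability as tfp

  tfd = tfp.distributions

  @tfd.JointDistributionCoroutineAutoBatched
  def model():
    loc = yield tfd.Normal(0., 10., name='loc')
    yield tfd.Sample(tfd.Normal(loc, 1.), sample_shape=20, name='obs')

  observed = tf.random.normal([20])

  # Draw 200 posterior samples per chain, pinning `obs` to the observed data.
  draws, trace = tfp.experimental.mcmc.windowed_adaptive_nuts(
      200, model, n_chains=4, obs=observed)
  ```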
- Other
  - TFP should now support NumPy 1.20+.
  - BREAKING CHANGE: Stop unpacking seeds when splitting in JAX.
Huge thanks to all the contributors to this release!
- 8bitmp3
- adriencorenflos
- Alexey Radul
- Allen Lavoie
- Ben Lee
- Billy Lamberta
- Brian Patton
- Christopher Suter
- Colin Carroll
- Dave Moore
- Du Phan
- Emily Fertig
- Faizan Muhammad
- George Necula
- George Tucker
- Grace Luo
- Ian Langmore
- Jacob Burnim
- Jake VanderPlas
- Jeremiah Liu
- Junpeng Lao
- Kaan
- Luke Wood
- Max Jiang
- Mihai Maruseac
- Neil Girdhar
- Paul Chiang
- Pavel Izmailov
- Pavel Sountsov
- Peter Hawkins
- Rebecca Chen
- Richard Song
- Rif A. Saurous
- Ron Shapiro
- Roy Frostig
- Sharad Vikram
- Srinivas Vasudevan
- Tomohiro Endo
- Urs Köster
- William C Grisaitis
- Yilei Yang