TensorFlow Probability 0.12.1
Release notes
This is the 0.12.1 release of TensorFlow Probability. It is tested and stable against TensorFlow version 2.4.0.
Change notes
NOTE: Links point to examples in the TFP 0.12.1 release Colab.
Bijectors:
- Add an implementation of GLOW at `tfp.bijectors.Glow`.
- Add `RayleighCDF` bijector.
- Add `Ascending` bijector and deprecate `Ordered` (see the usage sketch after this list).
- Add optional `low` parameter to the `Softplus` bijector.
- Enable the `ScaleMatvecLinearOperator` bijector to wrap blockwise LinearOperators to form multipart bijectors.
- Allow passing kwargs to `Blockwise`.
- Bijectors now share a global cache, keyed by the bijector parameters and the value being transformed.
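A minimal sketch of the new `Ascending` bijector (the replacement for the deprecated `Ordered`), assuming TFP 0.12.1 on the TensorFlow backend; the input values are illustrative only.

```python
# Sketch: Ascending maps unconstrained real vectors to strictly increasing
# vectors (replacement for the deprecated Ordered bijector).
import tensorflow as tf
import tensorflow_probability as tfp

tfb = tfp.bijectors

ascending = tfb.Ascending()
x = tf.constant([1.0, -2.0, 0.5])    # unconstrained input
y = ascending.forward(x)             # strictly increasing output
x_recovered = ascending.inverse(y)   # round-trips back to x
```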
Distributions:
- BREAKING: Remove deprecated `HiddenMarkovModel.num_states` property.
- BREAKING: Change the naming scheme of un-named variables in JointDistributions.
- BREAKING: Remove deprecated `batch_shape` and `event_shape` arguments of `TransformedDistribution`.
- Add `Skellam` distribution.
- `JointDistributionCoroutine{AutoBatched}` now uses namedtuples as the sample dtype.
- The von Mises-Fisher distribution now works for dimensions > 5 and implements `VonMisesFisher.entropy`.
- Add `ExpGamma` and `ExpInverseGamma` distributions.
- `JointDistribution*AutoBatched` now support (reproducible) tensor seeds.
- Add KL(VonMisesFisher || SphericalUniform).
- Add `Distribution.parameter_properties` method.
- `experimental_default_event_space_bijector` now accepts additional arguments to pin some distribution parts.
- Add `JointDistribution.experimental_pin` and `JointDistributionPinned` (see the sketch after this list).
- Add `NegativeBinomial.experimental_from_mean_dispersion` method.
- Add `tfp.experimental.distribute`, with DistributionStrategy-aware distributions that support cross-device likelihood computations.
- `HiddenMarkovModel` can now accept time-varying observation distributions if `time_varying_observation_distribution` is set.
- The `Beta`, `Binomial`, and `NegativeBinomial` CDFs no longer return NaN outside the support.
- Remove the "dynamic graph" code path from the `Mixture` sampler. (`Mixture` now ignores the `use_static_graph` parameter.)
- `Mixture` now computes standard deviations more accurately and robustly.
- Fix incorrect NaN samples generated by several distributions.
- Fix KL divergence between `Categorical` distributions when logits contain -inf.
- Implement `Bernoulli.cdf`.
- Add a `log_rate` parameter to `tfd.Gamma`.
- Add option for parallel filtering and sampling to `LinearGaussianStateSpaceModel`.
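Below is a hedged sketch of the new `JointDistribution.experimental_pin` API; the toy model, part names, and values are illustrative assumptions, not taken from the release.

```python
# Sketch: pin the observed part of a joint distribution, leaving an
# unnormalized density over the remaining (unpinned) parts.
import tensorflow_probability as tfp

tfd = tfp.distributions

joint = tfd.JointDistributionNamed(dict(
    scale=tfd.Gamma(concentration=1., rate=1.),
    obs=lambda scale: tfd.Normal(loc=0., scale=scale)))

# Pinning returns a JointDistributionPinned; its unnormalized_log_prob is a
# function of the unpinned parts only (here, just `scale`).
pinned = joint.experimental_pin(obs=0.3)
target_log_prob = pinned.unnormalized_log_prob(scale=1.0)
```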
MCMC:
- Add `tfp.experimental.mcmc.ProgressBarReducer`.
- Update `experimental.mcmc.sample_sequential_monte_carlo` to use the new MCMC stateless kernel API.
- Add an experimental streaming MCMC framework that supports computing statistics over a (batch of) Markov chain(s) without materializing the samples. Supported statistics (mostly over arbitrary functions of the model variables) include the mean, (co)variance, central moments of arbitrary rank, and the potential scale reduction factor (R-hat). Also support selectively tracing the history of some, but not all, statistics or model variables.
- Add algorithms for running mean, variance, covariance, arbitrary higher central moments, and potential scale reduction factor (R-hat) to `tfp.experimental.stats`.
- Add `untempered_log_prob_fn` as an init kwarg to the `ReplicaExchangeMC` kernel.
- Add experimental support for mass matrix preconditioning in Hamiltonian Monte Carlo (see the sketch after this list).
- Add ability to temper part of the log prob in `ReplicaExchangeMC`.
- `tfp.experimental.mcmc.{sample_fold,sample_chain}` support warm restart.
- Add `even_odd_swap` exchange function to `replica_exchange_mc`.
- Samples from `ReplicaExchangeMC` can now have a per-replica initial state.
- Add omitted n/(n-1) term to `tfp.mcmc.potential_scale_reduction_factor`.
- Add `KernelBuilder` and `KernelOutputs` to experimental.
- Allow `tfp.mcmc.SimpleStepSizeAdaptation` and `DualAveragingStepSizeAdaptation` to take a custom reduction function.
- Replace `make_innermost_getter` et al. with `tfp.experimental.unnest` utilities.
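A rough sketch of the mass matrix preconditioning mentioned above, via the experimental `PreconditionedHamiltonianMonteCarlo` kernel and its `momentum_distribution` argument; the toy target, step size, and chain settings are assumptions for illustration, not part of the release notes.

```python
# Illustrative sketch: preconditioned HMC on a poorly scaled Gaussian. The
# momentum distribution plays the role of the mass matrix; its scales are set
# here to the target's approximate precisions.
import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

target = tfd.MultivariateNormalDiag(loc=[0., 0.], scale_diag=[1., 10.])

kernel = tfp.experimental.mcmc.PreconditionedHamiltonianMonteCarlo(
    target_log_prob_fn=target.log_prob,
    step_size=0.3,
    num_leapfrog_steps=10,
    momentum_distribution=tfd.MultivariateNormalDiag(scale_diag=[1., 0.1]))

samples = tfp.mcmc.sample_chain(
    num_results=200,
    num_burnin_steps=100,
    current_state=tf.zeros([2]),
    kernel=kernel,
    trace_fn=None)
```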
VI:
Math + Stats:
- Add `tfp.math.bessel_ive`, `tfp.math.bessel_kve`, and `tfp.math.log_bessel_ive` (see the sketch after this list).
- Add optional `weights` to `tfp.stats.histogram`.
- Add `tfp.math.erfcinv`.
- Add `tfp.math.reduce_log_harmonic_mean_exp`.
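A quick, hedged illustration of two of the new special functions; the input values are arbitrary examples.

```python
# Illustrative calls to the new special functions.
import tensorflow as tf
import tensorflow_probability as tfp

# Exponentially scaled modified Bessel function of the first kind:
# bessel_ive(v, z) = I_v(z) * exp(-|z|).
ive = tfp.math.bessel_ive(0.5, tf.constant([1.0, 2.0, 4.0]))

# Inverse complementary error function: erfc(erfcinv(x)) == x for x in (0, 2).
inv = tfp.math.erfcinv(tf.constant([0.5, 1.0, 1.5]))  # erfcinv(1.0) == 0.0
```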
Other:
- Add `tfp.math.psd_kernels.GeneralizedMaternKernel` (generalizes `MaternOneHalf`, `MaternThreeHalves` and `MaternFiveHalves`).
- Add `tfp.math.psd_kernels.Parabolic`.
- Add `tfp.experimental.unnest` utilities for accessing nested attributes.
- Enable pytree flattening for TFP distributions in JAX.
- More careful handling of NaN and +/-inf in {L-,}BFGS (a basic usage sketch follows this list).
- Remove Edward2 from TFP. Edward2 is now in its own repo at https://github.com/google/edward2 .
- Support vector-valued offsets in `sts.Sum`.
- Make `DeferredTensor` actually defer computation under the JAX and NumPy backends.
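For context on the BFGS/L-BFGS item, here is a small self-contained L-BFGS run; the quadratic objective is purely an illustrative stand-in.

```python
# Basic L-BFGS usage (the optimizer whose NaN/inf handling was hardened in
# this release).
import tensorflow as tf
import tensorflow_probability as tfp

def objective_and_grad(x):
  # Returns the value and gradient of sum((x - 2)^2).
  return tfp.math.value_and_gradient(
      lambda x: tf.reduce_sum((x - 2.0) ** 2, axis=-1), x)

result = tfp.optimizer.lbfgs_minimize(
    objective_and_grad,
    initial_position=tf.zeros([3]),
    tolerance=1e-8)

# result.converged should be True and result.position close to [2., 2., 2.].
```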
Huge thanks to all the contributors to this release!
- Adrian Buzea
- Alexey Radul
- Ben Lee
- Ben Poole
- Brian Patton
- Christopher Suter
- Colin Carroll
- Cyril Chimisov
- Dave Moore
- Du Phan
- Emily Fertig
- Eugene Brevdo
- Federico Tomasi
- François Chollet
- George Karpenkov
- Giovanni Palla
- Ian Langmore
- Jacob Burnim
- Jacob Valdez
- Jake VanderPlas
- Jason Zavaglia
- Jean-Baptiste Lespiau
- Jeff Pollock
- Joan Puigcerver
- Jonas Eschle
- Josh Darrieulat
- Joshua V. Dillon
- Junpeng Lao
- Kapil Sachdeva
- Kate Lin
- Kibeom Kim
- Luke Metz
- Mark Daoust
- Matteo Hessel
- Michal Brys
- Oren Bochman
- Padarn Wilson
- Pavel Sountsov
- Peter Hawkins
- Rif A. Saurous
- Ru Pei
- ST John
- Sharad Vikram
- Simeon Carstens
- Srinivas Vasudevan
- Tom O'Malley
- Tomer Kaftan
- Urs Köster
- Yash Katariya
- Yilei Yang