The release log for BoTorch.

## [0.8.0] - Dec 6, 2022

### Highlights
This release includes some backwards incompatible changes.
* Refactor `Posterior` and `MCSampler` modules to better support non-Gaussian distributions in BoTorch (#1486).
  * Introduced a `TorchPosterior` object that wraps a PyTorch `Distribution` object and makes it compatible with the rest of the `Posterior` API.
  * `PosteriorList` no longer accepts Gaussian base samples. It should be used with a `ListSampler` that includes the appropriate sampler for each posterior.
  * The MC acquisition functions no longer construct a Sobol sampler by default. Instead, they rely on a `get_sampler` helper, which dispatches an appropriate sampler based on the posterior provided.
  * The `resample` and `collapse_batch_dims` arguments to `MCSampler`s have been removed. The `ForkedRNGSampler` and `StochasticSampler` can be used to get the same functionality.
  * Refer to the PR for additional changes. We will update the website documentation to reflect these changes in a future release.
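The dispatch behind a helper like `get_sampler` can be pictured as a type-keyed registry that walks the posterior's class hierarchy. The sketch below is purely illustrative — `GaussianPosterior`, `QMCSampler`, and `register_sampler` are stand-in names, not BoTorch's actual classes or signatures:

```python
# Illustrative sketch of posterior-based sampler dispatch. All names here
# are hypothetical stand-ins, not BoTorch's real API.
_SAMPLER_REGISTRY = {}

def register_sampler(posterior_cls):
    """Associate a sampler factory with a posterior type."""
    def decorator(factory):
        _SAMPLER_REGISTRY[posterior_cls] = factory
        return factory
    return decorator

def get_sampler(posterior, num_samples):
    """Walk the posterior's MRO and return a sampler from the first match."""
    for cls in type(posterior).__mro__:
        if cls in _SAMPLER_REGISTRY:
            return _SAMPLER_REGISTRY[cls](num_samples)
    raise NotImplementedError(
        f"No sampler registered for {type(posterior).__name__}"
    )

class GaussianPosterior:
    """Stand-in for a Gaussian posterior type."""

class QMCSampler:
    """Stand-in for a quasi-Monte Carlo sampler."""
    def __init__(self, num_samples):
        self.num_samples = num_samples

@register_sampler(GaussianPosterior)
def _make_qmc_sampler(num_samples):
    return QMCSampler(num_samples)
```

Because lookup walks the MRO, subclasses of a registered posterior type automatically reuse the parent's sampler unless they register their own.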
* #1191 refactors much of `botorch.optim` to operate based on closures that abstract
away how losses (and gradients) are computed. By default, these closures are created
using multiply-dispatched factory functions (such as `get_loss_closure`), which may be
customized by registering methods with an associated dispatcher (e.g. `GetLossClosure`).
Future releases will contain tutorials that explore these features in greater detail.
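The essence of the closure-based pattern is that the optimizer only ever calls a closure returning a loss (and gradients) and never sees how they are computed. A minimal standalone sketch, assuming a toy quadratic loss — the factory and loop below are illustrative, not BoTorch's actual implementation:

```python
# Sketch of closure-based optimization: the optimization loop is decoupled
# from loss/gradient computation by a factory-produced closure.
# These names are illustrative, not BoTorch's real API.
from typing import Callable, List, Tuple

def get_loss_closure(params: List[float]) -> Callable[[], Tuple[float, List[float]]]:
    """Factory producing a closure for f(x) = sum_i (x_i - 3)^2."""
    def closure() -> Tuple[float, List[float]]:
        loss = sum((p - 3.0) ** 2 for p in params)
        grads = [2.0 * (p - 3.0) for p in params]
        return loss, grads
    return closure

def gradient_descent(params: List[float], lr: float = 0.1, steps: int = 200) -> List[float]:
    """Generic loop: only interacts with the closure, never the loss itself."""
    closure = get_loss_closure(params)
    for _ in range(steps):
        _, grads = closure()  # closure reads the current params in place
        for i, g in enumerate(grads):
            params[i] -= lr * g
    return params

params = gradient_descent([0.0, 10.0])
```

Swapping in a different loss means registering a different closure factory; the descent loop itself is untouched.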

#### New Features
* Add mixed optimization for list optimization (#1342).
* Add entropy search acquisition functions (#1458).
* Add utilities for straight-through gradient estimators for discretization functions (#1515).
* Add support for categoricals in the `Round` input transform and use STEs (#1516).
* Add closure-based optimizers (#1191).

#### Other Changes
* Do not count hitting `maxiter` as an optimization failure & update the default `maxiter` (#1478).
* `BoxDecomposition` cleanup (#1490).
* Deprecate `torch.triangular_solve` in favor of `torch.linalg.solve_triangular` (#1494).
* Various docstring improvements (#1496, #1499, #1504).
* Remove the `__getitem__` method from `LinearTruncatedFidelityKernel` (#1501).
* Handle Cholesky errors when fitting a fully Bayesian model (#1507).
* Make `eta` configurable in `apply_constraints` (#1526).
* Support SAAS ensemble models in RFFs (#1530).
* Deprecate `botorch.optim.numpy_converter` (#1191).
* Deprecate `fit_gpytorch_scipy` and `fit_gpytorch_torch` (#1191).

#### Bug Fixes
* Enforce the use of `float64` in `NdarrayOptimizationClosure` (#1508).
* Replace the deprecated `np.bool` with the equivalent `bool` (#1524).
* Fix an RFF bug when using `FixedNoiseGP` models (#1528).

## [0.7.3] - Nov 10, 2022

### Highlights