
Commit 7c1b2f5

saitcakmak authored and facebook-github-bot committed

Changelog for 0.8.0 (#1542)

Summary: Pull Request resolved: #1542
Reviewed By: j-wilson
Differential Revision: D41752753
fbshipit-source-id: 5b07404b36751352f6b8c216f39c120889139876

1 parent 9eda189 commit 7c1b2f5

File tree

2 files changed: +43 -1 lines changed


CHANGELOG.md

Lines changed: 41 additions & 0 deletions
@@ -2,6 +2,47 @@

The release log for BoTorch.

## [0.8.0] - Dec 6, 2022

### Highlights
This release includes some backwards incompatible changes.
* Refactor `Posterior` and `MCSampler` modules to better support non-Gaussian distributions in BoTorch (#1486).
  * Introduced a `TorchPosterior` object that wraps a PyTorch `Distribution` object and makes it compatible with the rest of the `Posterior` API.
  * `PosteriorList` no longer accepts Gaussian base samples. It should be used with a `ListSampler` that includes the appropriate sampler for each posterior.
  * The MC acquisition functions no longer construct a Sobol sampler by default. Instead, they rely on a `get_sampler` helper, which dispatches an appropriate sampler based on the posterior provided.
  * The `resample` and `collapse_batch_dims` arguments to `MCSampler`s have been removed. The `ForkedRNGSampler` and `StochasticSampler` can be used to get the same functionality.
  * Refer to the PR for additional changes. We will update the website documentation to reflect these changes in a future release.
* #1191 refactors much of `botorch.optim` to operate based on closures that abstract away how losses (and gradients) are computed. By default, these closures are created using multiply-dispatched factory functions (such as `get_loss_closure`), which may be customized by registering methods with an associated dispatcher (e.g. `GetLossClosure`). Future releases will contain tutorials that explore these features in greater detail.
#### New Features
* Add mixed optimization for list optimization (#1342).
* Add entropy search acquisition functions (#1458).
* Add utilities for straight-through gradient estimators for discretization functions (#1515).
* Add support for categoricals in Round input transform and use STEs (#1516).
* Add closure-based optimizers (#1191).
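The closure-based optimizers (#1191) revolve around callables that compute the loss and populate gradients, so the optimizer never needs to know how either is produced. A minimal torch-only illustration of the pattern (this is a sketch of the general closure idiom, not BoTorch's actual `get_loss_closure` implementation):

```python
import torch

# Parameter to optimize; loss is (x - 1)^2, minimized at x = 1.
x = torch.tensor([3.0], requires_grad=True)

def loss_closure():
    # The closure encapsulates loss computation and backprop; the
    # optimizer only calls it and reads back the loss and gradients.
    if x.grad is not None:
        x.grad.zero_()
    loss = (x - 1.0).pow(2).sum()
    loss.backward()
    return loss

opt = torch.optim.SGD([x], lr=0.1)
for _ in range(100):
    opt.step(loss_closure)  # optimizer is agnostic to how loss is computed
```

Swapping in a different loss, model, or gradient method then only requires providing a different closure, which is what the dispatched factory functions automate.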
#### Other Changes
* Do not count hitting maxiter as optimization failure & update default maxiter (#1478).
* `BoxDecomposition` cleanup (#1490).
* Deprecate `torch.triangular_solve` in favor of `torch.linalg.solve_triangular` (#1494).
* Various docstring improvements (#1496, #1499, #1504).
* Remove `__getitem__` method from `LinearTruncatedFidelityKernel` (#1501).
* Handle Cholesky errors when fitting a fully Bayesian model (#1507).
* Make eta configurable in `apply_constraints` (#1526).
* Support SAAS ensemble models in RFFs (#1530).
* Deprecate `botorch.optim.numpy_converter` (#1191).
* Deprecate `fit_gpytorch_scipy` and `fit_gpytorch_torch` (#1191).
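For the `torch.triangular_solve` deprecation (#1494), note that the two APIs take the matrix and right-hand side in opposite order; a minimal sketch of the migration:

```python
import torch

# Lower-triangular system A @ x = b.
A = torch.tensor([[2.0, 0.0],
                  [1.0, 3.0]])
b = torch.tensor([[2.0],
                  [5.0]])

# Deprecated: torch.triangular_solve(b, A, upper=False).solution
# Replacement -- note the swapped argument order (matrix first):
x = torch.linalg.solve_triangular(A, b, upper=False)
```

Unlike the deprecated call, `solve_triangular` returns the solution tensor directly rather than a named tuple.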
#### Bug Fixes
* Enforce use of float64 in `NdarrayOptimizationClosure` (#1508).
* Replace deprecated `np.bool` with equivalent `bool` (#1524).
* Fix RFF bug when using `FixedNoiseGP` models (#1528).

## [0.7.3] - Nov 10, 2022

### Highlights

scripts/run_tutorials.py

Lines changed: 2 additions & 1 deletion

@@ -28,7 +28,8 @@
     "thompson_sampling.ipynb",  # very slow without KeOps + GPU
     "composite_mtbo.ipynb",  # TODO: very slow, figure out if we can make it faster
     "Multi_objective_multi_fidelity_BO.ipynb",  # TODO: very slow, speed up
-    "composite_bo_with_hogp.ipynb",  # TODO: OOMing the nightly cron, reduce memory usage.
+    # TODO: OOMing the nightly cron, reduce memory usage.
+    "composite_bo_with_hogp.ipynb",
 }
