
Commit bb6b4bf

Balandat authored and facebook-github-bot committed

Various fixes to the website (#137)

Summary: Includes typos, broken links, styling.
Pull Request resolved: #137
Reviewed By: danielrjiang
Differential Revision: D15159210
Pulled By: Balandat
fbshipit-source-id: b4d03f30a8c2b8f13b65f19e6ccb4815b15e55de
1 parent (d2f29f5) · commit bb6b4bf

6 files changed: +89 -72 lines changed

docs/botorch_and_ax.md
Lines changed: 15 additions & 13 deletions

```diff
@@ -3,14 +3,15 @@ id: botorch_and_ax
 title: Using BoTorch with Ax
 ---
 
-[Ax](https://github.com/facebook/Ax) is a platform for sequential
-experimentation. It relies on BoTorch for implementing Bayesian Optimization
-algorithms, but provides higher-level APIs that make it easy and convenient to
-specify problems, visualize results, and benchmark new algorithms.
+[Ax](https://ax.dev) is a platform for sequential experimentation. It relies on
+BoTorch for implementing Bayesian Optimization algorithms, but provides
+higher-level APIs that make it easy and convenient to specify problems,
+visualize results, and benchmark new algorithms.
 It also comes with powerful metadata management, storage of results, and
 deployment-related APIs. Ax makes it convenient to use BoTorch in most standard
 Bayesian Optimization settings.
-Simply put, BoTorch provides the building blocks for the engine, while Ax makes it easy to drive the car.
+Simply put, BoTorch provides the building blocks for the engine, while Ax makes
+it easy to drive the car.
 
 
 ![BoTorch and Ax](assets/botorch_and_ax.svg)
@@ -35,9 +36,9 @@ optimization researcher, such as keeping track of results, and transforming
 inputs and outputs to ranges that will ensure sensible handling in (G)PyTorch.
 The functionality provided by Ax should apply to most standard use cases.
 
-Even if you want something more custom, it may still be easier to use the Ax framework.
-For instance, say you want to experiment with using a different kind of
-surrogate model, or a new type of acquisition function, but leave the rest of
+Even if you want something more custom, it may still be easier to use the Ax
+framework. For instance, say you want to experiment with using a different kind
+of surrogate model, or a new type of acquisition function, but leave the rest of
 the the Bayesian Optimization loop untouched. It is then straightforward to plug
 your custom BoTorch model or acquisition function into Ax to take advantage of
 Ax's various loop control APIs, as well as its powerful automated metadata
@@ -58,13 +59,14 @@ Optimization loop in BoTorch. The
 this can be done.
 
 You may also consider working purely in BoTorch if you want to be able to
-understand and control every single aspect of your BayesOpt loop - Ax's simplicity
-necessarily means that certain powerful BoTorch features will not be fully exposed to the user.
+understand and control every single aspect of your BayesOpt loop - Ax's
+simplicity necessarily means that certain powerful BoTorch features will not be
+fully exposed to the user.
 
 
 ## Prototyping in BoTorch
 
 The modular design of BoTorch makes it very easy to prototype and debug
-individual components in an interactive fashion in a Jupyter notebook just like you might do with PyTorch.
-Once these building blocks have been designed and tested, they can easily
-be integrated into Ax.
+individual components in an interactive fashion in a Jupyter notebook just like
+you might do with PyTorch. Once these building blocks have been designed and
+tested, they can easily be integrated into Ax.
```
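To make the "new type of acquisition function" scenario in the doc above concrete, here is a minimal sketch of a custom analytic acquisition function of the kind one might plug into Ax. It is illustrative only and not part of this commit: `ScaledUCB` and its formula are hypothetical, and the sketch assumes BoTorch's `AnalyticAcquisitionFunction` base class and a single-output GP model.

```python
import torch
from botorch.acquisition.analytic import AnalyticAcquisitionFunction


class ScaledUCB(AnalyticAcquisitionFunction):
    """Hypothetical UCB variant: posterior mean plus a scaled posterior std-dev."""

    def __init__(self, model, beta: float):
        super().__init__(model=model)
        self.beta = beta

    def forward(self, X: torch.Tensor) -> torch.Tensor:
        # X has shape `batch_shape x 1 x d` (analytic acquisition functions use q=1).
        posterior = self.model.posterior(X)
        mean = posterior.mean.squeeze(-1).squeeze(-1)
        sigma = posterior.variance.clamp_min(1e-9).sqrt().squeeze(-1).squeeze(-1)
        return mean + self.beta * sigma
```

Swapped in for `UpperConfidenceBound` in the quickstart further down, such a class would exercise the same optimization machinery.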

docs/design_philosophy.md
Lines changed: 19 additions & 15 deletions

```diff
@@ -25,8 +25,8 @@ BoTorch adheres to the following main design tenets:
 
 ## Parallelism Through Batched Computations
 
-Batching (as in batching data or batching computations) is a central component to
-all modern deep learning platforms and plays a critical role in the design of
+Batching (as in batching data or batching computations) is a central component
+to all modern deep learning platforms and plays a critical role in the design of
 BoTorch. Examples of batch computations in BoTorch include:
 
 1. A batch of candidate points $X$ to be evaluated in parallel on the black-box
@@ -35,10 +35,11 @@ BoTorch. Examples of batch computations in BoTorch include:
 2. A batch of q-batches to be evaluated in parallel on the surrogate model of
    the black-box function. These facilitate fast evaluation on modern hardware
    such as GPUs and multi-core CPUs with advanced instruction sets (e.g. AVX).
-   In BoTorch, we refer to a batch of this type as **"t-batch"** (as in "torch-batch").
-3. A **batched** surrogate **model**, each batch of which models a different output
-   (which is useful for multi-objective Bayesian Optimization). This kind of
-   batching also aims to exploit modern hardware architecture.
+   In BoTorch, we refer to a batch of this type as **"t-batch"** (as in
+   "torch-batch").
+3. A **batched** surrogate **model**, each batch of which models a different
+   output (which is useful for multi-objective Bayesian Optimization). This kind
+   of batching also aims to exploit modern hardware architecture.
 
 Note that none of these notions of batch pertains to the batching of *training
 data*, which is commonly done in training Neural Network models (sometimes
@@ -48,12 +49,15 @@ stochastic gradient descent using mini-batch training, BoTorch itself abstracts
 away from this.
 
 For an in-depth look at the different batch notions in BoTorch, take a look at
-the [Batching in BoTorch](#batching) section.
+the [Batching in BoTorch](batching) section.
```
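As a concrete illustration of the q-batch and t-batch notions in the hunk above (tensor shapes only; illustrative, not part of this commit):

```python
import torch

d, q, t = 2, 3, 5  # feature dimension, q-batch size, number of t-batches

# One q-batch: q = 3 candidate points to be evaluated jointly on the surrogate.
X_q = torch.rand(q, d)      # shape: q x d

# A t-batch of 5 such q-batches, evaluated in parallel (e.g. on a GPU).
X_t = torch.rand(t, q, d)   # shape: t x q x d

# An acquisition function maps each q-batch to a scalar value, so evaluating
# it on X_t yields one value per t-batch element, i.e. a tensor of shape (t,).
```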
docs/design_philosophy.md (continued):

```diff
 ## Optimizing Acquisition Functions
 
-While BoTorch tries to align as closely as possible with PyTorch when possible, optimization of acquisition functions requires a somewhat different approach. We now describe this discrepancy and explain in detail why we made this design decision.
+While BoTorch tries to align as closely as possible with PyTorch when possible,
+optimization of acquisition functions requires a somewhat different approach.
+We now describe this discrepancy and explain in detail why we made this design
+decision.
 
 In PyTorch, modules typically map (batches of) data to an output, where the
 mapping is parameterized by the parameters of the modules (often the weights
@@ -80,20 +84,20 @@ optimizing a model with these algorithms is by extracting the module's
 parameters (e.g. using `parameters()`), and writing a manual optimization loop
 that calls `step()` on a torch `Optimizer` object.
 
-Optimizing acquisition functions is different since the problem
-dimensionality is often much smaller. Indeed, optimizing over $q$ design points in a
+Optimizing acquisition functions is different since the problem dimensionality
+is often much smaller. Indeed, optimizing over $q$ design points in a
 $d$-dimensional feature space results in $qd$ scalar parameters to optimize
 over. Both $q$ and $d$ are often quite small, and hence so is the dimensionality
 of the problem.
 Moreover, the optimization problem can be cast as a deterministic one (either
 because an analytic acquisition function is used, or because the
 reparameterization trick is employed to render the Monte-Carlo-based evaluation
 of the acquisition function deterministic in terms of the input tensor $X$).
-As a result, optimization algorithms that are typically inadmissible for problems
-such as training Neural Networks become promising alternatives to standard
-first-order methods. In particular, this includes quasi-second order methods
-(such as L-BFGS or SLSQP) that approximate local curvature of the acquisition
-function by using past gradient information.
+As a result, optimization algorithms that are typically inadmissible for
+problems such as training Neural Networks become promising alternatives to
+standard first-order methods. In particular, this includes quasi-second order
+methods (such as L-BFGS or SLSQP) that approximate local curvature of the
+acquisition function by using past gradient information.
 These methods are currently not well supported in the `torch.optim` package,
 which is why BoTorch provides a custom interface that wraps the optimizers from
 the `scipy.optimize` module.
```
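The idea of wrapping `scipy.optimize` around a deterministic, differentiable acquisition function can be sketched as follows. This is a minimal illustration of the approach described in the hunk above, not BoTorch's actual optimizer; `acq_func` here is a toy stand-in for any callable mapping a `b x q x d` tensor to `b` acquisition values.

```python
import numpy as np
import torch
from scipy.optimize import minimize

q, d = 1, 2  # one candidate point in a 2-dimensional feature space

# Toy stand-in acquisition function (deterministic and differentiable in X).
def acq_func(X):
    return -(X - 0.5).pow(2).sum(dim=(-2, -1))

def neg_acq_with_grad(x):
    """Evaluate -acq_func and its gradient at a flat numpy point x."""
    X = torch.tensor(x, dtype=torch.double, requires_grad=True)
    loss = -acq_func(X.view(1, q, d)).sum()
    loss.backward()
    return loss.item(), X.grad.numpy()

x0 = np.random.rand(q * d)  # random starting point in the unit cube
res = minimize(neg_acq_with_grad, x0, jac=True, method="L-BFGS-B",
               bounds=[(0.0, 1.0)] * (q * d))
candidate = torch.from_numpy(res.x).view(q, d)  # should land near 0.5
```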

docs/getting_started.md
Lines changed: 2 additions & 2 deletions

````diff
@@ -1,6 +1,6 @@
 ---
 id: getting_started
-title: Getting started
+title: Getting Started
 ---
 
 This section shows you how to get your feet wet with BoTorch.
@@ -39,7 +39,7 @@ on GitHub.
 
 ## Basic Components
 
-Here's a quick run down of the main components of a Bayesian optimization loop.
+Here's a quick run down of the main components of a Bayesian Optimization loop.
 
 1. Fit a Gaussian Process model to data
    ```python
````

docs/introduction.md
Lines changed: 19 additions & 14 deletions

```diff
@@ -3,10 +3,12 @@ id: introduction
 title: Introduction
 ---
 
-BoTorch (pronounced like "blow-torch") is a library for [Bayesian optimization](https://en.wikipedia.org/wiki/Bayesian_optimization)
-research built on top of [PyTorch](https://pytorch.org/), and is part of the PyTorch ecosystem.
+BoTorch (pronounced like "blow-torch") is a library for
+[Bayesian Optimization](https://en.wikipedia.org/wiki/Bayesian_optimization)
+research built on top of [PyTorch](https://pytorch.org/), and is part of the
+PyTorch ecosystem.
 
-Bayesian optimization (BayesOpt) is an established technique for sequential
+Bayesian Optimization (BayesOpt) is an established technique for sequential
 optimization of costly-to-evaluate black-box functions. It can be applied to a
 wide variety of problems, including hyperparameter optimization for machine
 learning algorithms, A/B testing, as well as many scientific and engineering
@@ -15,19 +17,21 @@ problems.
 BoTorch is best used in tandem with [Ax](https://ax.dev), Facebook's open-source
 adaptive experimentation platform, which provides an easy-to-use interface for
 defining, managing and running sequential experiments, while handling
-(meta-)data management, transformations, and systems integration. Users who just want an easy-to-use suite for Bayesian optimization [should start with Ax](https://ax.dev/docs/bayesopt).
+(meta-)data management, transformations, and systems integration. Users who just
+want an easy-to-use suite for Bayesian Optimization
+[should start with Ax](https://ax.dev/docs/bayesopt).
 
 
 ## Why BoTorch?
 
 ### Improved Developer Efficiency
 
 BoTorch provides a modular and easily extensible interface for composing
-Bayesian optimization primitives, including probabilistic models, acquisition
+Bayesian Optimization primitives, including probabilistic models, acquisition
 functions, and optimizers.
 
 It significantly improves developer efficiency by utilizing quasi-Monte-Carlo
-acquisition functions (by ways of the "re-parameterization trick"
+acquisition functions (by way of the "re-parameterization trick"
 [^AutoEncVarBayes], [^ReparamAcq]), which makes it straightforward to implement
 new ideas without having to impose restrictive assumptions about the underlying
 model. Specifically, it avoids pen and paper math to derive analytic expressions
@@ -39,19 +43,20 @@ rich multi-output models with multiple correlated outcomes.
 BoTorch follows the same modular design philosophy as PyTorch, which makes it
 very easy for users to swap out or rearrange individual components in order to
 customize all aspects of their algorithm, thereby empowering researchers to do
-state-of-the art research on modern Bayesian optimization methods.
+state-of-the art research on modern Bayesian Optimization methods.
 
 
 ### State-of-the-art Modeling
 
-Bayesian optimization traditionally relies heavily on Gaussian Process (GP)
+Bayesian Optimization traditionally relies heavily on Gaussian Process (GP)
 models, which provide well-calibrated uncertainty estimates. BoTorch provides
 first-class support for state-of-the art probabilistic models in
 [GPyTorch](https://gpytorch.ai), a library for efficient and scalable GPs
-implemented in PyTorch (and to which the BoTorch authors have significantly contributed).
+implemented in PyTorch (and to which the BoTorch authors have significantly
+contributed).
 This includes support for multi-task GPs, deep kernel learning, deep GPs, and
 approximate inference. This enables using GP models for problems that have
-traditionally not been amenable to Bayesian optimization techniques.
+traditionally not been amenable to Bayesian Optimization techniques.
 
 In addition, BoTorch's lightweight APIs are model-agnostic (they can for example
 work with [Pyro](http://pyro.ai) models), and support optimization of
@@ -76,7 +81,7 @@ optimization of acquisition functions operating on differentiable models.
 
 ### Bridging the Gap Between Research and Production
 
-BoTorch implements modular building blocks for modern Bayesian optimization.
+BoTorch implements modular building blocks for modern Bayesian Optimization.
 It bridges the gap between research and production by being a very flexible
 research framework, but at the same time, a reliable, production-grade
 implementation that integrates well with other higher-level platforms,
@@ -86,13 +91,13 @@ specifically [Ax](https://ax.dev).
 ## Target Audience
 
 The primary audience for hands-on use of BoTorch are researchers and
-sophisticated practitioners in Bayesian optimization and AI.
+sophisticated practitioners in Bayesian Optimization and AI.
 
 We recommend using BoTorch as a low-level API for implementing new algorithms
 for Ax. Ax has been designed to be an easy-to-use platform for end-users, which
-at the same time is flexible enough for Bayesian optimization researchers to
+at the same time is flexible enough for Bayesian Optimization researchers to
 plug into for handling of feature transformations, (meta-)data management,
-storage, etc. See [Using BoTorch with Ax](../botorch_and_ax) for more details.
+storage, etc. See [Using BoTorch with Ax](botorch_and_ax) for more details.
 
 We recommend that end-users who are not actively doing research on Bayesian
 Optimization simply use Ax.
```
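For readers unfamiliar with the re-parameterization trick mentioned in the hunk above, a self-contained sketch (illustrative only; the posterior quantities here are fixed toy values rather than the output of a fitted model): writing posterior samples as a deterministic transform of fixed base samples keeps a Monte-Carlo acquisition estimate differentiable with respect to whatever the posterior mean and covariance depend on (in BoTorch, the candidate set $X$).

```python
import torch

q, n_samples = 2, 4096
# Toy posterior over q points: mean `mu` and covariance Cholesky factor `L`.
# In a real model both would be differentiable functions of the candidates X.
mu = torch.tensor([0.10, 0.30], requires_grad=True)
L = torch.tensor([[0.20, 0.00], [0.05, 0.15]])

base_samples = torch.randn(n_samples, q)   # fixed base samples
samples = mu + base_samples @ L.T          # deterministic given base_samples

best_f = 0.25                              # incumbent best observed value
qei = (samples.max(dim=-1).values - best_f).clamp_min(0).mean()
qei.backward()                             # gradients flow back to mu
print(qei.item(), mu.grad)
```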

website/pages/en/index.js
Lines changed: 29 additions & 27 deletions

````diff
@@ -113,35 +113,34 @@ class Index extends React.Component {
     const pre = "```";
     // Example for model fitting
     const modelFitCodeExample = `${pre}python
->>> import torch
->>> from botorch.models import SingleTaskGP
->>> from botorch.fit import fit_gpytorch_model
->>> from gpytorch.mlls import ExactMarginalLogLikelihood
-
->>> train_X = torch.rand(10, 2)
->>> Y = 1 - torch.norm(train_X - 0.5, dim=-1) + 0.1 * torch.rand(10)
->>> train_Y = (Y - Y.mean()) / Y.std()
-
->>> gp = SingleTaskGP(train_X, train_Y)
->>> mll = ExactMarginalLogLikelihood(gp.likelihood, gp)
->>> fit_gpytorch_model(mll);
+import torch
+from botorch.models import SingleTaskGP
+from botorch.fit import fit_gpytorch_model
+from gpytorch.mlls import ExactMarginalLogLikelihood
+
+train_X = torch.rand(10, 2)
+Y = 1 - torch.norm(train_X - 0.5, dim=-1) + 0.1 * torch.rand(10)
+train_Y = (Y - Y.mean()) / Y.std()
+
+gp = SingleTaskGP(train_X, train_Y)
+mll = ExactMarginalLogLikelihood(gp.likelihood, gp)
+fit_gpytorch_model(mll)
 `;
     // Example for defining an acquisition function
     const constrAcqFuncExample = `${pre}python
->>> from botorch.acquisition import UpperConfidenceBound
+from botorch.acquisition import UpperConfidenceBound
 
->>> UCB = UpperConfidenceBound(gp, beta=0.1)
+UCB = UpperConfidenceBound(gp, beta=0.1)
 `;
     // Example for optimizing candidates
     const optAcqFuncExample = `${pre}python
->>> from botorch.optim import joint_optimize
-
->>> bounds = torch.stack([torch.zeros(2), torch.ones(2)])
->>> candidate = joint_optimize(
-    UCB, bounds=bounds, q=1, num_restarts=5, raw_samples=20,
-)
->>> candidate
-tensor([0.4887, 0.5063])
+from botorch.optim import joint_optimize
+
+bounds = torch.stack([torch.zeros(2), torch.ones(2)])
+candidate = joint_optimize(
+    UCB, bounds=bounds, q=1, num_restarts=5, raw_samples=20,
+)
+candidate # tensor([0.4887, 0.5063])
 `;
     //
     const QuickStart = () => (
@@ -152,19 +151,22 @@ tensor([0.4887, 0.5063])
       <Container>
         <ol>
           <li>
-            Install BoTorch:
+            <h4>Install BoTorch:</h4>
+            <a>via conda (recommended):</a>
+            <MarkdownBlock>{bash`conda install botorch -c pytorch`}</MarkdownBlock>
+            <a>via pip:</a>
             <MarkdownBlock>{bash`pip install botorch`}</MarkdownBlock>
           </li>
           <li>
-            Fit a model:
+            <h4>Fit a model:</h4>
             <MarkdownBlock>{modelFitCodeExample}</MarkdownBlock>
           </li>
           <li>
-            Construct an acquisition function:
+            <h4>Construct an acquisition function:</h4>
             <MarkdownBlock>{constrAcqFuncExample}</MarkdownBlock>
           </li>
           <li>
-            Optimize the acquisition function:
+            <h4>Optimize the acquisition function:</h4>
             <MarkdownBlock>{optAcqFuncExample}</MarkdownBlock>
           </li>
         </ol>
@@ -192,7 +194,7 @@ tensor([0.4887, 0.5063])
       title: 'Built on PyTorch',
     },
     {
-      content: 'Support for scalable GPs. Run code on multiple devices.',
+      content: 'Support for scalable GPs via GPyTorch. Run code on multiple devices.',
      image: `${baseUrl}img/arrows_expanding_colored.svg`,
      imageAlign: 'top',
      title: 'Scalable',
````
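Put together, the three quickstart snippets above form one complete Bayesian Optimization step. A runnable sketch under the API shown in this diff (`fit_gpytorch_model` and `joint_optimize` are the names used on the page; the exact candidate values will vary from run to run):

```python
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_model
from botorch.acquisition import UpperConfidenceBound
from botorch.optim import joint_optimize
from gpytorch.mlls import ExactMarginalLogLikelihood

# 1. Fit a GP surrogate to 10 random 2-d points.
train_X = torch.rand(10, 2)
Y = 1 - torch.norm(train_X - 0.5, dim=-1) + 0.1 * torch.rand(10)
train_Y = (Y - Y.mean()) / Y.std()
gp = SingleTaskGP(train_X, train_Y)
mll = ExactMarginalLogLikelihood(gp.likelihood, gp)
fit_gpytorch_model(mll)

# 2. Construct an acquisition function on the fitted model.
UCB = UpperConfidenceBound(gp, beta=0.1)

# 3. Optimize it over the unit square to obtain the next candidate.
bounds = torch.stack([torch.zeros(2), torch.ones(2)])
candidate = joint_optimize(UCB, bounds=bounds, q=1, num_restarts=5, raw_samples=20)
```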

website/static/css/custom.css
Lines changed: 5 additions & 1 deletion

```diff
@@ -69,12 +69,16 @@ div.productShowcaseSection {
 
 .productShowcaseSection > h2 {
   font-variant: small-caps;
-  font-weight: 300;
+  font-weight: 360;
   margin: 0px;
   padding: 0px;
   color: #1C60F7;
 }
 
+.productShowcaseSection p {
+  font-weight: 360;
+}
+
 .productShowcaseSection div.container {
   padding: 40px 0px;
 }
```
