Commit 15a6f7a

committed: updating docs
1 parent 1058be1 commit 15a6f7a

File tree

4 files changed: +382 −390 lines changed

docs/source/gp.rst

Lines changed: 23 additions & 32 deletions
@@ -77,7 +77,7 @@ three predictor variables::

     ls = [2, 5]  # the lengthscales
     cov_func = pm.gp.cov.ExpQuad(input_dim=3, ls=ls, active_dims=[1, 2])

-Here the :code:`ls` parameter is two dimensional, allowing the second
+Here the :code:`ls`, or lengthscale, parameter is two dimensional, allowing the second
 and third dimension to have a different lengthscale. The reason we have to
 specify :code:`input_dim`, the total number of columns of :code:`X`, and
 :code:`active_dims`, which of those columns or dimensions the covariance
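The interaction of :code:`ls` and :code:`active_dims` in this hunk can be sketched in plain numpy. This is an illustrative reimplementation of the kernel formula for a single pair of points, not the PyMC3 source: only the active columns enter the kernel, each scaled by its own lengthscale.

```python
import numpy as np

# Minimal sketch (assumed behaviour, not the PyMC3 implementation) of what
# ExpQuad(input_dim=3, ls=[2, 5], active_dims=[1, 2]) computes for one pair
# of points: only columns 1 and 2 contribute, each with its own lengthscale.
def expquad(x, xp, ls, active_dims):
    d = (x[active_dims] - xp[active_dims]) / np.asarray(ls, dtype=float)
    return np.exp(-0.5 * np.sum(d ** 2))

x = np.array([0.0, 1.0, 2.0])
xp = np.array([9.0, 1.0, 2.0])   # differs only in the inactive column 0
k = expquad(x, xp, ls=[2, 5], active_dims=[1, 2])
print(k)  # 1.0: the inactive dimension has no effect on the covariance
```

Because column 0 is not in :code:`active_dims`, the two points above are indistinguishable to the kernel and their covariance is exactly 1.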
@@ -152,16 +152,7 @@ variable representing the function we are placing the prior over.
 The second argument is the inputs to the function that the prior is over,
 :code:`X`. The inputs are usually known and present in the data, but they can
 also be PyMC3 random variables. If the inputs are a Theano tensor or a
-PyMC3 random variable, the :code:`n_points` argument is required.
-
-.. note::
-
-  The :code:`n_points` argument is sometimes required due to how Theano and
-  PyMC3 handle the shape information of distributions. It needs to be specified
-  when :code:`X` or :code:`Xnew` is a Theano tensor or a random variable.
-  For :code:`prior` or :code:`marginal_likelihood`, it is the number of rows
-  of the inputs, :code:`X`. For :code:`conditional`, it is the number of
-  rows in the new inputs, :code:`X_new`.
+PyMC3 random variable, the :code:`shape` needs to be given.

 Usually at this point, inference is performed on the model. The
 :code:`conditional` method creates the conditional, or predictive,
@@ -214,8 +205,8 @@ with the contribution due to :math:`f_1 + f_2` factored out, we get
        K_1^{**} - K_1^{*^T}(K_1 + K_2)^{-1}K_1^* \right) \,.

-So one can break down GP models into individual components to examine how each
-contribute to the data. For more information, check out `David Duvenaud's PhD
+These equations show how to break down GP models into individual components to see how each
+contributes to the data. For more information, check out `David Duvenaud's PhD
 thesis <https://www.cs.toronto.edu/~duvenaud/thesis.pdf>`_.

 The GP objects in PyMC3 keep track of these marginals automatically. The
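The component-conditional covariance in the hunk above, :math:`K_1^{**} - K_1^{*^T}(K_1 + K_2)^{-1}K_1^*`, can be evaluated directly. A minimal numpy sketch (illustrative only, not the PyMC3 internals), assuming noise-free 1-D ExpQuad components with different lengthscales:

```python
import numpy as np

# Sketch of cov(f1* | f1 + f2) = K1** - K1*^T (K1 + K2)^{-1} K1*
def expquad_matrix(X, Xs, ls):
    # 1-D ExpQuad Gram matrix with lengthscale ls
    d = (X[:, None] - Xs[None, :]) / ls
    return np.exp(-0.5 * d ** 2)

X = np.linspace(0, 5, 20)        # observed inputs
X_star = np.linspace(0, 5, 7)    # new inputs
K1 = expquad_matrix(X, X, 1.0)   # slowly varying component
K2 = expquad_matrix(X, X, 0.2)   # quickly varying component
K1_s = expquad_matrix(X, X_star, 1.0)        # K1*
K1_ss = expquad_matrix(X_star, X_star, 1.0)  # K1**
jitter = 1e-8 * np.eye(len(X))   # numerical stabiliser for the solve
cov_f1 = K1_ss - K1_s.T @ np.linalg.solve(K1 + K2 + jitter, K1_s)
print(cov_f1.shape)  # (7, 7)
```

The result is a symmetric positive semi-definite matrix, as a conditional covariance (a Schur complement of a PSD matrix) must be.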
@@ -225,8 +216,8 @@ other implementations. The first block fits the GP prior. We denote
 :math:`f_1 + f_2` as just :math:`f` for brevity::

     with pm.Model() as model:
-        gp1 = pm.gp.Latent(mean_func1, cov_func1)
-        gp2 = pm.gp.Latent(mean_func2, cov_func2)
+        gp1 = pm.gp.Marginal(mean_func1, cov_func1)
+        gp2 = pm.gp.Marginal(mean_func2, cov_func2)

         # gp represents f1 + f2.
         gp = gp1 + gp2
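The addition :code:`gp = gp1 + gp2` corresponds to summing independent GPs, so the prior covariance of :math:`f = f_1 + f_2` is :math:`K_1 + K_2`. A hedged numpy check of this fact (illustrative, assumed ExpQuad components; not how PyMC3 represents the sum internally):

```python
import numpy as np

# Empirical check: samples of f1 + f2 have covariance close to K1 + K2.
def expquad_matrix(X, Xs, ls):
    d = (X[:, None] - Xs[None, :]) / ls
    return np.exp(-0.5 * d ** 2)

rng = np.random.default_rng(0)
X = np.linspace(0, 3, 10)
K1 = expquad_matrix(X, X, 1.0)
K2 = expquad_matrix(X, X, 0.3)
K = K1 + K2                      # covariance of the combined process
jitter = 1e-6 * np.eye(10)       # keeps the Cholesky factorisations stable
L1 = np.linalg.cholesky(K1 + jitter)
L2 = np.linalg.cholesky(K2 + jitter)
n = 20000
f = L1 @ rng.standard_normal((10, n)) + L2 @ rng.standard_normal((10, n))
emp_cov = f @ f.T / n            # empirical covariance of f1 + f2 samples
max_err = np.max(np.abs(emp_cov - K))
print(max_err)  # small, and shrinks as n grows
```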
@@ -236,33 +227,33 @@ other implementations. The first block fits the GP prior. We denote
         trace = pm.sample(1000)

-The second block produces conditional distributions, including :math:`f_2^*
-\mid f_1 + f_2`. Notice that extra arguments are required for conditionals of
-:math:`f_1` and :math:`f_2`, but not :math:`f`. This is because those arguments
-are cached when calling :code:`.marginal_likelihood`.
-
 To construct the conditional distribution of :code:`gp1` or :code:`gp2`, we
 also need to include the additional arguments, :code:`X`, :code:`y`, and
 :code:`noise`::

     with model:
         # conditional distributions of f1 and f2
-        f1_star = gp1.conditional("f1_star", X_star, X=X, y=y, noise=noise)
-        f2_star = gp2.conditional("f2_star", X_star, X=X, y=y, noise=noise)
+        f1_star = gp1.conditional("f1_star", X_star,
+                                  given={"X": X, "y": y, "noise": noise, "gp": gp})
+        f2_star = gp2.conditional("f2_star", X_star,
+                                  given={"X": X, "y": y, "noise": noise, "gp": gp})

-        # conditional of f1 + f2, additional kwargs not required
+        # conditional of f1 + f2, `given` not required
         f_star = gp.conditional("f_star", X_star)

+This second block produces the conditional distributions. Notice that extra
+arguments are required for conditionals of :math:`f_1` and :math:`f_2`, but not
+:math:`f`. This is because those arguments were cached when
+:code:`.marginal_likelihood` was called on :code:`gp`.
+
 .. note::
-  The additional arguments :code:`X`, :code:`y`, and :code:`noise` must be
-  provided as **keyword arguments**!
-
-  The :code:`gp` object keeps track of the inputs it used when :code:`marginal_likelihood`
-  or :code:`prior` was set. Since the marginal likelihood method of :code:`gp1` or
-  :code:`gp2` weren't called, their conditionals need to be provided with the required
-  inputs. In the same fashion as the prior, :code:`f_star`, :code:`f1_star` and
-  :code:`f2_star` are random variables that can now be used like any other random
-  variable in PyMC3.
+  When constructing conditionals, the additional arguments :code:`X`, :code:`y`,
+  :code:`noise` and :code:`gp` must be provided as a dict called :code:`given`!
+
+  Since the marginal likelihood methods of :code:`gp1` and :code:`gp2` weren't called,
+  their conditionals need to be provided with the required inputs. In the same
+  fashion as the prior, :code:`f_star`, :code:`f1_star` and :code:`f2_star` are random
+  variables that can now be used like any other random variable in PyMC3.

 Check the notebooks for detailed demonstrations of the usage of GP functionality
 in PyMC3.
