
Commit 27fbe47

more stuff on intro
1 parent 0e2387e commit 27fbe47

File tree

1 file changed: +11 -11 lines changed

docs/source/notebooks/GP-introduction.ipynb

Lines changed: 11 additions & 11 deletions
@@ -120,7 +120,7 @@
 "\n",
 "Here the `lengthscales` parameter is two dimensional; each dimension can have a different lengthscale. The reason we have to specify `input_dim`, the total number of columns of `X`, and `active_dims`, which of those columns or dimensions the covariance function will act on, is that `cov_func` hasn't actually seen the input data yet. The `active_dims` argument is optional, and defaults to all columns of the matrix of inputs. \n",
 "\n",
-"Covariance functions in PyMC3 closely follow the algebraic rules for kernels:\n",
+"Covariance functions in PyMC3 closely follow the algebraic rules for kernels, which allows users to combine covariance functions into new ones, for example:\n",
 "\n",
 "- The sum of two covariance functions is also a covariance function.\n",
 "\n",
@@ -137,9 +137,7 @@
 " \n",
 "    cov_func = eta**2 * pm.gp.cov.Matern32(...)\n",
 " \n",
-"- ...\n",
-"\n",
-"Like the `gp.*` objects, until the covariance functions are actually *evaluated* over a set of inputs, they are still a Python objects that aren't part of the model. To evaluate a covariance function and create an actual covariance matrix, call `cov_func(x, x)`, or `cov_func(x, x_new)`. "
+"For more information on combining covariance functions in PyMC3, check out the tutorial on covariance functions. Like the `gp.*` objects, until the covariance functions are actually *evaluated* over a set of inputs, they are still Python objects that aren't part of the model. To evaluate a covariance function and create an actual covariance matrix, call `cov_func(x, x)` or `cov_func(x, x_new)`. "
 ]
 },
 {
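To make the combination and evaluation rules in this hunk concrete, here is a minimal sketch, assuming the `input_dim`/`lengthscales` keywords described earlier in the notebook; the specific kernels, the value of `eta`, and the call to `.eval()` on the resulting Theano tensor are illustrative assumptions, not code from this commit:

```python
import numpy as np
import pymc3 as pm

# Scaling a kernel by eta**2 and summing two kernels each produce
# another valid covariance function (the algebraic rules above).
eta = 2.0
cov_func = (eta**2 * pm.gp.cov.Matern32(input_dim=1, lengthscales=0.5)
            + pm.gp.cov.ExpQuad(input_dim=1, lengthscales=2.0))

# Until evaluated over inputs, cov_func is just a Python object.
# Calling it builds the covariance matrix as a symbolic Theano
# tensor; .eval() materializes it as a concrete NumPy array.
x = np.linspace(0, 10, 5)[:, None]
K = cov_func(x, x).eval()  # a 5x5 covariance matrix
```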
@@ -148,16 +146,16 @@
 "source": [
 "# Example: `gp.Latent`\n",
 "\n",
-"The following is an example showing how to specify a simple model with a GP prior, and then sample from the posterior using NUTS. We build an example data set to use using a multivariate normal and known covariance function to generate the data so we can verify that the inference we perform is correct."
+"The following is an example showing how to specify a simple model with a GP prior, then sample from the posterior using NUTS. We build an example data set with a draw from a GP, so we can verify that the inference we perform is correct."
 ]
 },
 {
 "cell_type": "code",
-"execution_count": 1,
+"execution_count": 14,
 "metadata": {
 "ExecuteTime": {
-"end_time": "2017-08-04T21:00:54.024910Z",
-"start_time": "2017-08-04T21:00:53.091113Z"
+"end_time": "2017-08-05T00:30:53.357774Z",
+"start_time": "2017-08-05T00:30:53.348602Z"
 },
 "collapsed": true
 },
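The code cell itself is collapsed in this diff; for orientation, a sketch of the kind of `gp.Latent` model the section describes. The priors, kernel choice, and variable names below are placeholder assumptions (with `X` the inputs and `y` the noisy observations generated from the GP draw), not the notebook's actual cell:

```python
import pymc3 as pm

# X: n x 1 array of inputs, y: noisy observations of a known GP draw
with pm.Model() as model:
    ell = pm.Gamma("ell", alpha=2, beta=1)    # lengthscale prior
    eta = pm.HalfCauchy("eta", beta=5)        # amplitude prior
    cov = eta**2 * pm.gp.cov.Matern32(1, ell)

    gp = pm.gp.Latent(cov_func=cov)
    f = gp.prior("f", X=X)                    # latent GP function values

    sigma = pm.HalfCauchy("sigma", beta=5)
    y_ = pm.Normal("y", mu=f, sd=sigma, observed=y)  # IID Gaussian noise

    trace = pm.sample(1000)                   # NUTS is the default sampler
```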
@@ -437,7 +435,9 @@
 "source": [
 "# Example: `gp.Marginal`\n",
 "\n",
-"There is a more efficient way to model the last example. Most GP introductions or tutorials describe the scenario we just covered -- regression with IID Gaussian noise. This is a special case, but is the most common GP model that people use. Here there is no need to explicitly include the unknown function values as latent variables because $\\mathbf{f}_x$ can be integrated out analytically. The product of the GP prior probability distribution with a normal likelihood is also normal, and is called the *marginal likelihood*. Including the prior on the hyperparameters of the covariance function, we can write the *marginal posterior* as\n",
+"There is a more efficient way to model the last example. Most GP introductions or tutorials describe the scenario we just covered -- regression with IID Gaussian noise. This is the most common GP model that people use, but it's really a special case. When the noise is Gaussian, there is no need to explicitly include $\\mathbf{f}_x$ as latent variables because it can be integrated out analytically. \n",
+"\n",
+"As mentioned before, the product of the GP prior probability distribution with a normal likelihood is also normal. It's called the *marginal likelihood*. If we include the prior on the hyperparameters of the covariance function, we can write the *marginal posterior* as\n",
 "\n",
 "$$\n",
 "p(y \\mid x, \\theta)p(\\theta) = \\int p(y \\mid f, x, \\theta) \\, p(f \\mid x, \\theta) \\,\n",
@@ -453,9 +453,9 @@
 " - \\frac{n}{2}\\log (2 \\pi) + \\log p(\\theta)\n",
 "$$\n",
 "\n",
-"The first term penalizes lack of fit, the second term penalizes model complexity via the determinant of $K_{xx}$. The third term is just a constant. The final term is the log-prior of the covariance function hyperparameters. \n",
+"The first term penalizes lack of fit, the second term penalizes model complexity via the determinant of $K_{xx}$. The third term is just a constant. The final term on the right is the log-prior of the covariance function hyperparameters. \n",
 "\n",
-"We repeat the previous example using `gp.Marginal` instead. The code to specify this equivalent model is a little bit different that before. Notice that `gp.marginal_likelihood` subsumes both the GP prior and the Normal likelihood of the observed data, `y`. Also, since we are using the marginal likelihood, it is possible to use `find_MAP` to quickly get the value at the mode of the covariance function hyperparameters. "
+"The code to specify this equivalent model using `gp.Marginal` is a little bit different than before. The `gp.marginal_likelihood` method subsumes both the GP prior and the Normal likelihood of the observed data, `y`. Also, since we are using the marginal likelihood, it is possible to use `find_MAP` to quickly get the value at the mode of the covariance function hyperparameters. "
 ]
 },
 {
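For orientation, a sketch of the `gp.Marginal` version this hunk describes. Only `gp.marginal_likelihood` and `find_MAP` are named in the text; the priors, kernel, and names are placeholder assumptions, with `X` and `y` the same data as before:

```python
import pymc3 as pm

with pm.Model() as model:
    ell = pm.Gamma("ell", alpha=2, beta=1)
    eta = pm.HalfCauchy("eta", beta=5)
    cov = eta**2 * pm.gp.cov.Matern32(1, ell)

    gp = pm.gp.Marginal(cov_func=cov)

    sigma = pm.HalfCauchy("sigma", beta=5)
    # marginal_likelihood subsumes the GP prior and the Normal likelihood;
    # the latent f is integrated out analytically, so it never appears.
    y_ = gp.marginal_likelihood("y", X=X, y=y, noise=sigma)

    # With f marginalized out, the hyperparameter mode is cheap to find.
    mp = pm.find_MAP()
```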
