Commit 241404c

Updated text on prediction
1 parent bc22911 commit 241404c

File tree

2 files changed: +2 −2 lines changed

examples/variational_inference/bayesian_neural_network_advi.ipynb

Lines changed: 1 addition & 1 deletion

@@ -384,7 +384,7 @@
   "source": [
    "Now that we trained our model, lets predict on the hold-out set using a posterior predictive check (PPC). We can use {func}`pymc.sample_posterior_predictive` to generate new data (in this case class predictions) from the posterior (sampled from the variational estimation).\n",
    "\n",
-   "To predict on the entire test set (and not just the minibatches) we need to create a new model object that removes the minibatches, and predicts on the whole data set. Notice that we are using our fitted `trace` to sample from the posterior predictive distribution, using the posterior estimates from the original model. The {class}`Flat` distribution is just a placeholder to make the model work; the actual values are sampled from the posterior."
+   "To predict on the entire test set (and not just the minibatches) we need to create a new model object that removes the minibatches. Notice that we are using our fitted `trace` to sample from the posterior predictive distribution, using the posterior estimates from the original model. There is no new inference here, we are just using the same model and the same posterior estimates to generate predictions. The {class}`Flat` distribution is just a placeholder to make the model work; the actual values are sampled from the posterior."
   ]
  },
  {

examples/variational_inference/bayesian_neural_network_advi.myst.md

Lines changed: 1 addition & 1 deletion

@@ -205,7 +205,7 @@ trace = approx.sample(draws=5000)

 Now that we trained our model, lets predict on the hold-out set using a posterior predictive check (PPC). We can use {func}`pymc.sample_posterior_predictive` to generate new data (in this case class predictions) from the posterior (sampled from the variational estimation).

-To predict on the entire test set (and not just the minibatches) we need to create a new model object that removes the minibatches, and predicts on the whole data set. Notice that we are using our fitted `trace` to sample from the posterior predictive distribution, using the posterior estimates from the original model. The {class}`Flat` distribution is just a placeholder to make the model work; the actual values are sampled from the posterior.
+To predict on the entire test set (and not just the minibatches) we need to create a new model object that removes the minibatches. Notice that we are using our fitted `trace` to sample from the posterior predictive distribution, using the posterior estimates from the original model. There is no new inference here, we are just using the same model and the same posterior estimates to generate predictions. The {class}`Flat` distribution is just a placeholder to make the model work; the actual values are sampled from the posterior.

 ```{code-cell} ipython3
 def sample_posterior_predictive(X_test, Y_test, trace, n_hidden=5):
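The idea in the changed paragraph ("no new inference here") can be illustrated with a minimal, framework-free NumPy sketch: prediction only forward-passes the test data through the network once per stored posterior draw of the weights, then averages. All names and shapes below are hypothetical stand-ins for the fitted PyMC `trace`, not the notebook's actual API.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict_from_trace(X_test, w_in_draws, w_out_draws):
    """Posterior predictive mean: average class probabilities over draws.

    Each draw is a full set of network weights sampled during fitting;
    no inference happens here, only forward passes.
    """
    probs = []
    for w_in, w_out in zip(w_in_draws, w_out_draws):
        hidden = np.tanh(X_test @ w_in)              # hidden layer activations
        logits = hidden @ w_out                      # scalar logit per point
        probs.append(1.0 / (1.0 + np.exp(-logits)))  # sigmoid -> P(class = 1)
    return np.mean(probs, axis=0)                    # average over draws

# Hypothetical "trace": 100 posterior draws of weights (2 inputs, 5 hidden units)
w_in_draws = rng.normal(size=(100, 2, 5))
w_out_draws = rng.normal(size=(100, 5))

X_test = rng.normal(size=(40, 2))                    # full test set, no minibatches
p = predict_from_trace(X_test, w_in_draws, w_out_draws)
pred = (p > 0.5).astype(int)                         # hard class predictions
```

Because the loop touches the whole `X_test` at once, this corresponds to the notebook's point that the prediction model drops the minibatches while reusing the same posterior estimates.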

0 commit comments
