
Commit 7b8fe7e: Revamp tutorial

1 parent 1960be3, commit 7b8fe7e

File tree

4 files changed: +240 additions, -686 deletions

examples/Covid19_Initial_Posterior_Estimation.ipynb

Lines changed: 7 additions & 7 deletions
@@ -57,7 +57,7 @@
 "\n",
 "In this tutorial, we will illustrate how to perform posterior inference on simple, stationary SIR-like models (complex models will be tackled in a further notebook). SIR-like models comprise suitable illustrative examples, since they generate time-series and their outputs represent the results of solving a system of ordinary differential equations (ODEs).\n",
 "\n",
-"The details for tackling stochastic epidemiological models are described in our corresponding paper, which you can consult for a more formal exposition and a more comprehensive treatment of neural architectures:\n",
+"The details for tackling stochastic epidemiological models with neural networks are described in our corresponding paper, which you can consult for a more formal exposition and a more comprehensive treatment of neural architectures:\n",
 "\n",
 "<em>OutbreakFlow: Model-based Bayesian inference of disease outbreak dynamics with invertible neural networks and its application to the COVID-19 pandemics in Germany</em> https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1009472"
 ]
@@ -69,7 +69,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"RNG = np.random.default_rng(2022)"
+"RNG = np.random.default_rng(2023)"
 ]
 },
 {
@@ -148,7 +148,7 @@
 "outputs": [],
 "source": [
 "def model_prior():\n",
-"    \"\"\"Generates random draws from the prior.\"\"\"\n",
+"    \"\"\"Generates a random draw from the joint prior.\"\"\"\n",
 "\n",
 "    lambd = RNG.lognormal(mean=np.log(0.4), sigma=0.5)\n",
 "    mu = RNG.lognormal(mean=np.log(1 / 8), sigma=0.2)\n",
@@ -173,11 +173,11 @@
 "id": "retained-namibia",
 "metadata": {},
 "source": [
-"During training, we will also standardize the prior draws, that is, ensure zero means and unit scale. We will do this purely for technical reasons - neural networks like scaled values. In addition, our current prior ranges differ vastly, so each parameter will contribute disproportionately to the loss function.\n",
+"During training, we will also standardize the prior draws, that is, ensure zero location and unit scale. We will do this purely for technical reasons - neural networks like scaled values. In addition, our current prior ranges differ vastly, so each parameter axis may contribute disproportionately to the loss function.\n",
 "\n",
 "Here, we will use the `estimate_means_and_stds()` method of a `Prior` instance, which will estimate the prior means and standard deviations from random draws. We could have also just taken the analytic means and standard deviations, but these may not be available in all settings (e.g., implicit priors).\n",
 "\n",
-"<strong>Caution:</strong> Make sure you have a seed or you set a seed whenever you are doing a Monte-Carlo estimation, since your results might differ slightly due to the empirical variation of the estimates!"
+"<strong>Caution:</strong> Make sure you have a seed or you set a seed whenever you are doing Monte Carlo estimation, since your results might differ slightly due to the empirical variation of the estimates!"
 ]
 },
 {
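The standardization discussed in the hunk above can be sketched with plain NumPy. The `model_prior` below reuses the lognormal draws from this commit's diff; the Monte Carlo estimation of means and standard deviations is a hypothetical stand-in for what the `Prior.estimate_means_and_stds()` method mentioned in the notebook computes from random draws.

```python
import numpy as np

# Seeded generator, per the Caution note above
rng = np.random.default_rng(2023)

def model_prior(rng):
    """Two lognormal prior draws with vastly different scales, as in the notebook."""
    lambd = rng.lognormal(mean=np.log(0.4), sigma=0.5)
    mu = rng.lognormal(mean=np.log(1 / 8), sigma=0.2)
    return np.array([lambd, mu])

# Monte Carlo estimates of the prior means/stds from random draws
draws = np.stack([model_prior(rng) for _ in range(10_000)])
prior_means = draws.mean(axis=0)
prior_stds = draws.std(axis=0, ddof=1)

# Standardize: zero location and unit scale per parameter axis
z = (draws - prior_means) / prior_stds
```

Because the draws are standardized by their own sample statistics, the result has exactly zero mean and unit standard deviation up to floating-point error, regardless of the seed.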
@@ -215,7 +215,7 @@
 "\n",
 "def convert_params(mu, phi):\n",
 "    \"\"\"Helper function to convert mean/dispersion parameterization of a negative binomial to N and p,\n",
-"    as expected by numpy.\n",
+"    as expected by numpy's negative_binomial.\n",
 "\n",
 "    See https://en.wikipedia.org/wiki/Negative_binomial_distribution#Alternative_formulations\n",
 "    \"\"\"\n",
@@ -227,7 +227,7 @@
 "\n",
 "\n",
 "def stationary_SIR(params, N, T, eps=1e-5):\n",
-"    \"\"\"Performs a forward simulation from the stationary SIR model given a random draw from the prior,\"\"\"\n",
+"    \"\"\"Performs a forward simulation from the stationary SIR model given a random draw from the prior.\"\"\"\n",
 "\n",
 "    # Extract parameters and round I0 and D\n",
 "    lambd, mu, D, I0, psi = params\n",

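The `convert_params` helper touched in this file's diff follows the alternative formulation linked in its docstring: for a negative binomial with mean `mu` and dispersion `phi`, numpy's `negative_binomial(n, p)` takes `n = phi` and `p = phi / (phi + mu)`. A self-contained sketch (the sanity-check values below are illustrative, not from the commit):

```python
import numpy as np

def convert_params(mu, phi):
    """Convert mean/dispersion (mu, phi) of a negative binomial to the (n, p)
    pair expected by numpy's negative_binomial.

    See https://en.wikipedia.org/wiki/Negative_binomial_distribution#Alternative_formulations
    """
    p = phi / (phi + mu)
    return phi, p

# Sanity check against the known moments: mean = mu, variance = mu + mu**2 / phi
rng = np.random.default_rng(1)
n, p = convert_params(mu=10.0, phi=5.0)
draws = rng.negative_binomial(n, p, size=200_000)
```

With `mu=10` and `phi=5`, the empirical mean should land near 10 and the empirical variance near `10 + 100 / 5 = 30`.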
examples/Intro_Amortized_Posterior_Estimation.ipynb

Lines changed: 3 additions & 3 deletions
@@ -49,8 +49,8 @@
 "\n",
 "Welcome to the very first tutorial on using <strong>BayesFlow</strong> for amortized posterior estimation! In this notebook, we will estimate the means of a multivariate Gaussian model and illustrate some features of the library along the way. Above, we have already imported the core entities we will need for this notebook. In brief:\n",
 "\n",
-"* The module `simulations` contains high-level wrappers for gluing together priors, simulators, and context generators into a single `GenerateModel` object, which will generate all quantities of interest for a modeling scenario.\n",
-"* The module `networks` contains the core neural architectures used for various tasks, e.g., a `DeepSet` for realizing normalizing flows (https://paperswithcode.com/method/normalizing-flows) or a `DeepSet` for learning permutation-invariant summary representations (embeddings).\n",
+"* The module `simulation` contains high-level wrappers for gluing together priors, simulators, and context generators into a single `GenerativeModel` object, which will generate all quantities of interest for a modeling scenario.\n",
+"* The module `networks` contains the core neural architectures used for various tasks, e.g., an `InvariantNetwork` for realizing normalizing flows (https://paperswithcode.com/method/normalizing-flows) or a `DeepSet` for learning permutation-invariant summary representations (embeddings).\n",
 "* The module `amortizers` contains high-level wrappers which connect the various networks together and instruct them about their particular goals in the inference pipeline.\n",
 "* The module `trainers` contains high-level wrappers for dictating the <em>training phase</em> of an amortized posterior. Typically, the standard `Trainer` will take care of most scenarios.\n",
 "\n",
@@ -89,7 +89,7 @@
 "id": "biological-alpha",
 "metadata": {},
 "source": [
-"First and foremost, we set a local seed for reproducibility (best practice as of 2022)."
+"First and foremost, we set a local seed for reproducibility (best `numpy` practice as of 2022)."
 ]
 },
 {
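The local-seed practice this hunk refers to can be demonstrated in a few lines: a `np.random.Generator` created by `default_rng` carries its own state, so re-creating it with the same seed reproduces the draws exactly, without touching any global state.

```python
import numpy as np

# Two local Generators with the same seed produce identical streams;
# no legacy global np.random.seed is involved.
rng_a = np.random.default_rng(2022)
rng_b = np.random.default_rng(2022)

a = rng_a.normal(size=3)
b = rng_b.normal(size=3)
```

Here `a` and `b` are bitwise identical, while a Generator seeded differently would yield a different stream.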

examples/Linear_ODE_system.ipynb

Lines changed: 224 additions & 659 deletions
Large diffs are not rendered by default.

examples/Model_Misspecification.ipynb

Lines changed: 6 additions & 17 deletions
@@ -16,7 +16,7 @@
 },
 "source": [
 "<h1>Table of Contents<span class=\"tocSkip\"></span></h1>\n",
-"<div class=\"toc\"><ul class=\"toc-item\"><li><span><a href=\"#Introduction\" data-toc-modified-id=\"Introduction-1\"><span class=\"toc-item-num\">1&nbsp;&nbsp;</span>Introduction</a></span></li><li><span><a href=\"#Model-specification\" data-toc-modified-id=\"Model-specification-2\"><span class=\"toc-item-num\">2&nbsp;&nbsp;</span>Model specification</a></span></li><li><span><a href=\"#Training\" data-toc-modified-id=\"Training-3\"><span class=\"toc-item-num\">3&nbsp;&nbsp;</span>Training</a></span><ul class=\"toc-item\"><li><span><a href=\"#Training-loop\" data-toc-modified-id=\"Training-loop-3.1\"><span class=\"toc-item-num\">3.1&nbsp;&nbsp;</span>Training loop</a></span></li><li><span><a href=\"#Diagnostics\" data-toc-modified-id=\"Diagnostics-3.2\"><span class=\"toc-item-num\">3.2&nbsp;&nbsp;</span>Diagnostics</a></span></li><li><span><a href=\"#Inspecting-the-summary-space\" data-toc-modified-id=\"Inspecting-the-summary-space-3.3\"><span class=\"toc-item-num\">3.3&nbsp;&nbsp;</span>Inspecting the summary space</a></span></li></ul></li><li><span><a href=\"#Observed-Data:-Misspecification-Detection\" data-toc-modified-id=\"Observed-Data:-Misspecification-Detection-4\"><span class=\"toc-item-num\">4&nbsp;&nbsp;</span>Observed Data: Misspecification Detection</a></span><ul class=\"toc-item\"><li><span><a href=\"#Visualization-in-data-space\" data-toc-modified-id=\"Visualization-in-data-space-4.1\"><span class=\"toc-item-num\">4.1&nbsp;&nbsp;</span>Visualization in data space</a></span></li><li><span><a href=\"#Detecting-misspecification-in-summary-space\" data-toc-modified-id=\"Detecting-misspecification-in-summary-space-4.2\"><span class=\"toc-item-num\">4.2&nbsp;&nbsp;</span>Detecting misspecification in summary space</a></span></li></ul></li><li><span><a href=\"#Hypothesis-test-for-observed-data\" data-toc-modified-id=\"Hypothesis-test-for-observed-data-5\"><span class=\"toc-item-num\">5&nbsp;&nbsp;</span>Hypothesis test for observed data</a></span></li><li><span><a href=\"#Sensitivity-to-Misspecification\" data-toc-modified-id=\"Sensitivity-to-Misspecification-6\"><span class=\"toc-item-num\">6&nbsp;&nbsp;</span>Sensitivity to Misspecification</a></span><ul class=\"toc-item\"><li><span><a href=\"#Computing-Sesntivity\" data-toc-modified-id=\"Computing-Sesntivity-6.1\"><span class=\"toc-item-num\">6.1&nbsp;&nbsp;</span>Computing sensitivity</a></span></li><li><span><a href=\"#Plotting-the-results\" data-toc-modified-id=\"Plotting-the-results-6.2\"><span class=\"toc-item-num\">6.2&nbsp;&nbsp;</span>Plotting the results</a></span></li></ul></li></ul></div>"
+"<div class=\"toc\"><ul class=\"toc-item\"><li><span><a href=\"#Detecting-Model-Misspecification-in-Amortized-Posterior-Inference\" data-toc-modified-id=\"Detecting-Model-Misspecification-in-Amortized-Posterior-Inference-1\"><span class=\"toc-item-num\">1&nbsp;&nbsp;</span>Detecting Model Misspecification in Amortized Posterior Inference</a></span><ul class=\"toc-item\"><li><span><a href=\"#Introduction\" data-toc-modified-id=\"Introduction-1.1\"><span class=\"toc-item-num\">1.1&nbsp;&nbsp;</span>Introduction</a></span></li><li><span><a href=\"#Model-specification\" data-toc-modified-id=\"Model-specification-1.2\"><span class=\"toc-item-num\">1.2&nbsp;&nbsp;</span>Model specification</a></span></li><li><span><a href=\"#Training\" data-toc-modified-id=\"Training-1.3\"><span class=\"toc-item-num\">1.3&nbsp;&nbsp;</span>Training</a></span><ul class=\"toc-item\"><li><span><a href=\"#Training-loop\" data-toc-modified-id=\"Training-loop-1.3.1\"><span class=\"toc-item-num\">1.3.1&nbsp;&nbsp;</span>Training loop</a></span></li><li><span><a href=\"#Diagnostics\" data-toc-modified-id=\"Diagnostics-1.3.2\"><span class=\"toc-item-num\">1.3.2&nbsp;&nbsp;</span>Diagnostics</a></span></li><li><span><a href=\"#Inspecting-the-summary-space\" data-toc-modified-id=\"Inspecting-the-summary-space-1.3.3\"><span class=\"toc-item-num\">1.3.3&nbsp;&nbsp;</span>Inspecting the summary space</a></span></li></ul></li><li><span><a href=\"#Observed-Data:-Misspecification-Detection\" data-toc-modified-id=\"Observed-Data:-Misspecification-Detection-1.4\"><span class=\"toc-item-num\">1.4&nbsp;&nbsp;</span>Observed Data: Misspecification Detection</a></span><ul class=\"toc-item\"><li><span><a href=\"#Visualization-in-data-space\" data-toc-modified-id=\"Visualization-in-data-space-1.4.1\"><span class=\"toc-item-num\">1.4.1&nbsp;&nbsp;</span>Visualization in data space</a></span></li><li><span><a href=\"#Detecting-misspecification-in-summary-space\" data-toc-modified-id=\"Detecting-misspecification-in-summary-space-1.4.2\"><span class=\"toc-item-num\">1.4.2&nbsp;&nbsp;</span>Detecting misspecification in summary space</a></span></li></ul></li><li><span><a href=\"#Hypothesis-test-for-observed-data\" data-toc-modified-id=\"Hypothesis-test-for-observed-data-1.5\"><span class=\"toc-item-num\">1.5&nbsp;&nbsp;</span>Hypothesis test for observed data</a></span></li><li><span><a href=\"#Sensitivity-to-Misspecification\" data-toc-modified-id=\"Sensitivity-to-Misspecification-1.6\"><span class=\"toc-item-num\">1.6&nbsp;&nbsp;</span>Sensitivity to Misspecification</a></span><ul class=\"toc-item\"><li><span><a href=\"#Computing-Sensitivity\" data-toc-modified-id=\"Computing-Sensitivity-1.6.1\"><span class=\"toc-item-num\">1.6.1&nbsp;&nbsp;</span>Computing Sensitivity</a></span></li><li><span><a href=\"#Plotting-the-results\" data-toc-modified-id=\"Plotting-the-results-1.6.2\"><span class=\"toc-item-num\">1.6.2&nbsp;&nbsp;</span>Plotting the results</a></span></li></ul></li></ul></li></ul></div>"
 ]
 },
 {
@@ -26,9 +26,6 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"import os\n",
-"import sys\n",
-"\n",
 "import matplotlib\n",
 "import matplotlib.cm as cm\n",
 "import matplotlib.pyplot as plt\n",
@@ -138,7 +135,7 @@
 "\n",
 "\n",
 "def simulator(theta, n_obs=100, scale=1.0):\n",
-"    \"\"\"Gaussian likelihood random number generator\"\"\"\n",
+"    \"\"\"Gaussian likelihood random number generator.\"\"\"\n",
 "    return np.random.default_rng().normal(loc=theta, scale=scale, size=(n_obs, theta.shape[0]))\n",
 "\n",
 "\n",
@@ -164,9 +161,9 @@
 "The Inference network is a standard `InvertibleNetwork` with two coupling layers and the `AmortizedPosterior` combines the inference and summary networks. Since we desire model misspecification detection via a structured summary space [2], we select `summary_loss_fun=\"MMD\"` and the amortizer combines its losses correctly.\n",
 "Finally, the `trainer` wraps the generative model and the amortizer into a consistent object for training and subsequent sampling.\n",
 "\n",
-"[1] Zaheer et al (2017): https://arxiv.org/abs/1703.06114\n",
+"[1] Zaheer et al. (2017): https://arxiv.org/abs/1703.06114\n",
 "\n",
-"[2] Schmitt et al (2022): https://arxiv.org/abs/2112.08866"
+"[2] Schmitt et al. (2022): https://arxiv.org/abs/2112.08866"
 ]
 },
 {
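The Gaussian simulator defined earlier in this file's diff can be exercised standalone. The function body below is copied from the diff; the `theta` values and the shape check are illustrative assumptions.

```python
import numpy as np

def simulator(theta, n_obs=100, scale=1.0):
    """Gaussian likelihood random number generator, as in the notebook's diff.

    Draws n_obs observations around the D-dimensional mean vector theta;
    note that each call creates a fresh, unseeded Generator.
    """
    return np.random.default_rng().normal(loc=theta, scale=scale, size=(n_obs, theta.shape[0]))

# Hypothetical usage: 100 observations around a 2D mean vector
theta = np.array([1.0, -1.0])
x = simulator(theta)
```

The output has shape `(n_obs, D)`, and the column means concentrate around `theta` as `n_obs` grows.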
@@ -888,19 +885,11 @@
 "metadata": {},
 "outputs": [],
 "source": []
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"id": "07fabafb",
-"metadata": {},
-"outputs": [],
-"source": []
 }
 ],
 "metadata": {
 "kernelspec": {
-"display_name": "Python 3 (ipykernel)",
+"display_name": "Python 3",
 "language": "python",
 "name": "python3"
 },
@@ -914,7 +903,7 @@
 "name": "python",
 "nbconvert_exporter": "python",
 "pygments_lexer": "ipython3",
-"version": "3.10.11"
+"version": "3.9.13"
 },
 "toc": {
 "base_numbering": 1,
