
Commit d9442f7

Update tutorials and readme
1 parent de7d399 commit d9442f7

10 files changed: +1126 -531 lines

README.md

Lines changed: 10 additions & 8 deletions
@@ -4,8 +4,8 @@
 [![DOI](https://img.shields.io/badge/DOI-10.21105%2Fjoss.05702-blue?style=for-the-badge)](https://doi.org/10.21105/joss.05702)
 ![PyPI - License](https://img.shields.io/pypi/l/bayesflow?style=for-the-badge)
 
-BayesFlow is a Python library for simulation-based **Amortized Bayesian Inference** with neural networks.
-It provides users with:
+BayesFlow 2 is a Python library for simulation-based **Amortized Bayesian Inference** with neural networks.
+It provides users and researchers with:
 
 - A user-friendly API for rapid Bayesian workflows
 - A rich collection of neural network architectures
@@ -56,9 +56,8 @@ For an in-depth exposition, check out our walkthrough notebooks below. More tuto
 3. [Two moons starter example](examples/Two_Moons_Starter.ipynb)
 4. [Rapid iteration with point estimators](examples/Lotka_Volterra_point_estimation_and_expert_stats.ipynb)
 5. [SIR model with custom summary network](examples/SIR_Posterior_Estimation.ipynb)
-6. [Hyperparameter optimization](examples/Hyperparameter_Optimization.ipynb)
-7. [Bayesian experimental design](examples/Bayesian_Experimental_Design.ipynb)
-8. [Simple model comparison example](examples/One_Sample_TTest.ipynb)
+6. [Bayesian experimental design](examples/Bayesian_Experimental_Design.ipynb)
+7. [Simple model comparison example](examples/One_Sample_TTest.ipynb)
 
 ## Install
 
@@ -126,9 +125,8 @@ Documentation is available at https://bayesflow.org. Please use the [BayesFlow F
 
 You can cite BayesFlow along the lines of:
 
-- We approximated the posterior with neural posterior estimation and learned summary statistics (NPE; Radev et al., 2020), as implemented in the BayesFlow software for amortized Bayesian workflows (Radev et al., 2023a).
-- We approximated the likelihood with neural likelihood estimation (NLE; Papamakarios et al., 2019) without hand-crafted summary statistics, as implemented in the BayesFlow software for amortized Bayesian workflows (Radev et al., 2023b).
-- We performed simultaneous posterior and likelihood estimation with jointly amortized neural approximation (JANA; Radev et al., 2023a), as implemented in the BayesFlow software for amortized Bayesian workflows (Radev et al., 2023b).
+- We approximated the posterior using neural posterior estimation (NPE) with learned summary statistics (Radev et al., 2020), as implemented in the BayesFlow framework for amortized Bayesian inference (Radev et al., 2023a).
+- We approximated the likelihood using neural likelihood estimation (NLE) without hand-crafted summary statistics (Papamakarios et al., 2019), leveraging its implementation in BayesFlow for efficient and flexible inference.
 
 1. Radev, S. T., Schmitt, M., Schumacher, L., Elsemüller, L., Pratz, V., Schälte, Y., Köthe, U., & Bürkner, P.-C. (2023a). BayesFlow: Amortized Bayesian workflows with neural networks. *The Journal of Open Source Software, 8(89)*, 5702. ([arXiv](https://arxiv.org/abs/2306.16015)) ([JOSS](https://joss.theoj.org/papers/10.21105/joss.05702))
 2. Radev, S. T., Mertens, U. K., Voss, A., Ardizzone, L., Köthe, U. (2020). BayesFlow: Learning complex stochastic models with invertible neural networks. *IEEE Transactions on Neural Networks and Learning Systems, 33(4)*, 1452-1466. ([arXiv](https://arxiv.org/abs/2003.06281)) ([IEEE TNNLS](https://ieeexplore.ieee.org/document/9298920))
@@ -169,6 +167,10 @@ You can cite BayesFlow along the lines of:
 }
 ```
 
+## Awesome Amortized Inference
+
+If you are interested in a curated list of reviews, software, papers, and other resources related to amortized inference, feel free to explore our [community-driven list](https://github.com/bayesflow-org/awesome-amortized-inference).
+
 ## Acknowledgments
 
 This project is currently managed by researchers from Rensselaer Polytechnic Institute, TU Dortmund University, and Heidelberg University. It is partially funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation, Project 528702768). The project is further supported by Germany's Excellence Strategy -- EXC-2075 - 390740016 (Stuttgart Cluster of Excellence SimTech) and EXC-2181 - 390900948 (Heidelberg Cluster of Excellence STRUCTURES), as well as the Informatics for Life initiative funded by the Klaus Tschira Foundation.

examples/Bayesian_Experimental_Design.ipynb

Lines changed: 18 additions & 11 deletions
@@ -15,10 +15,12 @@
     "In this tutorial, we will:\n",
     "\n",
     "- Introduce the core concepts in static (i.e. non-adaptive) BED\n",
-    "- Show how to use BayesFlow to learn the optimal designs for a simple chemical reaction system\n",
+    "- Show how to use low-level BayesFlow features to learn the optimal designs for a simple chemical reaction system\n",
     "- Demonstrate the multi-backend capabilities of BayesFlow\n",
     "- Use PyTorch components (model, optimisers, etc.) via Keras3 through BayesFlow\n",
-    "- Use a normalizing flow from BayesFlow with a custom loss function that's written in pure PyTorch\n"
+    "- Use a normalizing flow from BayesFlow with a custom loss function that's written in pure PyTorch\n",
+    "\n",
+    "This tutorial is aimed at intermediate / advanced users who want to create custom training loops and capitalize on the stable generative network implementations."
    ]
   },
   {
@@ -90,20 +92,24 @@
     "import torch.distributions as dist\n",
     "import torch.nn as nn\n",
     "\n",
-    "# import keras\n",
-    "from keras.src.backend.common import global_state\n",
-    "\n",
-    "global_state.set_global_attribute(\"torch_device\", \"cpu\")\n",
-    "\n",
     "# for BayesFlow devs: this ensures that the latest dev version can be found\n",
     "# this is not required if you have BayesFlow installed (e.g., via pip)\n",
     "import sys\n",
     "sys.path.append(\"../\")\n",
     "\n",
     "import bayesflow as bf\n",
-    "from cmdstanpy import CmdStanModel\n",
     "\n",
+    "from cmdstanpy import CmdStanModel"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
     "import logging\n",
+    "\n",
     "# disable the long printouts from stan.\n",
     "logger = logging.getLogger(\"cmdstanpy\")\n",
     "logger.addHandler(logging.NullHandler())\n",
@@ -533,10 +539,11 @@
    ],
    "source": [
     "# sample from the amortised posterior\n",
+    "num_samples = 2000\n",
     "amortised_posterior_samples = torch.stack(\n",
     "    [\n",
-    "        posterior_net.sample((2000,), conditions=torch.cat(\n",
-    "            [yy.unsqueeze(0).expand(2000, -1), simulator.designs.expand(2000, -1)], dim=-1\n",
+    "        posterior_net.sample(num_samples, conditions=torch.cat(\n",
+    "            [yy.unsqueeze(0).expand(num_samples, -1), simulator.designs.expand(num_samples, -1)], dim=-1\n",
     "        ),\n",
     "        )\n",
     "        for yy in test_sims[\"scaled_y\"]\n",
@@ -701,7 +708,7 @@
     "designed_amortised_samples = torch.stack(\n",
     "    [\n",
     "        posterior_net_designs.sample(\n",
-    "            (2000,), \n",
+    "            batch_shape=2000, \n",
     "            conditions=torch.cat(\n",
     "                [yy.unsqueeze(0).expand(2000, -1), simulator.designs.expand(2000, -1)], dim=-1\n",
     "            ),\n",

examples/From_ABC_to_BayesFlow.ipynb

Lines changed: 2 additions & 3 deletions
@@ -4,8 +4,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "# From pyABC to BayesFlow\n",
-    "## Markov Jump Process: Reaction Network\n",
+    "# From ABC to BayesFlow\n",
     "_Author: Jonas Arruda_"
    ]
   },
@@ -17,7 +16,7 @@
    "source": [
     "In the following, we fit stochastic chemical reaction kinetics with Approximate Bayesian Computation (ABC) and show how to transfer the workflow to `BayesFlow`. Any model written in an ABC framework can be easily transferred to `BayesFlow`. The main difference is that `BayesFlow` enables *amortized inference*, allowing us to instantly infer parameters from new data sets without further training.\n",
     "\n",
-    "In ABC, inference is tailored for a single dataset. For both we need to specify a simulator and priors over the parameters. While in ABC we also need to specify a distance function between observation and simulation (and for higher dimensional problems also summary statistics), in `BayesFlow` we only need to specify an adapter that maps the simulator output to the output of a neural `Approximator`. After training, we can validate the computational faithfulness of the trained approximator using various Bayesian metrics, such as simulation-based calibration (SBC). This is typically not feasible with ABC, as we would have to repeat the same expensive approximation loop for every new data set.\n",
+    "In ABC (and non-amortized inference in general), estimation is tailored to a single dataset. For both approaches we need to specify a simulator and priors over the parameters. While in ABC we also need to specify a distance function between observation and simulation (and for higher-dimensional problems also summary statistics), in `BayesFlow` we only need to specify an adapter that maps the simulator output to the input of a neural `Approximator`. After training, we can validate the computational faithfulness of the trained approximator using various Bayesian metrics, such as simulation-based calibration (SBC). This is typically not feasible with ABC, as we would have to repeat the same expensive approximation loop for every new data set.\n",
     "\n",
     "For this example, we need to install [pyABC](https://github.com/icb-dcm/pyabc), which is our go-to library for ABC. The example is taken from [the documentation](https://pyabc.readthedocs.io/en/latest/examples/chemical_reaction.html). This tutorial starts with the ABC implementation and then demonstrates how to easily switch to BayesFlow. Readers familiar with pyABC can fast-forward to the second part of the notebook."
    ]
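
For readers new to ABC, a toy rejection sampler makes the contrast drawn in this hunk concrete: the entire accept/reject loop below has to be re-run from scratch for every new observed dataset, which is exactly the cost an amortized approximator pays once, during training. All names are illustrative and not taken from the notebook.

```python
import numpy as np

rng = np.random.default_rng(1)


def simulator(theta: float, n: int = 50) -> np.ndarray:
    # toy model: Gaussian observations with unknown mean
    return rng.normal(theta, 1.0, size=n)


def distance(x: np.ndarray, y: np.ndarray) -> float:
    # ABC needs a hand-crafted summary statistic (here, the mean) and a distance
    return abs(x.mean() - y.mean())


observed = simulator(theta=1.5)

accepted = []
while len(accepted) < 500:
    theta = rng.normal(0.0, 2.0)  # draw a candidate from the prior
    if distance(simulator(theta), observed) < 0.1:
        accepted.append(theta)    # keep candidates whose simulations match the data

print(np.mean(accepted))  # approximate posterior mean -- valid for this dataset only
```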

examples/Linear_Regression_Starter.ipynb

Lines changed: 5 additions & 65 deletions
@@ -26,31 +26,6 @@
     "At a high level, our architecture consists of a summary network $\\mathbf{h}$ and an inference network $\\mathbf{f}$ which jointly learn to invert a generative model. The summary network transforms input data $\\mathbf{x}$ of potentially variable size to a fixed-length representation. The inference network generates random draws from an approximate posterior $\\mathbf{q}$ via a conditional generative network (here, an invertible network)."
    ]
   },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "## Setup\n",
-    "\n",
-    "For this notebook to run, you need to have the latest bayesflow dev version installed,\n",
-    "for example via:"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 1,
-   "metadata": {
-    "ExecuteTime": {
-     "end_time": "2025-02-14T10:51:27.527747Z",
-     "start_time": "2025-02-14T10:51:27.525961Z"
-    },
-    "id": "JMj_DpOSvFJw"
-   },
-   "outputs": [],
-   "source": [
-    "# !pip install git+https://github.com/bayesflow-org/bayesflow.git@dev"
-   ]
-  },
   {
    "cell_type": "markdown",
    "metadata": {
@@ -77,47 +52,12 @@
     "\n",
     "if \"KERAS_BACKEND\" not in os.environ:\n",
     "    # set this to \"torch\", \"tensorflow\", or \"jax\"\n",
-    "    os.environ[\"KERAS_BACKEND\"] = \"jax\""
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 3,
-   "metadata": {
-    "ExecuteTime": {
-     "end_time": "2025-02-14T10:51:27.658247Z",
-     "start_time": "2025-02-14T10:51:27.656660Z"
-    }
-   },
-   "outputs": [],
-   "source": [
-    "# for BayesFlow devs: this ensures that the latest dev version can be found\n",
-    "import sys\n",
-    "sys.path.append('../')"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 4,
-   "metadata": {
-    "ExecuteTime": {
-     "end_time": "2025-02-14T10:51:29.016951Z",
-     "start_time": "2025-02-14T10:51:27.698656Z"
-    }
-   },
-   "outputs": [],
-   "source": [
-    "import keras\n",
+    "    os.environ[\"KERAS_BACKEND\"] = \"jax\"\n",
+    "\n",
     "import matplotlib.pyplot as plt\n",
-    "import numpy as np"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 5,
-   "metadata": {},
-   "outputs": [],
-   "source": [
+    "import numpy as np\n",
+    "\n",
+    "import keras\n",
     "import bayesflow as bf"
    ]
   },
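
The import reshuffle in this hunk is not cosmetic: Keras 3 reads `KERAS_BACKEND` once, when `keras` is first imported, so the environment variable has to be set before `import keras` (or `import bayesflow`, which imports Keras itself) runs. A minimal sketch of that pattern:

```python
import os

# must happen before the first `import keras` anywhere in the process
if "KERAS_BACKEND" not in os.environ:
    os.environ["KERAS_BACKEND"] = "jax"  # or "torch" / "tensorflow"

import keras

print(keras.backend.backend())  # confirms which backend was actually picked up
```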

examples/Lotka_Volterra_point_estimation_and_expert_stats.ipynb

Lines changed: 6 additions & 28 deletions
@@ -5,24 +5,13 @@
    "id": "3cacfcfd-d638-4181-9c16-ba050ab5e367",
    "metadata": {},
    "source": [
-    "# Rapid iteration with point estimation and expert statistics for Lotka-Volterra dynamics\n",
+    "# Rapid Iteration with Point Estimation for Lotka-Volterra Dynamics\n",
     "\n",
-    "_Authors: Hans Olischläger_\n",
+    "_Author: Hans Olischläger_\n",
     "\n",
     "In this notebook, we will infer parameters of a famous ecology differential equation with BayesFlow.\n",
     "\n",
-    "We will follow a typical workflow that emphazises rapid iterations early on, before building up towards reliable estimates of the full posterior."
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 1,
-   "id": "498100c2-9169-4f58-a37e-684aaf32ea45",
-   "metadata": {},
-   "outputs": [],
-   "source": [
-    "%load_ext autoreload\n",
-    "%autoreload 2"
+    "We will follow a typical workflow that emphasizes rapid iterations early on, before building up towards reliable estimates of the full posterior with end-to-end data embedding."
    ]
   },
   {
@@ -51,25 +40,14 @@
     "import matplotlib.pyplot as plt\n",
     "import numpy as np\n",
     "import seaborn as sns\n",
+    "\n",
     "from scipy.integrate import odeint\n",
     "\n",
-    "# For BayesFlow devs: this ensures that the latest dev version can be found\n",
-    "import sys\n",
-    "sys.path.append('../')\n",
+    "import keras\n",
     "\n",
     "import bayesflow as bf"
    ]
   },
-  {
-   "cell_type": "code",
-   "execution_count": 3,
-   "id": "18b0496b-87e4-46cb-9f51-54ae67b1b9c6",
-   "metadata": {},
-   "outputs": [],
-   "source": [
-    "import keras"
-   ]
-  },
   {
    "cell_type": "code",
    "execution_count": 4,
@@ -125,7 +103,7 @@
    "metadata": {},
    "outputs": [],
    "source": [
-    "rng = np.random.RandomState(seed=1234) # for reproducibility of the simulations"
+    "rng = np.random.default_rng(seed=1234)"
    ]
   },
   {
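
The swap from `np.random.RandomState` to `np.random.default_rng` follows NumPy's recommendation to use the `Generator` API in new code; a seeded generator keeps the simulations just as reproducible. A small illustration:

```python
import numpy as np

rng = np.random.default_rng(seed=1234)
draws_a = rng.normal(loc=0.0, scale=1.0, size=3)

# re-creating the generator with the same seed reproduces the stream
rng = np.random.default_rng(seed=1234)
draws_b = rng.normal(loc=0.0, scale=1.0, size=3)

assert np.allclose(draws_a, draws_b)
```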

examples/One_Sample_TTest.ipynb

Lines changed: 4 additions & 16 deletions
@@ -6,28 +6,16 @@
    "source": [
     "# Simple Model Comparison - One Sample T-Test\n",
     "\n",
-    "_Authors: Šimon Kucharský_"
+    "_Author: Šimon Kucharský_"
    ]
   },
   {
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "In this notebook, we will show how to do a simple model comparison in bayesflow, amortized over the number of observations."
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 1,
-   "metadata": {
-    "vscode": {
-     "languageId": "powershell"
-    }
-   },
-   "outputs": [],
-   "source": [
-    "# Use the latest version of bayesflow\n",
-    "# !pip install git+https://github.com/bayesflow-org/bayesflow.git@dev"
+    "In this notebook, we will show how to do a simple model comparison in BayesFlow, amortized over the number of observations.\n",
+    "\n",
+    "Amortized Bayesian model comparison leverages neural networks to learn a mapping from data to posterior model probabilities, effectively bypassing the need for costly inference procedures for each new dataset. This method is particularly useful in scenarios where model evaluation needs to be performed repeatedly, as the inference cost is front-loaded into the training phase, enabling rapid comparisons at test time."
    ]
   },
   {
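
The added paragraph describes amortized model comparison abstractly. As a sketch of what the training data could look like in this one-sample setting (names and priors are illustrative, not the notebook's): model 0 fixes the mean at zero, model 1 draws it from a prior, and a comparison network would be trained to classify the model index from the simulated data.

```python
import numpy as np

rng = np.random.default_rng(42)


def simulate_dataset(n_obs: int) -> tuple[int, np.ndarray]:
    model_index = int(rng.integers(0, 2))  # uniform prior over the two models
    mu = 0.0 if model_index == 0 else rng.normal(0.0, 1.0)
    data = rng.normal(mu, 1.0, size=n_obs)
    return model_index, data


# varying n_obs across simulated datasets is what "amortized over the
# number of observations" refers to
for _ in range(3):
    m, x = simulate_dataset(n_obs=int(rng.integers(5, 50)))
    print(m, x.shape)
```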
