# BayesFlow

Welcome to our BayesFlow library for amortized simulation-based Bayesian inference.

For starters, check out some of our walk-through notebooks:

1. [Basic amortized posterior estimation](docs/source/tutorial_notebooks/Intro_Amortized_Posterior_Estimation.ipynb)
2. [Intermediate posterior estimation](docs/source/tutorial_notebooks/Covid19_Initial_Posterior_Estimation.ipynb)
3. [Posterior estimation for ODEs](docs/source/tutorial_notebooks/Linear%20ODE%20system.ipynb)
## Project Documentation

The project documentation is available at <http://bayesflow.readthedocs.io>

## Conceptual Overview

A cornerstone idea of amortized Bayesian inference is to employ generative
neural networks for parameter estimation, model comparison, and model validation
when working with intractable simulators whose behavior as a whole is too
complex to be described analytically. The figure below presents a high-level
overview of neurally bootstrapped Bayesian inference.

<img src="img/high_level_framework.png" width=80% height=80%>

## Parameter Estimation

The original BayesFlow approach for amortized parameter estimation is based on our paper:

Radev, S. T., Mertens, U. K., Voss, A., Ardizzone, L., & Köthe, U. (2020).
BayesFlow: Learning complex stochastic models with invertible neural networks.
<em>IEEE Transactions on Neural Networks and Learning Systems</em>, available
for free at: https://arxiv.org/abs/2003.06281.

However, we have since substantially extended the BayesFlow library, so that it
is now much more general and cleaner than what we describe in the above paper.

### Minimal Example

```python
import numpy as np
import bayesflow as bf
```

To introduce you to the basic workflow of the library, let's consider a simple
2D Gaussian model for which we want to perform posterior inference. We assume a
Gaussian simulator (likelihood) and a Gaussian prior for the means of the two
components, which are our only model parameters in this example:

```python
def simulator(theta, n_obs=50, scale=1.0):
    """Generates n_obs observations from a 2D Gaussian with means theta."""
    return np.random.default_rng().normal(loc=theta, scale=scale, size=(n_obs, theta.shape[0]))


def prior(D=2, mu=0., sigma=1.0):
    """Draws the D component means from a Gaussian prior."""
    return np.random.default_rng().normal(loc=mu, scale=sigma, size=D)
```
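
As a quick sanity check of the two functions (a minimal sketch using only
NumPy; the shapes follow directly from the definitions above), we can draw one
parameter vector and one corresponding data set:

```python
theta = prior()         # one draw of the two means, shape (2,)
x = simulator(theta)    # 50 observations of the 2D Gaussian, shape (50, 2)
print(theta.shape, x.shape)
```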

Then, we connect the `prior` with the `simulator` using a `GenerativeModel` wrapper:

```python
generative_model = bf.simulation.GenerativeModel(prior, simulator)
```
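
Calling the wrapper with a batch size yields a dictionary of prior draws and
corresponding simulations. A minimal sketch (the dictionary keys below are an
assumption about the library's defaults, not something this README guarantees):

```python
sims = generative_model(16)
print(sims['prior_draws'].shape)  # expected: (16, 2)
print(sims['sim_data'].shape)     # expected: (16, 50, 2)
```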

Next, we create our BayesFlow setup consisting of a summary and an inference network:

```python
# The summary network compresses data sets of variable size into fixed-length vectors
summary_net = bf.networks.InvariantNetwork()
# The inference network is an invertible network (normalizing flow) over the 2 parameters
inference_net = bf.networks.InvertibleNetwork(num_params=2)
amortizer = bf.amortizers.AmortizedPosterior(inference_net, summary_net)
```
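
Since the observations within a data set are exchangeable, the summary network
is permutation-invariant. A small sketch of this property (an assumption on our
part: the networks follow the standard Keras call convention, so treat this as
an illustration rather than guaranteed API):

```python
import tensorflow as tf

fake_data = tf.random.normal((8, 50, 2))                  # 8 data sets, 50 observations, 2 dims
summaries = summary_net(fake_data)                        # one fixed-length summary per data set
reversed_summaries = summary_net(fake_data[:, ::-1, :])   # same data, observations reordered
print(np.allclose(summaries, reversed_summaries, atol=1e-5))  # should print True
```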

Finally, we connect the networks with the generative model via a `Trainer` instance:

```python
trainer = bf.trainers.Trainer(amortizer=amortizer, generative_model=generative_model)
```

We are now ready to train an amortized posterior approximator. For instance,
to run online training, we simply call:

```python
losses = trainer.train_online(epochs=10, iterations_per_epoch=500, batch_size=32)
```
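
To judge whether training has converged, it helps to look at the loss
trajectory. A minimal matplotlib sketch, assuming the returned `losses` object
is convertible to a flat array of per-iteration values:

```python
import matplotlib.pyplot as plt

loss_values = np.asarray(losses, dtype=float).ravel()  # assumption: `losses` is array-like
plt.plot(loss_values)
plt.xlabel('Iteration')
plt.ylabel('Loss')
plt.show()
```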

Before inference, we can use simulation-based calibration (SBC,
https://arxiv.org/abs/1804.06788) to check the computational faithfulness of
the model-amortizer combination:

```python
fig = trainer.diagnose_sbc_histograms()
```

<img src="img/showcase_sbc.png" width=65% height=65%>

The histograms are roughly uniform and lie within the expected range for
well-calibrated inference algorithms, as indicated by the shaded gray areas.
Accordingly, our amortizer seems to have converged to the intended target.
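
For intuition, the rank statistic behind these histograms is easy to compute by
hand for a toy conjugate model with a known posterior. A self-contained sketch,
independent of the library:

```python
rng = np.random.default_rng(1)
theta = rng.normal(size=200)       # theta ~ N(0, 1) (prior draws)
x = theta + rng.normal(size=200)   # one observation per draw, x | theta ~ N(theta, 1)

# The exact posterior for this conjugate model is theta | x ~ N(x / 2, 1 / 2)
post_draws = x[:, None] / 2 + np.sqrt(0.5) * rng.normal(size=(200, 500))

# Rank of each true theta among its posterior draws; for a well-calibrated
# posterior, these ranks are uniformly distributed on {0, ..., 500}
ranks = np.sum(post_draws < theta[:, None], axis=1)
```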

Amortized inference on new (real or simulated) data is then easy and fast.
For example, we can simulate 200 new data sets and generate 500 posterior draws
per data set:

```python
new_sims = trainer.configurator(generative_model(200))
posterior_draws = amortizer.sample(new_sims, n_samples=500)
```
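
The draws can also be summarized with plain NumPy. A small sketch, assuming
`posterior_draws` has shape `(200, 500, 2)`, i.e., (data sets, draws, parameters):

```python
post_means = posterior_draws.mean(axis=1)  # posterior means per data set, shape (200, 2)
post_stds = posterior_draws.std(axis=1)    # posterior standard deviations, shape (200, 2)
print(post_means[0], post_stds[0])         # point estimates and uncertainties, first data set
```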

We can then quickly inspect how well the model recovers its parameters
across the simulated data sets:

```python
fig = bf.diagnostics.plot_recovery(posterior_draws, new_sims['parameters'])
```

<img src="img/showcase_recovery.png" width=65% height=65%>

For any individual data set, we can also compare the parameters' posteriors with
their corresponding priors:

```python
fig = bf.diagnostics.plot_posterior_2d(posterior_draws[0], prior=generative_model.prior)
```

<img src="img/showcase_posterior.png" width=45% height=45%>

We can clearly see how the posterior shrinks relative to the prior for both
model parameters as a result of conditioning on the data.

### References and Further Reading

- Radev, S. T., Mertens, U. K., Voss, A., Ardizzone, L., & Köthe, U. (2020).
BayesFlow: Learning complex stochastic models with invertible neural networks.
<em>IEEE Transactions on Neural Networks and Learning Systems</em>, available
for free at: https://arxiv.org/abs/2003.06281.

## Model Misspecification

What if we are dealing with misspecified models? That is, how faithful is our
amortized inference if the generative model is a poor representation of reality?
A modified loss function optimizes the learned summary statistics towards a unit
Gaussian and reliably detects model misspecification at inference time.
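
The detection criterion can be illustrated with a simple kernel-based distance.
A conceptual sketch (not the library's implementation): if summaries of
well-specified data follow a unit Gaussian, a large maximum mean discrepancy
(MMD) between observed summaries and draws from N(0, I) signals misspecification.

```python
def rbf_mmd2(x, y, bandwidth=1.0):
    """Simple (biased) MMD^2 estimate with a Gaussian kernel."""
    def k(a, b):
        d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
        return np.exp(-d2 / (2 * bandwidth**2))
    return k(x, x).mean() + k(y, y).mean() - 2 * k(x, y).mean()

rng = np.random.default_rng(42)
summaries = rng.normal(size=(100, 8))   # stand-in for summary network outputs
reference = rng.normal(size=(100, 8))   # draws from the unit Gaussian target
print(rbf_mmd2(summaries, reference))   # near zero -> no evidence of misspecification
```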

![](docs/source/images/model_misspecification_amortized_sbi.png?raw=true)

### References and Further Reading

- Schmitt, M., Bürkner, P. C., Köthe, U., & Radev, S. T. (2022). Detecting Model
Misspecification in Amortized Bayesian Inference with Neural Networks. <em>ArXiv
preprint</em>.

## Model Comparison
Coming soon...

### References and Further Reading

- Radev, S. T., D’Alessandro, M., Mertens, U. K., Voss, A., Köthe, U., & Bürkner,
P. C. (2021). Amortized Bayesian Model Comparison with Evidential Deep Learning.
<em>IEEE Transactions on Neural Networks and Learning Systems</em>.
doi:10.1109/TNNLS.2021.3124052, available for free at: https://arxiv.org/abs/2004.10629

## Likelihood Emulation

Coming soon...
