
Commit b97552d

Merge branch 'Future' of https://github.com/stefanradev93/BayesFlow into Future

2 parents: effa62f + 5aa4289

File tree

5 files changed: +36 −6 lines changed

README.md

Lines changed: 34 additions & 6 deletions

````diff
@@ -16,7 +16,7 @@ The project documentation is available at <http://bayesflow.readthedocs.io>
 A cornerstone idea of amortized Bayesian inference is to employ generative neural networks for parameter estimation, model comparison, and model validation
 when working with intractable simulators whose behavior as a whole is too complex to be described analytically. The figure below presents a higher-level overview of neurally bootstrapped Bayesian inference.
 
-![Overview](https://github.com/stefanradev93/BayesFlow/blob/Future/img/high_level_framework.png?raw=true)
+<img src="https://github.com/stefanradev93/BayesFlow/blob/Future/img/high_level_framework.png" width=80% height=80%>
 
 ## Parameter Estimation
 
@@ -39,23 +39,51 @@ def simulator(theta, n_obs=50, scale=1.0):
 
 # Then, we create our BayesFlow setup consisting of a summary and an inference network:
 summary_net = bf.networks.InvariantNetwork()
-inference_net = bf.networks.InvertibleNetwork(n_params=2)
+inference_net = bf.networks.InvertibleNetwork(num_params=2)
 amortizer = bf.amortizers.AmortizedPosterior(inference_net, summary_net)
 
 # Next, we connect the `prior` with the `simulator` using a `GenerativeModel` wrapper:
 generative_model = bf.simulation.GenerativeModel(prior, simulator)
 
 # Finally, we connect the networks with the generative model via a `Trainer` instance:
-trainer = bf.trainers.Trainer(network=amortizer, generative_model=generative_model)
+trainer = bf.trainers.Trainer(amortizer=amortizer, generative_model=generative_model)
 
 # We are now ready to train an amortized posterior approximator. For instance, to run online training, we simply call:
 losses = trainer.train_online(epochs=10, iterations_per_epoch=500, batch_size=32)
+```
+
+Before inference, we can use simulation-based calibration (SBC, https://arxiv.org/abs/1804.06788) to check the computational faithfulness of the model-amortizer combination:
+
+```python
+fig = trainer.diagnose_sbc_histograms()
+```
+
+<img src="https://github.com/stefanradev93/BayesFlow/blob/Future/img/showcase_sbc.png" width=65% height=65%>
+
+Amortized inference on new (real or simulated) data is then easy and fast:
+
+```python
+# Simulate 200 new data sets and generate 500 posterior draws per data set
+new_sims = trainer.configurator(generative_model(200))
+posterior_draws = amortizer.sample(new_sims, n_samples=500)
+```
+
+We can then quickly inspect the parameter recoverability of the model:
 
-# Amortized posterior inference on 100 new data sets is then fast and easy:
-new_data = generative_model(100)
-samples = amortizer.sample(new_data, n_samples=5000)
+```python
+fig = bf.diagnostics.plot_recovery(posterior_draws, new_sims['parameters'])
 ```
 
+<img src="https://github.com/stefanradev93/BayesFlow/blob/Future/img/showcase_recovery.png" width=65% height=65%>
+
+Or we can look at single posteriors in relation to the prior:
+
+```python
+fig = bf.diagnostics.plot_posterior_2d(posterior_draws[0], prior=generative_model.prior)
+```
+
+<img src="https://github.com/stefanradev93/BayesFlow/blob/Future/img/showcase_posterior.png" width=45% height=45%>
+
 ### Further Reading
 
 Coming soon...
````
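The SBC check shown in the diff can be understood independently of BayesFlow: for each simulated data set, compute the rank of the ground-truth parameter among the posterior draws; if the posterior approximator is well calibrated, these ranks are uniformly distributed. Below is a minimal NumPy sketch of the rank computation; the function name and toy data are illustrative assumptions, not part of the commit.

```python
import numpy as np

def sbc_ranks(prior_draws, posterior_draws):
    """Compute SBC rank statistics (illustrative helper, not BayesFlow API).

    prior_draws     : (n_sim, n_params) ground-truth parameters per data set
    posterior_draws : (n_sim, n_post, n_params) posterior samples per data set
    Returns integer ranks in {0, ..., n_post}, one per (simulation, parameter).
    """
    # Count how many posterior draws fall below the true parameter
    return np.sum(posterior_draws < prior_draws[:, None, :], axis=1)

# Toy case that is perfectly calibrated by construction: the "true" parameters
# and the posterior draws come from the same distribution, so ranks are uniform.
rng = np.random.default_rng(42)
n_sim, n_post, n_params = 2000, 99, 2
theta_true = rng.normal(size=(n_sim, n_params))
theta_post = rng.normal(size=(n_sim, n_post, n_params))
ranks = sbc_ranks(theta_true, theta_post)
```

Plotting histograms of these ranks per parameter (as `diagnose_sbc_histograms` does for a trained amortizer) should then show approximately flat bars.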

bayesflow/trainers.py

Lines changed: 2 additions & 0 deletions

```diff
@@ -152,6 +152,8 @@ def __init__(self, amortizer, generative_model=None, configurator=None, checkpoi
         self.replay_buffer = None
         self.optimizer = None
         self.default_lr = default_lr
+        # Currently unused attribute
+        self.lr_adjuster = None
 
         # Checkpoint and helper classes settings
         self.max_to_keep = max_to_keep
```
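The new `lr_adjuster` attribute is explicitly marked as unused, so the commit does not define its behavior. Purely as a hypothetical sketch (none of these names come from BayesFlow), an adjuster slotted into that attribute could reduce the learning rate when the training loss plateaus:

```python
class PlateauLRAdjuster:
    """Hypothetical learning-rate adjuster: halve the rate when the loss
    fails to improve for `patience` consecutive updates."""

    def __init__(self, lr, factor=0.5, patience=3, min_lr=1e-6):
        self.lr = lr
        self.factor = factor
        self.patience = patience
        self.min_lr = min_lr
        self.best = float("inf")
        self.bad_updates = 0

    def update(self, loss):
        """Feed in the latest loss; return the (possibly reduced) rate."""
        if loss < self.best:
            self.best = loss
            self.bad_updates = 0
        else:
            self.bad_updates += 1
            if self.bad_updates >= self.patience:
                self.lr = max(self.lr * self.factor, self.min_lr)
                self.bad_updates = 0
        return self.lr
```

For example, with `patience=3`, three consecutive non-improving losses after a best of 0.9 would halve an initial rate of 1e-3 to 5e-4.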

img/showcase_posterior.png (84.1 KB)

img/showcase_recovery.png (101 KB)

img/showcase_sbc.png (21.5 KB)
