Commit 1b0bbd2

Update README.md
1 parent b23aceb commit 1b0bbd2

File tree

1 file changed (+18, -12 lines)


README.md

Lines changed: 18 additions & 12 deletions
### Minimal Example

First, we define a simple 2D toy model with a Gaussian prior and a Gaussian simulator (likelihood):

```python
import numpy as np
import bayesflow as bf

def prior(D=2, mu=0., sigma=1.0):
    return np.random.default_rng().normal(loc=mu, scale=sigma, size=D)

def simulator(theta, n_obs=50, scale=1.0):
    return np.random.default_rng().normal(loc=theta, scale=scale, size=(n_obs, theta.shape[0]))
```
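For orientation, the array shapes these two functions produce can be checked with plain NumPy, without installing BayesFlow. The sketch below simply repeats the definitions from above and inspects one draw:

```python
import numpy as np

def prior(D=2, mu=0., sigma=1.0):
    # Draw a single D-dimensional parameter vector from a Gaussian prior
    return np.random.default_rng().normal(loc=mu, scale=sigma, size=D)

def simulator(theta, n_obs=50, scale=1.0):
    # Simulate n_obs observations, each centered on the parameter vector theta
    return np.random.default_rng().normal(loc=theta, scale=scale, size=(n_obs, theta.shape[0]))

theta = prior()
data = simulator(theta)
print(theta.shape)  # (2,)
print(data.shape)   # (50, 2)
```

Each simulated data set is thus a matrix of 50 exchangeable 2D observations, which is why a permutation-invariant summary network is used below.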

Then, we create our BayesFlow setup consisting of a summary and an inference network:

```python
summary_net = bf.networks.InvariantNetwork()
inference_net = bf.networks.InvertibleNetwork(n_params=2)
amortizer = bf.amortizers.AmortizedPosterior(inference_net, summary_net)
```

Next, we create a generative model which connects the `prior` with the `simulator`:

```python
generative_model = bf.simulation.GenerativeModel(prior, simulator)
```

Finally, we connect the networks with the generative model via a `Trainer` instance:

```python
trainer = bf.trainers.Trainer(
    network=amortizer,
    generative_model=generative_model
)
```

We are now ready to train an amortized posterior approximator. For instance, to run online training, we simply call:

```python
losses = trainer.train_online(epochs=10, iterations_per_epoch=500, batch_size=32)
```

which performs online training for 10 epochs of 500 iterations (batch simulations with 32 simulations per batch). Amortized posterior inference on 100 new data sets is then fast and easy:

```python
new_data = generative_model(100)
samples = amortizer.sample(new_data, n_samples=5000)
```
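The resulting draws can be summarized with plain NumPy. A minimal sketch, assuming `samples` is an array of shape `(100, 5000, 2)` (data sets x draws x parameters); the random array below merely stands in for real posterior draws:

```python
import numpy as np

# Stand-in for posterior draws: 100 data sets x 5000 samples x 2 parameters
rng = np.random.default_rng(seed=42)
samples = rng.normal(size=(100, 5000, 2))

# Posterior mean per data set and parameter
post_means = samples.mean(axis=1)  # shape (100, 2)

# Central 95% credible interval per data set and parameter
lo, hi = np.percentile(samples, [2.5, 97.5], axis=1)

print(post_means.shape)       # (100, 2)
print(lo.shape, hi.shape)     # (100, 2) (100, 2)
```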
### Further Reading
