Commit f5adbfe: Update README.md (1 parent: 1b0bbd2)
1 file changed: README.md (11 additions, 20 deletions)

### Minimal Example

```python
import numpy as np
import bayesflow as bf

# First, we define a simple 2D toy model with a Gaussian prior and a Gaussian simulator (likelihood):
def prior(D=2, mu=0., sigma=1.0):
    return np.random.default_rng().normal(loc=mu, scale=sigma, size=D)

def simulator(theta, n_obs=50, scale=1.0):
    return np.random.default_rng().normal(loc=theta, scale=scale, size=(n_obs, theta.shape[0]))

# Then, we create our BayesFlow setup consisting of a summary and an inference network:
summary_net = bf.networks.InvariantNetwork()
inference_net = bf.networks.InvertibleNetwork(n_params=2)
amortizer = bf.amortizers.AmortizedPosterior(inference_net, summary_net)

# Next, we connect the `prior` with the `simulator` using a `GenerativeModel` wrapper:
generative_model = bf.simulation.GenerativeModel(prior, simulator)

# Finally, we connect the networks with the generative model via a `Trainer` instance:
trainer = bf.trainers.Trainer(network=amortizer, generative_model=generative_model)

# We are now ready to train an amortized posterior approximator. For instance, to run online training
# (10 epochs of 500 iterations, with 32 simulations per batch), we simply call:
losses = trainer.train_online(epochs=10, iterations_per_epoch=500, batch_size=32)

# Amortized posterior inference on 100 new data sets is then fast and easy:
new_data = generative_model(100)
samples = amortizer.sample(new_data, n_samples=5000)
```
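
The posterior draws can then be summarized with plain NumPy. The following is a minimal sketch, assuming `samples` comes back as an array of shape `(100, 5000, 2)`, i.e. 5000 posterior draws over the 2 parameters for each of the 100 data sets; the exact return structure may vary across BayesFlow versions:

```python
import numpy as np

# Minimal post-processing sketch (assumes `samples` has shape (n_data_sets, n_samples, n_params)):
posterior_means = samples.mean(axis=1)   # (100, 2): posterior mean per data set
posterior_stds = samples.std(axis=1)     # (100, 2): posterior standard deviation per data set

# 90% central credible interval for each parameter of the first data set
lower, upper = np.quantile(samples[0], [0.05, 0.95], axis=0)
print("Posterior mean:", posterior_means[0], "90% CI:", list(zip(lower, upper)))
```

Because the toy model pairs a Gaussian prior with a Gaussian likelihood, the true posterior is available in closed form, so these summaries can be checked analytically.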
### Further Reading

Coming soon...
