Commit b87a64f

Update README.md [skip ci]

1 parent: 115148b


README.md

Lines changed: 2 additions & 2 deletions
````diff
@@ -193,7 +193,7 @@ def prior_m2(D=2, mu=2., sigma=1.0):
     return np.random.default_rng().normal(loc=mu, scale=sigma, size=D)
 ```
 
-We create both models as before and use a `MultiGenerativeModel` wrapper to combine them in a `meta_model`:
+For the purpose of this illustration, the two toy models only differ with respect to their prior specification ($M_1: \mu = 0, M_2: \mu = 2$). We create both models as before and use a `MultiGenerativeModel` wrapper to combine them in a `meta_model`:
 
 ```python
 model_m1 = bf.simulation.GenerativeModel(prior_m1, simulator, simulator_is_batched=False)
````
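The snippet in this hunk is cut off by the diff context. As a minimal sketch (not the repository's exact code), the second toy model and the `meta_model` mentioned in the changed line might be assembled as follows; the `model_m2` and `meta_model` names come from the README text, while passing a list of models to `MultiGenerativeModel` is an assumption about the wrapper's signature:

```python
# Sketch only, mirroring the BayesFlow simulation API used in the hunk above.
model_m2 = bf.simulation.GenerativeModel(prior_m2, simulator, simulator_is_batched=False)

# Combine both candidate models for amortized model comparison; the list-style
# constructor call is an assumption, not taken from the README diff itself.
meta_model = bf.simulation.MultiGenerativeModel([model_m1, model_m2])
```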
````diff
@@ -228,7 +228,7 @@ When feeding the data to our trained network, we almost immediately obtain poste
 model_probs = amortizer.posterior_probs(sims)
 ```
 
-How good are these predicted probabilities? We can have a look at the calibration:
+How good are these predicted probabilities in the closed world? We can have a look at the calibration:
 
 ```python
 cal_curves = bf.diagnostics.plot_calibration_curves(sims["model_indices"], model_probs)
````
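Beyond the calibration curves produced in the last line of the hunk, the predicted probabilities can be sanity-checked with plain NumPy. The sketch below assumes `sims["model_indices"]` is one-hot encoded with the same `(num_sims, num_models)` shape as `model_probs`:

```python
import numpy as np

# Sketch only: compare the average predicted probability of each model with the
# empirical frequency of that model among the validation simulations.
# Assumes sims["model_indices"] is one-hot encoded, shape (num_sims, num_models).
empirical_freqs = np.mean(sims["model_indices"], axis=0)
mean_pred_probs = np.mean(model_probs, axis=0)
print("Empirical model frequencies:", empirical_freqs)
print("Mean predicted probabilities:", mean_pred_probs)
```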
