trainer = bf.trainers.Trainer(amortizer=amortizer, generative_model=generative_model)

# We are now ready to train an amortized posterior approximator. For instance, to run online training, we simply call:
losses = trainer.train_online(epochs=10, iterations_per_epoch=500, batch_size=32)
```
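The idea behind online training — simulating a fresh batch at every iteration instead of reusing a fixed training set — can be illustrated with a toy example. The sketch below is plain NumPy with a linear "amortizer" fit by SGD; it is not BayesFlow internals, and all names in it are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Schematic online training: each iteration draws a fresh simulated batch.
# Toy model: theta ~ N(0, 1), y ~ N(theta, 1); "network" is a single weight w.
w, lr = 0.0, 0.05
losses = []
for _ in range(500):
    theta = rng.normal(size=32)             # prior draws
    y = theta + rng.normal(size=32)         # simulated observations
    pred = w * y                            # toy amortizer prediction
    grad = 2 * ((pred - theta) * y).mean()  # MSE gradient w.r.t. w
    w -= lr * grad
    losses.append(((pred - theta) ** 2).mean())
```

For this toy model the optimal linear estimator has slope 0.5, which the loop approaches as the loss decreases.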

Before inference, we can use simulation-based calibration (SBC, https://arxiv.org/abs/1804.06788) to check the computational faithfulness of the model-amortizer combination:
```python
fig = trainer.diagnose_sbc_histograms(plot_args=dict(param_names=[r'$\theta_1$', r'$\theta_2$']))
```
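Conceptually, SBC checks that the rank of each ground-truth parameter among its own posterior draws is uniformly distributed. A minimal self-contained NumPy sketch on a conjugate Gaussian toy model (not BayesFlow code; the model and all names are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n_sims, n_draws = 1000, 99

# Conjugate toy model: theta ~ N(0, 1), one observation y ~ N(theta, 1).
theta_true = rng.normal(size=n_sims)
y = theta_true + rng.normal(size=n_sims)

# Exact posterior is N(y / 2, 1 / 2); draw n_draws posterior samples per data set.
post = rng.normal(y[:, None] / 2, np.sqrt(0.5), size=(n_sims, n_draws))

# SBC rank statistic: position of the true parameter among its posterior draws.
ranks = (post < theta_true[:, None]).sum(axis=1)

# With a faithful posterior, ranks are uniform on {0, ..., n_draws};
# the histogram below should therefore be approximately flat.
hist, _ = np.histogram(ranks, bins=10, range=(0, n_draws + 1))
print(hist)
```

A miscalibrated approximator would instead produce a U-shaped or humped rank histogram, which is what `diagnose_sbc_histograms` visualizes.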
Amortized inference on new (real or simulated) data is then easy and fast:
```python
# Simulate 200 new data sets and generate 500 posterior draws per data set
new_sims = trainer.configurator(generative_model(200))
posterior_draws = amortizer.sample(new_sims, n_samples=500)
```
We can then quickly inspect the parameter recoverability of the model
```python
fig = bf.diagnostics.plot_recovery(posterior_draws, new_sims['parameters'])
```
or look at single posteriors in relation to the prior:
```python
fig = bf.diagnostics.plot_posterior_2d(posterior_draws[0], prior=generative_model.prior)
```
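The recovery diagnostic above amounts to checking how well point summaries of the posterior track the ground-truth parameters. A hedged NumPy sketch on the same kind of conjugate Gaussian toy model (illustrative only, not BayesFlow code):

```python
import numpy as np

rng = np.random.default_rng(2)
n_sims, n_draws = 200, 500

# Conjugate toy model: theta ~ N(0, 1), y ~ N(theta, 1), posterior N(y/2, 1/2).
theta_true = rng.normal(size=n_sims)
y = theta_true + rng.normal(size=n_sims)
post = rng.normal(y[:, None] / 2, np.sqrt(0.5), size=(n_sims, n_draws))

# Recovery check: posterior means should track the ground-truth parameters.
post_mean = post.mean(axis=1)
r = np.corrcoef(theta_true, post_mean)[0, 1]
print(round(r, 2))
```

`plot_recovery` shows this relationship as a scatter of posterior summaries against true values; a correlation well below what the noise level allows signals poor recoverability.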

### Further Reading