Next, we define a generative model that connects a *prior* (a function returning random draws from the prior distribution over parameters) with a *simulator* (a function accepting the prior draws as arguments and returning a simulated data set with *n_obs* potentially multivariate observations).
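For concreteness, here is a minimal sketch of such a pair of functions; the Gaussian toy model, the parameter dimension, and the names `prior` and `simulator` are illustrative assumptions rather than the repository's actual example:

```python
import numpy as np

def prior(batch_size=1):
    """Return `batch_size` random draws from a standard normal prior over 2 parameters."""
    return np.random.normal(size=(batch_size, 2))

def simulator(theta, n_obs=50):
    """Simulate `n_obs` bivariate observations per parameter draw (Gaussian with mean theta)."""
    batch_size, n_params = theta.shape
    noise = np.random.normal(size=(batch_size, n_obs, n_params))
    return theta[:, np.newaxis, :] + noise
```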
Next, we create a generative model that connects the `prior` with the `simulator`.
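A minimal sketch of this step, together with the training call described next, is shown below. The import path and the class and method names (`GenerativeModel`, `InvertibleNetwork`, `AmortizedPosterior`, `Trainer`, `train_online`) are assumptions modeled on the BayesFlow API and may differ from the version used in this repository:

```python
import bayesflow as bf  # assumed package and import path

# Connect the prior and the simulator into a single generative model (assumed API)
model = bf.simulation.GenerativeModel(prior=prior, simulator=simulator)

# Assumed amortizer setup: an invertible inference network over the 2 parameters
# (a summary network for the n_obs observations would typically be added as well)
inference_net = bf.networks.InvertibleNetwork(num_params=2)
amortizer = bf.amortizers.AmortizedPosterior(inference_net)

# Online training: simulate fresh batches from the generative model on the fly
trainer = bf.trainers.Trainer(amortizer=amortizer, generative_model=model)
losses = trainer.train_online(epochs=10, iterations_per_epoch=500, batch_size=32)
```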
which performs online training for 10 epochs of 500 iterations (batch simulations with 32 simulations per batch). See the [tutorial notebooks](docs/source/tutorial_notebooks) for more examples. Amortized posterior inference on 100 new data sets is then fast and easy:
```python
# Obtain 5000 samples from the posterior given obs_data
# (assumption: `amortizer` is the trained amortizer from the step above,
#  and the exact sampling method may differ by version)
posterior_samples = amortizer.sample(obs_data, n_samples=5000)
```
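As a quick usage follow-up (the sample shapes here are assumptions based on the sketches above), summaries for all 100 data sets can be computed directly from the returned array:

```python
# Assumed shape of posterior_samples: (n_data_sets, n_samples, n_params), e.g. (100, 5000, 2)
posterior_means = posterior_samples.mean(axis=1)         # per-data-set posterior means
posterior_stds = posterior_samples.std(axis=1, ddof=1)   # per-data-set posterior standard deviations
```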