Commit ff75f2e ("update doc")
Parent: c65944c

23 files changed: +451, -372 lines

.github/workflows/docs.yml
Lines changed: 1 addition & 0 deletions

@@ -28,6 +28,7 @@ jobs:
       uses: actions/setup-python@v4
       with:
         python-version: 3.11
+        cache: 'pip'
 
     - name: Install dependencies
       run: |

.gitignore
Lines changed: 5 additions & 3 deletions

@@ -4,11 +4,10 @@ __pycache__/
 */__pycache__/
 projects/
 */bayesflow.egg-info
-docsrc/build/
+docsrc/_build/
 build
+docs/
 
-# Notebooks
-docsrc/source/tutorial_notebooks/**
 
 # mypy
 .mypy_cache
@@ -31,3 +30,6 @@ docsrc/source/tutorial_notebooks/**
 
 # tox
 .tox
+
+# MacOS
+.DS_Store

CONTRIBUTING.md
Lines changed: 24 additions & 1 deletion

@@ -1,5 +1,5 @@
 Contributing to BayesFlow
-==========
+=========================
 
 Workflow
 --------
@@ -65,3 +65,26 @@ You can run the all tests locally via:
 Or a specific test via:
 
     pytest -e test_[mytest]
+
+Tutorial Notebooks
+------------------
+
+New tutorial notebooks are always welcome! You can add your tutorial notebook file to `examples/` and add a reference
+to the list of notebooks in `docsrc/source/examples.rst`.
+Re-build the documentation (see below) and your notebook will be included.
+
+Documentation
+-------------
+
+The documentation uses [sphinx](https://www.sphinx-doc.org/) and relies on [numpy style docstrings](https://numpydoc.readthedocs.io/en/latest/format.html) in classes and functions.
+The overall *structure* of the documentation is manually designed. This also applies to the API documentation. This has two implications for you:
+
+1. If you add to existing submodules, the documentation will update automatically (given that you use proper numpy docstrings).
+2. If you add a new submodule or subpackage, you need to add a file to `docsrc/source/api` and a reference to the new module to the appropriate section of `docsrc/source/api/bayesflow.rst`.
+
+You can re-build the documentation with
+
+    cd docsrc/
+    make clean && make github
+
+The entry point of the rendered documentation will be at `docs/index.html`.
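A note on the numpy docstring style the new Documentation section refers to: below is a minimal sketch of the expected layout, using a hypothetical function (the name and parameters are illustrative only, not part of BayesFlow):

    import numpy as np

    def scale_samples(samples, factor=1.0):
        """Scales an array of posterior samples by a constant factor.

        Parameters
        ----------
        samples : np.ndarray of shape (num_samples, num_params)
            The posterior samples to rescale.
        factor : float, optional, default: 1.0
            The multiplicative scaling factor.

        Returns
        -------
        scaled : np.ndarray of shape (num_samples, num_params)
            The rescaled samples.
        """
        return samples * factor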

bayesflow/amortizers.py
Lines changed: 2 additions & 2 deletions

@@ -1103,8 +1103,8 @@ def sample(self, input_dict, n_samples, to_numpy=True, **kwargs):
         **kwargs : dict, optional, default: {}
             Additional keyword arguments passed to the summary network as the amortizers
 
-        Returns:
-        --------
+        Returns
+        -------
         samples_dict : dict
             A dictionary with keys `global_samples` and `local_samples`
             Local samples will hold an array-like of shape (num_replicas, num_samples, num_local)
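The corrected Returns section documents a dict contract that downstream code can rely on. A hedged usage sketch (``amortizer`` and ``input_dict`` are placeholders for an already-trained amortizer and its configured input, which this diff does not show):

    # Hypothetical usage of the documented return value
    samples_dict = amortizer.sample(input_dict, n_samples=500)

    local_draws = samples_dict["local_samples"]    # shape (num_replicas, num_samples, num_local)
    global_draws = samples_dict["global_samples"]  # draws for the shared (global) parameters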

bayesflow/computational_utilities.py
Lines changed: 4 additions & 4 deletions

@@ -59,8 +59,8 @@ def posterior_calibration_error(
     max_quantile : float in (0, 1), optional, default: 0.995
         The maximum posterior quantile to consider
 
-    Returns:
-    --------
+    Returns
+    -------
     calibration_errors : np.ndarray of shape (num_params, ) or (alpha_resolution, num_params),
         if ``aggregator_fun is None``.
         The aggregated calibration error per marginal posterior.
@@ -248,8 +248,8 @@ def expected_calibration_error(m_true, m_pred, num_bins=10):
        Obtaining well calibrated probabilities using bayesian binning.
        In Proceedings of the AAAI conference on artificial intelligence (Vol. 29, No. 1).
 
-    Important
-    ---------
+    Notes
+    -----
     Make sure that ``m_true`` are **one-hot encoded** classes!
 
     Parameters
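The renamed Notes section stresses that ``m_true`` must be one-hot encoded. A minimal sketch of what that means in practice (the import path follows the file being diffed; the toy labels and probabilities are made up):

    import numpy as np
    from bayesflow.computational_utilities import expected_calibration_error

    # Integer class labels for four instances of a 3-class problem
    labels = np.array([0, 2, 1, 2])

    # One-hot encode: row i has a 1 in column labels[i], zeros elsewhere
    m_true = np.eye(3)[labels]

    # Predicted class probabilities (each row sums to 1)
    m_pred = np.array([
        [0.8, 0.1, 0.1],
        [0.2, 0.2, 0.6],
        [0.1, 0.7, 0.2],
        [0.3, 0.3, 0.4],
    ])

    ece = expected_calibration_error(m_true, m_pred, num_bins=10)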

bayesflow/coupling_networks.py
Lines changed: 2 additions & 2 deletions

@@ -601,8 +601,8 @@ def call(self, target_or_z, condition, inverse=False, **kwargs):
         target : tf.Tensor
             If inverse=True: The back-transformed z, shape (batch_size, inp_dim)
 
-        Important
-        ---------
+        Notes
+        -----
         If ``inverse=False``, the return is ``(z, log_det_J)``.\n
         If ``inverse=True``, the return is ``target``
         """

bayesflow/helper_networks.py
Lines changed: 21 additions & 15 deletions

@@ -292,24 +292,30 @@ def call(self, inputs):
 class ActNorm(tf.keras.Model):
     """Implements an Activation Normalization (ActNorm) Layer.
     Activation Normalization is learned invertible normalization, using
-    a Scale (s) and Bias (b) vector [1].
-        y = s * x + b (forward)
-        x = (y - b)/s (inverse)
+    a Scale (s) and Bias (b) vector::
 
-    The scale and bias can be data dependent initalized, such that the
-    output has a mean of zero and standard deviation of one [1,2].
+        y = s * x + b (forward)
+        x = (y - b)/s (inverse)
+
+    Notes
+    -----
+
+    The scale and bias can be data dependent initialized, such that the
+    output has a mean of zero and standard deviation of one [1]_[2]_.
     Alternatively, it is initialized with vectors of ones (scale) and
     zeros (bias).
 
-    [1] - Kingma, Diederik P., and Prafulla Dhariwal.
-    "Glow: Generative flow with invertible 1x1 convolutions."
-    arXiv preprint arXiv:1807.03039 (2018).
+    References
+    ----------
+
+    .. [1] Kingma, Diederik P., and Prafulla Dhariwal.
+       "Glow: Generative flow with invertible 1x1 convolutions."
+       arXiv preprint arXiv:1807.03039 (2018).
 
-    [2] - Salimans, Tim, and Durk P. Kingma.
-    "Weight normalization: A simple reparameterization to accelerate
-    training of deep neural networks."
-    Advances in neural information processing systems 29
-    (2016): 901-909.
+    .. [2] Salimans, Tim, and Durk P. Kingma.
+       "Weight normalization: A simple reparameterization to accelerate
+       training of deep neural networks."
+       Advances in neural information processing systems 29 (2016): 901-909.
     """
 
     def __init__(self, latent_dim, act_norm_init, **kwargs):
@@ -353,8 +359,8 @@ def call(self, target, inverse=False):
         target : tf.Tensor
             If inverse=True: The inversly transformed targets, shape == target.shape
 
-        Important
-        ---------
+        Notes
+        -----
         If ``inverse=False``, the return is ``(z, log_det_J)``.\n
         If ``inverse=True``, the return is ``target``.
         """

bayesflow/inference_networks.py
Lines changed: 2 additions & 2 deletions

@@ -167,8 +167,8 @@ def call(self, targets, condition, inverse=False, **kwargs):
         target : tf.Tensor
             If inverse=True: The transformed out, shape (batch_size, ...)
 
-        Important
-        ---------
+        Notes
+        -----
         If ``inverse=False``, the return is ``(z, log_det_J)``.\n
         If ``inverse=True``, the return is ``target``.
         """

bayesflow/simulation.py
Lines changed: 35 additions & 29 deletions

@@ -52,10 +52,13 @@ class ContextGenerator:
     While the latter can also be considered batchable in principle, batching them would require non-Tensor
     (i.e., non-rectangular) data structures, which usually means inefficient computations.
 
+    Examples
+    --------
     Example for a simulation context which will generate a random number of observations between 1 and 100 for
     each training batch:
 
     >>> gen = ContextGenerator(non_batchable_context_fun=lambda : np.random.randint(1, 101))
+
     """
 
     def __init__(
@@ -103,8 +106,8 @@ def __call__(self, batch_size, *args, **kwargs):
 
         context_dict : dictionary
             A dictionary with context variables with the following keys:
-            `batchable_context` : value
-            `non_batchable_context` : value
+            ``batchable_context`` : value
+            ``non_batchable_context`` : value
 
         Note, that the values of the context variables will be None, if the
         corresponding context-generating functions have not been provided when
@@ -210,7 +213,7 @@
         self.is_batched = False
 
     def __call__(self, batch_size, *args, **kwargs):
-        """Generates `batch_size` draws from the prior given optional context generator.
+        """Generates ``batch_size`` draws from the prior given optional context generator.
 
         Parameters
         ----------
@@ -313,12 +316,12 @@
 
     def plot_prior2d(self, **kwargs):
         """Generates a 2D plot representing bivariate prior ditributions. Uses the function
-        `bayesflow.diagnostics.plot_prior2d() internally for generating the plot.
+        ``bayesflow.diagnostics.plot_prior2d()`` internally for generating the plot.
 
         Parameters
         ----------
         **kwargs : dict
-            Optional keyword arguments passed to the `plot_prior2d` function.
+            Optional keyword arguments passed to the ``plot_prior2d`` function.
 
         Returns
        -------
@@ -400,9 +403,10 @@
             An optional function (ideally an instance of ``ContextGenerator``) for generating control variables
             for the local_prior_fun.
 
-        Example: Varying number of local factors (e.g., groups, participants) between 1 and 100:
+        Examples
+        --------
+        Varying number of local factors (e.g., groups, participants) between 1 and 100::
 
-        ``
         def draw_hyper():
             # Draw location for 2D conditional prior
             return np.random.normal(size=2)
@@ -415,6 +419,7 @@ def draw_prior(means, num_groups, sigma=1.):
         context = ContextGenerator(non_batchable_context_fun=lambda : np.random.randint(1, 101))
         prior = TwoLevelPrior(draw_hyper, draw_prior, local_context_generator=context)
         prior_dict = prior(batch_size=32)
+
         """
 
         self.hyper_prior = hyper_prior_fun
@@ -512,19 +517,19 @@ class Simulator:
 
     An optional context generator (i.e., an instance of ContextGenerator) or a user-defined callable object
     implementing the following two methods can be provided:
-        - context_generator.batchable_context(batch_size)
-        - context_generator.non_batchable_context()
+        - ``context_generator.batchable_context(batch_size)``
+        - ``context_generator.non_batchable_context()``
     """
 
     def __init__(self, batch_simulator_fun=None, simulator_fun=None, context_generator=None):
         """Instantiates a data generator which will perform randomized simulations given a set of parameters and optional context.
-        Either a batch_simulator_fun or simulator_fun, but not both, should be provided to instantiate a Simulator object.
+        Either a ``batch_simulator_fun`` or ``simulator_fun``, but not both, should be provided to instantiate a ``Simulator`` object.
 
-        If a batch_simulator_fun is provided, the interface will assume that the function operates on batches of parameter
+        If a ``batch_simulator_fun`` is provided, the interface will assume that the function operates on batches of parameter
         vectors and context variables and will pass the latter directly to the function. Power users should attempt to provide
         optimized batched simulators.
 
-        If a simulator_fun is provided, the interface will assume thatthe function operates on single parameter vectors and
+        If a ``simulator_fun`` is provided, the interface will assume thatthe function operates on single parameter vectors and
         context variables and will wrap the simulator internally to allow batched functionality.
 
         Parameters
@@ -535,8 +540,8 @@ def __init__(self, batch_simulator_fun=None, simulator_fun=None, context_generat
         simulator_fun : callable
             A function (callable object) with optional control arguments responsible for generating a simulaiton given
             a single parameter vector and optional variables.
-        context generator : callable (default None, recommended instance of ContextGenerator)
-            An optional function (ideally an instance of ContextGenerator) for generating prior context variables.
+        context_generator : callable (default None, recommended instance of ContextGenerator)
+            An optional function (ideally an instance of ``ContextGenerator``) for generating prior context variables.
         """
 
         if (batch_simulator_fun is None) is (simulator_fun is None):
@@ -562,9 +567,9 @@ def __call__(self, params, *args, **kwargs):
 
         out_dict : dictionary
             An output dictionary with randomly simulated variables, the following keys are mandatory, if default keys not modified:
-            `sim_data` : value
-            `non_batchable_context` : value
-            `batchable_context` : value
+            ``sim_data`` : value
+            ``non_batchable_context`` : value
+            ``batchable_context`` : value
         """
 
         # Always assume first dimension is batch dimension
@@ -728,8 +733,8 @@
         name : str (default - "anonoymous")
             An optional name for the generative model. If kept default (None), 'anonymous' is set as name.
 
-        Important
-        ----------
+        Notes
+        -----
         If you are not using the provided ``Prior`` and ``Simulator`` wrappers for your prior and data generator,
         only functions returning a ``np.ndarray`` in the correct format will be accepted, since these will be
         wrapped internally. In addition, you need to indicate whether your simulator operates on batched of
@@ -761,7 +766,7 @@
         self._test()
 
     def __call__(self, batch_size, **kwargs):
-        """Carries out forward inference 'batch_size' times."""
+        """Carries out forward inference ``batch_size`` times."""
 
         # Forward inference
         prior_out = self.prior(batch_size, **kwargs.pop("prior_args", {}))
@@ -780,7 +785,7 @@ def __call__(self, batch_size, **kwargs):
         return out_dict
 
     def _config_custom_simulator(self, sim_fun, is_batched):
-        """Only called if user has provided a custom simulator not using the Simulator wrapper."""
+        """Only called if user has provided a custom simulator not using the ``Simulator`` wrapper."""
 
         if is_batched is None:
             raise ConfigurationError(
@@ -796,8 +801,8 @@ def _config_custom_simulator(self, sim_fun, is_batched):
     def plot_pushforward(
         self, parameter_draws=None, funcs_list=None, funcs_labels=None, batch_size=1000, show_raw_sims=True
     ):
-        """Creates simulations from parameter_draws (generated from self.prior if they are not passed as an argument)
-        and plots visualizations for them.
+        """Creates simulations from ``parameter_draws`` (generated from ``self.prior`` if they are not passed as
+        an argument) and plots visualizations for them.
 
         Parameters
         ----------
@@ -959,16 +964,16 @@ def presimulate_and_save(
         disable_user_input: bool, optional, default: False
             If True, user will not be asked if memory space is sufficient for presimulation.
 
-        Important
-        ----------
+        Notes
+        -----
         One of the following pairs of parameters has to be provided:
 
         - (iterations_per_epoch, epochs),
        - (total_iterations, iterations_per_epoch)
        - (total_iterations, epochs)
 
        Providing all three of the parameters in these pairs leads to a consistency check,
-            since incompatible combinations are possible.
+        since incompatible combinations are possible.
        """
        # Ensure that the combination of parameters provided is sufficient to perform presimulation
        # and does not contain internal contradictions
@@ -1117,6 +1122,7 @@
 
 class TwoLevelGenerativeModel:
     """Basic interface for a generative model in a simulation-based context.
+
     Generally, a generative model consists of two mandatory components:
     - MultilevelPrior : A randomized function returning random parameter draws from a two-level prior distribution;
     - Simulator : A function which transforms the parameters into observables in a non-deterministic manner.
@@ -1149,8 +1155,8 @@ def __init__(
         name : str (default - "anonymous")
             An optional name for the generative model.
 
-        Important
-        ----------
+        Notes
+        -----
         If you are not using the provided ``TwoLevelPrior`` and ``Simulator`` wrappers for your prior and data
         generator, only functions returning a ``np.ndarray`` in the correct format will be accepted, since these will be
         wrapped internally. In addition, you need to indicate whether your simulator operates on batched of
@@ -1205,7 +1211,7 @@ def __call__(self, batch_size, **kwargs):
         return out_dict
 
     def _config_custom_simulator(self, sim_fun, is_batched):
-        """Only called if user has provided a custom simulator not using the Simulator wrapper."""
+        """Only called if user has provided a custom simulator not using the ``Simulator`` wrapper."""
 
         if is_batched is None:
             raise ConfigurationError(
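Taken together, the docstring fixes above spell out the Simulator constructor contract: exactly one of ``batch_simulator_fun`` or ``simulator_fun`` must be given (the ``(batch_simulator_fun is None) is (simulator_fun is None)`` guard rejects both-or-neither). A hedged construction sketch based only on the signatures visible in this diff (the toy simulators are made up, and context handling is omitted since its exact calling convention is not shown here):

    import numpy as np
    from bayesflow.simulation import Simulator

    # Option 1: a per-sample simulator; the wrapper batches it internally
    def simulate_one(theta):
        # one parameter vector in, one simulated data set out
        return np.random.normal(loc=theta, size=(10, theta.shape[0]))

    sim = Simulator(simulator_fun=simulate_one)

    # Option 2: an optimized simulator operating on whole parameter batches
    def simulate_batch(thetas):
        # (batch_size, num_params) -> (batch_size, 10, num_params)
        return np.random.normal(
            loc=thetas[:, None, :],
            size=(thetas.shape[0], 10, thetas.shape[1]),
        )

    sim_batched = Simulator(batch_simulator_fun=simulate_batch)

    # Passing both (or neither) trips the guard and raises ConfigurationError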
