
Commit 51a98cb

Merge commit with 2 parents: 72fb8dd + d517b38
Commit message: 'merge'

80 files changed: +15842 / -285 lines


.github/workflows/docs.yml

Lines changed: 54 additions & 0 deletions
@@ -0,0 +1,54 @@
+# From https://github.com/eeholmes/readthedoc-test/blob/main/.github/workflows/docs_pages.yml
+name: docs
+
+# execute this workflow automatically when we push to master
+on:
+  push:
+    branches:
+      - master
+
+jobs:
+
+  build_docs:
+    runs-on: ubuntu-latest
+
+    steps:
+      - name: Checkout main
+        uses: actions/checkout@v3
+        with:
+          path: master
+
+      - name: Checkout gh-pages
+        uses: actions/checkout@v3
+        with:
+          path: gh-pages
+          ref: gh-pages
+
+      - name: Set up Python
+        uses: actions/setup-python@v4
+        with:
+          python-version: 3.11
+          cache: 'pip'
+
+      - name: Install dependencies
+        run: |
+          cd ./master
+          python -m pip install .[docs]
+      - name: Make the Sphinx docs
+        run: |
+          cd ./master/docsrc
+          make clean
+          make github
+      - name: Commit changes to docs
+        run: |
+          cd ./gh-pages
+          cp -R ../master/docs/* ./
+          git config --local user.email ""
+          git config --local user.name "github-actions"
+          git add -A
+          if ! git diff-index --quiet HEAD; then
+            git commit -m "auto: Rebuild docs."
+            git push
+          else
+            echo No commit made because the docs have not changed.
+          fi

.github/workflows/tests.yml

Lines changed: 6 additions & 2 deletions
@@ -1,8 +1,12 @@
 name: Tests
 
 on:
-  - push
-  - pull_request
+  pull_request:
+  push:
+    branches:
+      - master
+      - Development
+
 
 jobs:
   test:

.gitignore

Lines changed: 5 additions & 3 deletions
@@ -4,11 +4,10 @@ __pycache__/
 */__pycache__/
 projects/
 */bayesflow.egg-info
-docs/build/
+docsrc/_build/
 build
+docs/
 
-# Notebooks
-docs/source/tutorial_notebooks/**
 
 # mypy
 .mypy_cache
@@ -31,3 +30,6 @@ docs/source/tutorial_notebooks/**
 
 # tox
 .tox
+
+# MacOS
+.DS_Store

.readthedocs.yaml

Lines changed: 0 additions & 17 deletions
This file was deleted.

CONTRIBUTING.md renamed to CONTRIBUTING

Lines changed: 24 additions & 1 deletion
@@ -1,5 +1,5 @@
 Contributing to BayesFlow
-==========
+=========================
 
 Workflow
 --------
@@ -65,3 +65,26 @@ You can run the all tests locally via:
 Or a specific test via:
 
     pytest -e test_[mytest]
+
+Tutorial Notebooks
+------------------
+
+New tutorial notebooks are always welcome! You can add your tutorial notebook file to `examples/` and add a reference
+to the list of notebooks in `docsrc/source/examples.rst`.
+Re-build the documentation (see below) and your notebook will be included.
+
+Documentation
+-------------
+
+The documentation uses [sphinx](https://www.sphinx-doc.org/) and relies on [numpy style docstrings](https://numpydoc.readthedocs.io/en/latest/format.html) in classes and functions.
+The overall *structure* of the documentation is manually designed. This also applies to the API documentation. This has two implications for you:
+
+1. If you add to existing submodules, the documentation will update automatically (given that you use proper numpy docstrings).
+2. If you add a new submodule or subpackage, you need to add a file to `docsrc/source/api` and a reference to the new module to the appropriate section of `docsrc/source/api/bayesflow.rst`.
+
+You can re-build the documentation with
+
+    cd docsrc/
+    make clean && make github
+
+The entry point of the rendered documentation will be at `docs/index.html`.
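
The Documentation section added above relies on contributors writing numpy-style docstrings. For orientation, here is a minimal sketch of that format on a hypothetical helper; the function name and parameters are made up for illustration and are not part of BayesFlow:

    def scale_samples(samples, factor=1.0):
        """Scales an array of posterior samples by a constant factor.

        Parameters
        ----------
        samples : np.ndarray of shape (num_samples, num_params)
            The posterior samples to scale.
        factor : float, optional, default: 1.0
            The multiplicative scaling factor.

        Returns
        -------
        scaled : np.ndarray of shape (num_samples, num_params)
            The scaled samples.
        """
        return samples * factor

Note the plain ``Returns`` / ``-------`` header, which matches the docstring headers standardized elsewhere in this commit.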

INSTALL.rst

Lines changed: 11 additions & 2 deletions
@@ -1,5 +1,5 @@
-Install
-=======
+Full Installation Instructions
+==============================
 
 Requirements
 ------------
@@ -34,6 +34,15 @@ and activate via
 
    conda activate bf
 
+Install via pip
+------------------
+
+Install BayesFlow from PyPI via
+
+.. code-block:: bash
+
+   pip install bayesflow
+
 Install from GitHub
 -------------------
 

bayesflow/amortizers.py

Lines changed: 5 additions & 5 deletions
@@ -219,7 +219,7 @@ def compute_loss(self, input_dict, **kwargs):
         # Case dynamic latent space - function of summary conditions
         if self.latent_is_dynamic:
             logpdf = self.latent_dist(sum_out).log_prob(z)
-        # Case static latent space
+        # Case _static latent space
         else:
             logpdf = self.latent_dist.log_prob(z)
 
@@ -297,7 +297,7 @@ def sample(self, input_dict, n_samples, to_numpy=True, **kwargs):
         if self.latent_is_dynamic:
             z_samples = self.latent_dist(conditions).sample(n_samples)
             z_samples = tf.transpose(z_samples, (1, 0, 2))
-        # Case static latent - marginal samples from the specified dist
+        # Case _static latent - marginal samples from the specified dist
         else:
             z_samples = self.latent_dist.sample((n_data_sets, n_samples))
 
@@ -382,7 +382,7 @@ def log_posterior(self, input_dict, to_numpy=True, **kwargs):
         # Case dynamic latent - function of conditions
         if self.latent_is_dynamic:
             log_post = self.latent_dist(conditions).log_prob(z) + log_det_J
-        # Case static latent - marginal samples from z
+        # Case _static latent - marginal samples from z
         else:
             log_post = self.latent_dist.log_prob(z) + log_det_J
         self._check_output_sanity(log_post)
@@ -1103,8 +1103,8 @@ def sample(self, input_dict, n_samples, to_numpy=True, **kwargs):
         **kwargs : dict, optional, default: {}
             Additional keyword arguments passed to the summary network as the amortizers
 
-        Returns:
-        --------
+        Returns
+        -------
         samples_dict : dict
             A dictionary with keys `global_samples` and `local_samples`
             Local samples will hold an array-like of shape (num_replicas, num_samples, num_local)
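
The comments touched in this diff distinguish a dynamic latent distribution (a callable of the summary conditions) from a static marginal one. The following is a minimal TensorFlow Probability sketch of that distinction, not BayesFlow's actual implementation; the split of the summary output into mean and scale is an assumption made purely for illustration:

    import tensorflow as tf
    import tensorflow_probability as tfp

    tfd = tfp.distributions
    latent_dim = 4

    # Static case: one fixed marginal latent distribution.
    static_latent_dist = tfd.MultivariateNormalDiag(loc=tf.zeros(latent_dim))

    # Dynamic case: a callable mapping summary-network outputs to a distribution.
    def dynamic_latent_dist(sum_out):
        # Hypothetical parameterization: first half of the summary output is the
        # mean, second half the softplus-transformed scale.
        loc, raw_scale = tf.split(sum_out, 2, axis=-1)
        return tfd.MultivariateNormalDiag(loc=loc, scale_diag=tf.nn.softplus(raw_scale))

    z = tf.random.normal((8, latent_dim))            # latent samples, batch of 8
    sum_out = tf.random.normal((8, 2 * latent_dim))  # stand-in for summary-network output

    logpdf_static = static_latent_dist.log_prob(z)              # shape (8,)
    logpdf_dynamic = dynamic_latent_dist(sum_out).log_prob(z)   # shape (8,)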

bayesflow/computational_utilities.py

Lines changed: 4 additions & 4 deletions
@@ -59,8 +59,8 @@ def posterior_calibration_error(
     max_quantile : float in (0, 1), optional, default: 0.995
         The maximum posterior quantile to consider
 
-    Returns:
-    --------
+    Returns
+    -------
     calibration_errors : np.ndarray of shape (num_params, ) or (alpha_resolution, num_params),
         if ``aggregator_fun is None``.
         The aggregated calibration error per marginal posterior.
@@ -248,8 +248,8 @@ def expected_calibration_error(m_true, m_pred, num_bins=10):
        Obtaining well calibrated probabilities using bayesian binning.
        In Proceedings of the AAAI conference on artificial intelligence (Vol. 29, No. 1).
 
-    Important
-    ---------
+    Notes
+    -----
     Make sure that ``m_true`` are **one-hot encoded** classes!
 
     Parameters
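
The relocated note stresses that ``m_true`` must be one-hot encoded before calling ``expected_calibration_error``. A small illustration with made-up labels (the values below are hypothetical):

    import numpy as np

    labels = np.array([0, 2, 1, 2])     # integer class labels
    m_true = np.eye(3)[labels]          # one-hot encoded, shape (4, 3)
    m_pred = np.array([[0.8, 0.1, 0.1],
                       [0.2, 0.3, 0.5],
                       [0.1, 0.7, 0.2],
                       [0.1, 0.2, 0.7]])  # predicted class probabilities

    # ece = expected_calibration_error(m_true, m_pred, num_bins=10)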

bayesflow/coupling_networks.py

Lines changed: 2 additions & 2 deletions
@@ -601,8 +601,8 @@ def call(self, target_or_z, condition, inverse=False, **kwargs):
         target : tf.Tensor
             If inverse=True: The back-transformed z, shape (batch_size, inp_dim)
 
-        Important
-        ---------
+        Notes
+        -----
         If ``inverse=False``, the return is ``(z, log_det_J)``.\n
         If ``inverse=True``, the return is ``target``
         """

bayesflow/helper_networks.py

Lines changed: 21 additions & 15 deletions
@@ -292,24 +292,30 @@ def call(self, inputs):
 class ActNorm(tf.keras.Model):
     """Implements an Activation Normalization (ActNorm) Layer.
     Activation Normalization is learned invertible normalization, using
-    a Scale (s) and Bias (b) vector [1].
-    y = s * x + b (forward)
-    x = (y - b)/s (inverse)
+    a Scale (s) and Bias (b) vector::
 
-    The scale and bias can be data dependent initalized, such that the
-    output has a mean of zero and standard deviation of one [1,2].
+        y = s * x + b (forward)
+        x = (y - b)/s (inverse)
+
+    Notes
+    -----
+
+    The scale and bias can be data dependent initialized, such that the
+    output has a mean of zero and standard deviation of one [1]_[2]_.
     Alternatively, it is initialized with vectors of ones (scale) and
     zeros (bias).
 
-    [1] - Kingma, Diederik P., and Prafulla Dhariwal.
-    "Glow: Generative flow with invertible 1x1 convolutions."
-    arXiv preprint arXiv:1807.03039 (2018).
+    References
+    ----------
+
+    .. [1] Kingma, Diederik P., and Prafulla Dhariwal.
+       "Glow: Generative flow with invertible 1x1 convolutions."
+       arXiv preprint arXiv:1807.03039 (2018).
 
-    [2] - Salimans, Tim, and Durk P. Kingma.
-    "Weight normalization: A simple reparameterization to accelerate
-    training of deep neural networks."
-    Advances in neural information processing systems 29
-    (2016): 901-909.
+    .. [2] Salimans, Tim, and Durk P. Kingma.
+       "Weight normalization: A simple reparameterization to accelerate
+       training of deep neural networks."
+       Advances in neural information processing systems 29 (2016): 901-909.
     """
 
     def __init__(self, latent_dim, act_norm_init, **kwargs):
@@ -353,8 +359,8 @@ def call(self, target, inverse=False):
         target : tf.Tensor
             If inverse=True: The inversly transformed targets, shape == target.shape
 
-        Important
-        ---------
+        Notes
+        -----
         If ``inverse=False``, the return is ``(z, log_det_J)``.\n
         If ``inverse=True``, the return is ``target``.
         """
