36 changes: 36 additions & 0 deletions python/nutpie/transform_adapter.py
@@ -12,6 +12,42 @@ def make_transform_adapter(
untransformed_dim=None,
zero_init=True,
):
"""
Configure the normalizing flows

Parameters
----------
verbose: bool, default False
If True, print console messages about the loss. Warning: this output becomes very noisy when sampling multiple chains.
window_size: int

show_progress: bool, default False
Show a progress bar for each SGD optimization run
nn_depth: int, default 1
Number of hidden layers in each normalizing flow layer
nn_width: int, optional
Width of the neural network. Trades higher computation cost during optimization for better normalizing
flows. Can lead to problems if the number of parameters grows too large -- you may need to increase the
number of tuning steps to fit them.

If None, this is set to half the number of parameters being estimated in the posterior
num_layers: int, default 8
The number of normalizing flow layers. More layers allow a more flexible functional approximation, but
also lead to more numerical instability.
num_diag_windows
learning_rate: float, default 1e-3
Learning rate for the Adam optimizer used to fit the normalizing flow.
scale_layer: bool, default False
Experimental, currently unused.
untransformed_dim: int, optional
How many posterior parameters are passed unchanged in each flow layer.
zero_init: bool, default True
Experimental. If True, the network is initialized with small random values.

Returns
-------

"""
import traceback
from functools import partial

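A hypothetical usage sketch (not part of the PR) showing how the documented keyword arguments might be combined. The import path follows the file location in this diff; how the resulting adapter is then handed to the sampler depends on the nutpie version and is not shown here.

    from nutpie.transform_adapter import make_transform_adapter

    adapter = make_transform_adapter(
        verbose=False,        # console loss messages get noisy with many chains
        show_progress=False,  # one progress bar per SGD optimization run if enabled
        nn_depth=1,           # hidden layers in each normalizing flow layer
        nn_width=None,        # None -> half the number of posterior parameters
        num_layers=8,         # more layers: more flexibility, less numerical stability
        learning_rate=1e-3,   # Adam learning rate for the flow optimization
        untransformed_dim=None,
        zero_init=True,
    )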