@@ -18,43 +18,48 @@ class Constrain(ElementwiseTransform):
     """
     Constrains neural network predictions of a data variable to specified bounds.
 
-    Parameters:
+    Parameters
+    ----------
+    * : str
         String containing the name of the data variable to be transformed, e.g. "sigma". See examples below.
 
-    Named Parameters:
-        lower: Lower bound for named data variable.
-        upper: Upper bound for named data variable.
-        method: Method by which to shrink the network predictions space to specified bounds. Choose from
-            - Double bounded methods: sigmoid, expit, (default = sigmoid)
-            - Lower bound only methods: softplus, exp, (default = softplus)
-            - Upper bound only methods: softplus, exp, (default = softplus)
-        inclusive: Indicates which bounds are inclusive (or exclusive).
-            - "both" (default): Both lower and upper bounds are inclusive.
-            - "lower": Lower bound is inclusive, upper bound is exclusive.
-            - "upper": Lower bound is exclusive, upper bound is inclusive.
-            - "none": Both lower and upper bounds are exclusive.
-        epsilon: Small value to ensure inclusive bounds are not violated.
-            Current default is 1e-15 as this ensures finite outcomes
-            with the default transformations applied to data exactly at the boundaries.
-
-
-    Examples:
-        1) Let sigma be the standard deviation of a normal distribution,
-        then sigma should always be greater than zero.
-
-        Usage:
-        adapter = (
-            bf.Adapter()
-            .constrain("sigma", lower=0)
-        )
-
-        2) Suppose p is the parameter for a binomial distribution where p must be in
-        [0,1] then we would constrain the neural network to estimate p in the following way.
-
-        Usage:
-        >>> import bayesflow as bf
-        >>> adapter = bf.Adapter()
-        >>> adapter.constrain("p", lower=0, upper=1, method="sigmoid", inclusive="both")
+    lower : int or float or np.ndarray, optional
+        Lower bound for the named data variable.
+    upper : int or float or np.ndarray, optional
+        Upper bound for the named data variable.
+    method : str, optional
+        Method by which to map the unconstrained network prediction space to the specified bounds. Choose from:
+        - Double-bounded methods: sigmoid, expit (default: sigmoid)
+        - Lower-bound-only methods: softplus, exp (default: softplus)
+        - Upper-bound-only methods: softplus, exp (default: softplus)
+    inclusive : {'both', 'lower', 'upper', 'none'}, optional
+        Indicates which bounds are inclusive (or exclusive).
+        - "both" (default): Both lower and upper bounds are inclusive.
+        - "lower": The lower bound is inclusive, the upper bound is exclusive.
+        - "upper": The lower bound is exclusive, the upper bound is inclusive.
+        - "none": Both lower and upper bounds are exclusive.
+    epsilon : float, optional
+        Small value to ensure that inclusive bounds are not violated.
+        The default is 1e-15, which ensures finite outcomes
+        when the default transformations are applied to data exactly at the boundaries.
+
+    Examples
+    --------
+    1) Let sigma be the standard deviation of a normal distribution;
+    then sigma should always be greater than zero.
+
+    >>> import bayesflow as bf
+    >>> adapter = (
+    ...     bf.Adapter()
+    ...     .constrain("sigma", lower=0)
+    ... )
+
+    2) Suppose p is the parameter of a binomial distribution, where p must lie in
+    [0, 1]; then we would constrain the neural network estimate of p as follows.
+
+    >>> import bayesflow as bf
+    >>> adapter = bf.Adapter()
+    >>> adapter.constrain("p", lower=0, upper=1, method="sigmoid", inclusive="both")
     """
 
     def __init__(
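
To make the transform semantics above concrete for reviewers, here is a minimal NumPy sketch of the two default methods the docstring names (sigmoid for double-bounded variables, softplus for lower-bounded ones) and of the role epsilon plays at inclusive boundaries. The function names and details are illustrative assumptions, not BayesFlow's internal implementation.

    import numpy as np

    def constrain_sigmoid(x, lower, upper):
        # Double-bounded "sigmoid" method: squash unconstrained predictions
        # x in (-inf, inf) into the interval (lower, upper).
        return lower + (upper - lower) / (1.0 + np.exp(-x))

    def unconstrain_sigmoid(y, lower, upper, epsilon=1e-15):
        # Inverse (logit) map. Clipping by epsilon keeps the logit finite
        # when inclusive bounds place data exactly at lower or upper.
        p = np.clip((y - lower) / (upper - lower), epsilon, 1.0 - epsilon)
        return np.log(p) - np.log1p(-p)

    def constrain_softplus(x, lower):
        # Lower-bound-only "softplus" method: map x into (lower, inf).
        return lower + np.logaddexp(0.0, x)  # softplus(x) = log(1 + exp(x))

    def unconstrain_softplus(y, lower, epsilon=1e-15):
        # Inverse softplus; epsilon keeps the log finite at y == lower.
        z = np.maximum(y - lower, epsilon)
        return z + np.log(-np.expm1(-z))

With epsilon = 1e-15, a value sitting exactly on an inclusive bound maps to a large but finite unconstrained value (|logit| of roughly 34.5), which matches the docstring's note that the default epsilon yields finite outcomes at the boundaries.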