Commit 253aaa8

Updated docstring for Potential function (#6559)
* Updated docstring for Potential function
* Updating docstring according to suggestions
* Updated Potential docstring as per suggestion
1 parent 2f945bb commit 253aaa8

File tree

1 file changed: +83 -4 lines changed

pymc/model.py

Lines changed: 83 additions & 4 deletions
@@ -2042,16 +2042,95 @@ def Deterministic(name, var, model=None, dims=None):
 
 
 def Potential(name, var, model=None):
-    """Add an arbitrary factor potential to the model likelihood
+    """
+    Add an arbitrary factor potential to the model likelihood
+
+    The Potential function is used to add arbitrary factors (such as constraints or other likelihood components) to adjust the probability density of the model.
+
+    Warnings
+    --------
+    Potential functions only influence logp-based sampling, such as the sampling used by ``pm.sample``.
+    Potentials modify the log-probability of the model by adding a contribution to the logp, which is used by sampling algorithms that rely on information about the observed data to generate posterior samples.
+    Potentials are not applicable in the context of forward sampling, because they affect only the computation of the logp, not the prior distributions themselves.
+    Forward sampling algorithms generate sample points from the prior distributions of the model, without taking the likelihood function into account.
+    In other words, forward sampling does not use information about the observed data.
+    Hence, Potentials do not affect forward sampling, which is used by ``sample_prior_predictive`` and ``sample_posterior_predictive``.
+    A warning saying "The effect of Potentials on other parameters is ignored during prior predictive sampling" is always emitted to alert the user of this.
 
     Parameters
     ----------
-    name: str
-    var: PyTensor variables
+    name : str
+        Name of the potential variable to be registered in the model.
+    var : tensor_like
+        Expression to be added to the model joint logp.
+    model : Model, optional
+        The model object to which the potential function is added.
+        If ``None`` is provided, the current model is used.
 
     Returns
     -------
-    var: var, with name attribute
+    var : tensor_like
+        The registered, named model variable.
+
+    Examples
+    --------
+    Have a look at the following examples:
+
+    In this example, we define a constraint on ``x`` to be greater than or equal to 0 via the ``pm.Potential`` function.
+    We pass ``pm.math.log(pm.math.switch(constraint, 1, 0.0))`` as the second argument: it evaluates to 0 when the constraint is met and to ``-inf`` when it is violated, and this value is added to the logp of the model.
+    The probability density that this model produces is therefore zero wherever the constraint is violated; cases in which ``x`` is negative are strictly excluded.
+
+    .. code:: python
+
+        with pm.Model() as model:
+            x = pm.Normal("x", mu=0, sigma=1)
+            y = pm.Normal("y", mu=x, sigma=1, observed=data)
+            constraint = x >= 0
+            potential = pm.Potential("x_constraint", pm.math.log(pm.math.switch(constraint, 1, 0.0)))
+
+    However, if we use ``pm.math.log(pm.math.switch(constraint, 1, 0.5))`` instead, the potential still penalizes the likelihood when the constraint is not met, but some deviation is allowed.
+    Here, the Potential function is used to pass a soft constraint.
+    A soft constraint penalizes violations instead of ruling them out entirely.
+    The effect of this is that the posterior probability of the parameters decreases as they move away from the constraint, but does not become exactly zero.
+    This allows the sampler to generate values that violate the constraint, but with lower probability.
+
+    .. code:: python
+
+        with pm.Model() as model:
+            x = pm.Normal("x", mu=0.1, sigma=1)
+            y = pm.Normal("y", mu=x, sigma=1, observed=data)
+            constraint = x >= 0
+            potential = pm.Potential("x_constraint", pm.math.log(pm.math.switch(constraint, 1, 0.5)))
+
+    In this example, a Potential is used to define an arbitrary prior.
+    This prior encodes the knowledge that values of ``max_items`` are likely to be small rather than large.
+    The prior probability of ``max_items`` is defined using a Potential whose value is the log of the inverse of ``max_items``.
+    This means that larger values of ``max_items`` have a lower prior probability density, while smaller values of ``max_items`` have a higher one.
+    When the model is sampled, the posterior distribution of ``max_items`` given the observed value of ``n_items`` will be influenced by the power-law prior defined in the Potential.
+
+    .. code:: python
+
+        with pm.Model():
+            # p(max_items) = 1 / max_items
+            max_items = pm.Uniform("max_items", lower=1, upper=100)
+            pm.Potential("power_prior", pm.math.log(1 / max_items))
+
+            n_items = pm.Uniform("n_items", lower=1, upper=max_items, observed=60)
+
+    In the next example, the ``soft_sum_constraint`` potential encourages ``x`` and ``y`` to have a small sum, effectively adding a soft constraint on the relationship between the two variables.
+    This can be useful when you want the sum of multiple variables to stay within a certain range without enforcing an exact value.
+    The larger the deviation of the sum from zero, the more negative the term ``-((x + y)**2)`` becomes, and the lower the probability of such draws under the adjusted logp.
+    The sampler can still generate values with small deviations, just with lower probability, which is what makes this a soft constraint.
+
+    .. code:: python
+
+        with pm.Model() as model:
+            x = pm.Normal("x", mu=0.1, sigma=1)
+            y = pm.Normal("y", mu=x, sigma=1, observed=data)
+            soft_sum_constraint = pm.Potential("soft_sum_constraint", -((x + y)**2))
+
+    The potential term is incorporated into the model log-probability, so it should be ``-inf`` (or very negative) when a constraint is violated, so that those draws are rejected; a value of 0 has no effect, and positive values make the proposals more likely to be accepted.
+
     """
     model = modelcontext(model)
     var.name = model.name_for(name)
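
As a quick illustration of the Warnings section above, here is a minimal sketch (not part of this commit) showing that a potential shapes logp-based sampling but is ignored by forward sampling. It assumes only the standard PyMC API; the model and variable names are illustrative.

.. code:: python

    import pymc as pm

    with pm.Model() as m:
        x = pm.Normal("x", mu=0, sigma=1)
        # log(switch(x >= 0, 1, 0)) is 0 when x >= 0 and -inf otherwise,
        # i.e. a hard non-negativity constraint on x.
        pm.Potential("x_nonneg", pm.math.log(pm.math.switch(x >= 0, 1, 0)))

        # logp-based sampling respects the potential: posterior draws of x are >= 0.
        idata = pm.sample()

        # Forward sampling ignores the potential: prior draws of x still follow
        # Normal(0, 1), and the warning quoted in the docstring is emitted.
        prior = pm.sample_prior_predictive()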

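The closing point about potential values can also be checked directly by compiling the model logp. This is again an illustrative sketch rather than part of the commit; ``compile_logp`` is the model method available in recent PyMC versions.

.. code:: python

    import pymc as pm

    with pm.Model() as m:
        x = pm.Normal("x", mu=0, sigma=1)
        pm.Potential("x_constraint", pm.math.log(pm.math.switch(x >= 0, 1, 0)))

    logp_fn = m.compile_logp()
    # Constraint satisfied: the potential contributes log(1) = 0, logp is finite.
    print(logp_fn({"x": 1.0}))
    # Constraint violated: the potential contributes log(0) = -inf, so such
    # draws are always rejected.
    print(logp_fn({"x": -1.0}))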