Compute the log likelihood of a multivariate normal distribution in precision form. May be phased out - see https://github.com/pymc-devs/pymc/pull/7895
     Parameters
     ----------
-    value: TODO
+    value: TensorLike
         Query point to compute the log prob at.
-    mean: TODO
+    mean: TensorLike
         Mean vector of the Gaussian.
-    tau: TODO
+    tau: TensorLike
         Precision matrix of the Gaussian (i.e. cov = inv(tau))
+
+    Returns
+    -------
+    logp: TensorLike
+        Log likelihood at value.
+    posdef: TensorLike
+        Boolean indicating whether the precision matrix is positive definite.
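For reference, the quantity this docstring describes is logp = 0.5*log|tau| - (k/2)*log(2*pi) - 0.5*(value - mean)^T tau (value - mean). Below is a minimal NumPy sketch of that behaviour, including the (logp, posdef) return pair; it is illustrative only: the function name and the NumPy formulation are assumptions, while the PR's actual implementation is a symbolic PyTensor graph.

import numpy as np

def mvn_logp_precision(value, mean, tau):
    # Sketch only: mirrors the docstring above for a 1-D value/mean and a (k, k) tau.
    k = mean.shape[-1]
    delta = value - mean
    try:
        chol = np.linalg.cholesky(tau)  # succeeds iff tau is positive definite
    except np.linalg.LinAlgError:
        return -np.inf, False
    logdet_tau = 2.0 * np.log(np.diag(chol)).sum()
    quad = delta @ tau @ delta
    return 0.5 * (logdet_tau - k * np.log(2.0 * np.pi) - quad), True

For a standard bivariate normal evaluated at its mean (value = mean = zeros(2), tau = eye(2)) this returns (-log(2*pi), True), i.e. roughly (-1.8379, True).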
 # logp(x | y, params) using laplace approx evaluated at x0
-# This step is also expensive (but not as much as minimize). Could be made more efficient by recycling hessian from the minimizer step, however that requires a bespoke algorithm described in Rasmussen & Williams
-# since the general optimisation scheme maximises logp(x | y, params) rather than logp(y | x, params), and thus the hessian that comes out of methods
-# like L-BFGS-B is in fact not the hessian of logp(y | x, params)
+# This step is also expensive (but not as much as minimize). It could be made more efficient by recycling the hessian from the minimizer step; however, that requires a bespoke algorithm described in Rasmussen & Williams,
+# since the general optimisation scheme maximises logp(x | y, params) rather than logp(y | x, params), and thus the hessian that comes out of methods
+# like L-BFGS-B is in fact not the hessian of logp(y | x, params)
+hess = pytensor.gradient.hessian(log_likelihood, x)
+
+# Evaluate logp of Laplace approx N(x*, Q - f"(x*)) at some point x
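As a concrete illustration of the hessian step above, here is a runnable toy example; the names x and log_likelihood and the stand-in likelihood are assumptions for illustration, not the PR's actual graph.

import numpy as np
import pytensor
import pytensor.tensor as pt

# Toy stand-in for logp(y | x, params): any smooth scalar function of x works here.
x = pt.dvector("x")
log_likelihood = -pt.sum(pt.exp(x) - 2.0 * x)

# The expensive call discussed in the comments above. Note that it differentiates
# logp(y | x, params); the approximate hessian tracked by L-BFGS-B belongs to
# logp(x | y, params), which is why it cannot simply be recycled here.
hess = pytensor.gradient.hessian(log_likelihood, x)

f = pytensor.function([x], hess)
print(f(np.zeros(3)))  # diag(-exp(0)) = -I for this toy likelihood

Evaluated at the mode x*, the Laplace precision is then Q - hess, matching the N(x*, Q - f"(x*)) comment above; given a positive definite prior precision Q, it stays positive definite whenever the likelihood term is log-concave.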