Merged
7 changes: 6 additions & 1 deletion .pre-commit-config.yaml
@@ -20,6 +20,11 @@ repos:
     rev: 054bda51dbe278b3e86f27c890e3f3ac877d616c
     hooks:
       - id: validate-cff
+  - repo: https://github.com/sphinx-contrib/sphinx-lint
+    rev: v1.0.0
+    hooks:
+      - id: sphinx-lint
+        args: ["."]
   - repo: https://github.com/lucianopaz/head_of_apache
     rev: "0.0.3"
     hooks:
@@ -31,7 +36,7 @@ repos:
       - --exclude=binder/
       - --exclude=versioneer.py
   - repo: https://github.com/astral-sh/ruff-pre-commit
-    rev: v0.5.4
+    rev: v0.6.5
     hooks:
       - id: ruff
         args: ["--fix", "--output-format=full"]
4 changes: 2 additions & 2 deletions docs/source/guides/Gaussian_Processes.rst
@@ -151,7 +151,7 @@ conditioned on.
 Calling the `prior` method will create a PyMC random variable that represents
 the latent function :math:`f(x) = \mathbf{f}`::

-    f = gp.prior("f", X)
+    f = gp.prior("f", X)

 :code:`f` is a random variable that can be used within a PyMC model like any
 other type of random variable. The first argument is the name of the random
@@ -166,7 +166,7 @@ Usually at this point, inference is performed on the model. The
 distribution over the latent function at arbitrary :math:`x_*` input points,
 :math:`f(x_*)`. To construct the conditional distribution we write::

-    f_star = gp.conditional("f_star", X_star)
+    f_star = gp.conditional("f_star", X_star)

 Additive GPs
 ============
5 changes: 2 additions & 3 deletions docs/source/learn/core_notebooks/GLM_linear.ipynb
@@ -67,9 +67,10 @@
 "import matplotlib.pyplot as plt\n",
 "import numpy as np\n",
 "import pandas as pd\n",
-"import pymc as pm\n",
 "import xarray as xr\n",
 "\n",
+"import pymc as pm\n",
+"\n",
 "from pymc import HalfCauchy, Model, Normal, sample\n",
 "\n",
 "print(f\"Running on PyMC v{pm.__version__}\")"
@@ -256,8 +257,6 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"import sys\n",
-"\n",
 "try:\n",
 "    import bambi as bmb\n",
 "except ImportError:\n",
4 changes: 2 additions & 2 deletions docs/source/learn/core_notebooks/Gaussian_Processes.rst
@@ -148,7 +148,7 @@ conditioned on.
 Calling the `prior` method will create a PyMC random variable that represents
 the latent function :math:`f(x) = \mathbf{f}`::

-    f = gp.prior("f", X)
+    f = gp.prior("f", X)

 :code:`f` is a random variable that can be used within a PyMC model like any
 other type of random variable. The first argument is the name of the random
@@ -163,7 +163,7 @@ Usually at this point, inference is performed on the model. The
 distribution over the latent function at arbitrary :math:`x_*` input points,
 :math:`f(x_*)`. To construct the conditional distribution we write::

-    f_star = gp.conditional("f_star", X_star)
+    f_star = gp.conditional("f_star", X_star)

 .. _additive_gp:
5 changes: 3 additions & 2 deletions docs/source/learn/core_notebooks/dimensionality.ipynb
@@ -37,9 +37,10 @@
 "source": [
 "from functools import partial\n",
 "\n",
-"import pymc as pm\n",
 "import numpy as np\n",
-"import pytensor.tensor as pt"
+"import pytensor.tensor as pt\n",
+"\n",
+"import pymc as pm"
 ]
 },
 {
2 changes: 1 addition & 1 deletion docs/source/learn/core_notebooks/model_comparison.ipynb
@@ -15,8 +15,8 @@
 ],
 "source": [
 "import arviz as az\n",
-"import matplotlib.pyplot as plt\n",
 "import numpy as np\n",
+"\n",
 "import pymc as pm\n",
 "\n",
 "print(f\"Running on PyMC v{pm.__version__}\")"
@@ -36,11 +36,11 @@
 "import arviz as az\n",
 "import matplotlib.pyplot as plt\n",
 "import numpy as np\n",
-"import pymc as pm\n",
 "import xarray as xr\n",
 "\n",
 "from scipy.special import expit as logistic\n",
 "\n",
+"import pymc as pm\n",
 "\n",
 "print(f\"Running on PyMC v{pm.__version__}\")"
 ]
9,844 changes: 4,922 additions & 4,922 deletions docs/source/learn/core_notebooks/pymc_overview.ipynb

Large diffs are not rendered by default.

13 changes: 7 additions & 6 deletions docs/source/learn/core_notebooks/pymc_pytensor.ipynb
@@ -34,12 +34,13 @@
 },
 "outputs": [],
 "source": [
-"import pytensor\n",
-"import pytensor.tensor as pt\n",
-"import pymc as pm\n",
 "import matplotlib.pyplot as plt\n",
 "import numpy as np\n",
-"import scipy.stats"
+"import pytensor\n",
+"import pytensor.tensor as pt\n",
+"import scipy.stats\n",
+"\n",
+"import pymc as pm"
 ]
 },
 {
@@ -1838,7 +1839,7 @@
 "text": [
 "\n",
 "mu_value -> -1.612085713764618\n",
-"sigma_log_value -> -11.324403641427345 \n",
+"sigma_log_value -> -11.324403641427345\n",
 "x_value -> 9.081061466795328\n",
 "\n"
 ]
@@ -1848,7 +1849,7 @@
 "print(\n",
 "    f\"\"\"\n",
 "mu_value -> {scipy.stats.norm.logpdf(x=0, loc=0, scale=2)}\n",
-"sigma_log_value -> {- 10 + scipy.stats.halfnorm.logpdf(x=np.exp(-10), loc=0, scale=3)} \n",
+"sigma_log_value -> {- 10 + scipy.stats.halfnorm.logpdf(x=np.exp(-10), loc=0, scale=3)}\n",
 "x_value -> {scipy.stats.norm.logpdf(x=0, loc=0, scale=np.exp(-10))}\n",
 "\"\"\"\n",
 ")"
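The common thread in the notebook edits above is ruff's import-sorting rules (the hook bumped to v0.6.5): imports are grouped standard library first, then third party, then `pymc` in its own trailing section. A minimal, standard-library-only sketch of that grouping check — the `STDLIB` and `FIRST_PARTY` sets here are illustrative assumptions, not ruff's actual module tables or this repo's configuration:

```python
import ast

# Illustrative stand-ins: ruff derives these from its own stdlib tables and
# the project's first-party setting (assumed here to treat pymc as its own
# last section).
STDLIB = {"ast", "functools", "sys"}
FIRST_PARTY = {"pymc"}


def import_groups(source: str) -> list[str]:
    """Classify each top-level import in `source` into its isort-style group."""
    groups = []
    for node in ast.parse(source).body:
        if isinstance(node, (ast.Import, ast.ImportFrom)):
            root = (node.module if isinstance(node, ast.ImportFrom)
                    else node.names[0].name).split(".")[0]
            if root in STDLIB:
                groups.append("stdlib")
            elif root in FIRST_PARTY:
                groups.append("first-party")
            else:
                groups.append("third-party")
    return groups


# The post-PR cell from dimensionality.ipynb: groups appear in order,
# separated by blank lines.
cell = (
    "from functools import partial\n"
    "\n"
    "import numpy as np\n"
    "import pytensor.tensor as pt\n"
    "\n"
    "import pymc as pm\n"
)
print(import_groups(cell))  # ['stdlib', 'third-party', 'third-party', 'first-party']
```

When the groups come out of order, `ruff --fix` rewrites the cell, which is what produced the moved `"import pymc as pm\n"` lines in every notebook diff above.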