5 changes: 3 additions & 2 deletions .github/workflows/test.yml
@@ -7,6 +7,7 @@ on:
paths:
- ".github/workflows/*"
- "pymc_experimental/**"
- "tests/**"
- "setup.py"
- "pyproject.toml"
- "buildosx"
@@ -20,7 +21,7 @@ jobs:
os: [ubuntu-latest]
python-version: ["3.10"]
test-subset:
- pymc_experimental/tests
- tests
fail-fast: false
runs-on: ${{ matrix.os }}
env:
@@ -58,7 +59,7 @@ jobs:
os: [windows-latest]
python-version: ["3.12"]
test-subset:
- pymc_experimental/tests
- tests
fail-fast: false
runs-on: ${{ matrix.os }}
env:
112 changes: 111 additions & 1 deletion CONTRIBUTING.md
@@ -1,3 +1,113 @@
# Contributing guide

Page in construction, for now go to https://github.com/pymc-devs/pymc-experimental#questions.
Thank you for your interest in contributing to PyMC and PyMC-experimental!

This page outlines the steps to follow if you wish to clone the pymc-experimental repo locally and contribute to it.

## Install locally
**1**: Create a folder `pymc-devs` on your local machine and follow the steps on [cloning PyMC locally](https://www.pymc.io/projects/docs/en/latest/contributing/pr_tutorial.html).

Since PyMC-experimental is tested against the latest version of PyMC, any development work on PyMC-experimental should also work with the latest version of PyMC.

**Review comment (Member):** I don't think that's true, don't we pin a version?

**Reply (Member):** We don't, but we test against the latest.

You should now have a local copy of PyMC under `pymc-devs/pymc`.

**2**: Fork the PyMC-experimental repo and clone it locally:

```
git clone git@github.com:<your GitHub handle>/pymc-experimental.git
cd pymc-experimental
git remote add upstream git@github.com:pymc-devs/pymc-experimental.git
```

You should now have a local copy of PyMC-experimental under `pymc-devs/pymc-experimental`.

Create a new conda environment from the dependency file in the main PyMC repo, then install both packages in editable mode:
```
conda env create -n pymc-experimental -f /path/to/pymc-devs/pymc/conda-envs/environment-dev.yml
conda activate pymc-experimental
pip install -e /path/to/pymc-devs/pymc

# ignore the specific pymc version when installing pymc-experimental
pip install -e /path/to/pymc-devs/pymc-experimental --ignore-installed pymc
```
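
After the install, a quick way to confirm that both editable packages resolve correctly is to import them and print their versions (a minimal sanity check, not part of the official guide; it assumes both packages expose `__version__`):

```
# optional sanity check: both packages should import from your editable checkouts
import pymc
import pymc_experimental

print("pymc:", pymc.__version__)
print("pymc_experimental:", pymc_experimental.__version__)
```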

**3**: Check that `origin` and `upstream` are correct.

**PyMC**
```
cd /path/to/pymc-devs/pymc
git remote -v
```
Output:
```
origin git@github.com:<your GitHub handle>/pymc.git (fetch)
origin git@github.com:<your GitHub handle>/pymc.git (push)
upstream git@github.com:pymc-devs/pymc.git (fetch)
upstream git@github.com:pymc-devs/pymc.git (push)
```

**PyMC-experimental**
```
cd /path/to/pymc-devs/pymc-experimental
git remote -v
```
Output:
```
origin git@github.com:<your GitHub handle>/pymc-experimental.git (fetch)
origin git@github.com:<your GitHub handle>/pymc-experimental.git (push)
upstream git@github.com:pymc-devs/pymc-experimental.git (fetch)
upstream git@github.com:pymc-devs/pymc-experimental.git (push)
```



## Git integration [(from PyMC's main page)](https://www.pymc.io/projects/docs/en/latest/contributing/pr_tutorial.html)

**1** Create a feature branch and develop your feature on it:
```
git checkout -b my-exp-feature
```

**2** Before committing, run pre-commit checks:
```
pip install pre-commit
pre-commit run --all-files # 👈 to run it manually
pre-commit install # 👈 to run it automatically before each commit
```

**3** Stage your changed files with `git add` and commit them with `git commit`:
```
git add modified_files
git commit
```
to record your changes locally.

**4** After committing, it is a good idea to sync with the base repositories in case there have been any changes:
```
# pymc
cd /path/to/pymc-devs/pymc
git fetch upstream
git rebase upstream/main

# (pymc-dev team) Please double check this
pip install -e /path/to/pymc-devs/pymc

# pymc-exp
cd /path/to/pymc-devs/pymc-experimental
git fetch upstream
git rebase upstream/main
```
Then push the changes to the fork in your GitHub account with:
```
git push -u origin my-exp-feature
```

**5** Go to the GitHub web page of your fork of the PyMC-experimental repo. Click the ‘Pull request’ button to send your changes to the project’s maintainers for review. This will send a notification to the committers.

## Final steps

Review the contributing guide on [PyMC's main page](https://www.pymc.io/projects/docs/en/latest/contributing/index.html).

Check the FAQ [page](https://github.com/pymc-devs/pymc-experimental#questions).

Join the Discussions [page](https://github.com/pymc-devs/pymc-experimental/discussions/5).
File renamed without changes.
2 changes: 1 addition & 1 deletion codecov.yml
@@ -22,7 +22,7 @@ coverage:
base: auto

ignore:
- "pymc_experimental/tests/*"
- "tests/*"

comment:
layout: "reach, diff, flags, files"
7 changes: 5 additions & 2 deletions pymc_experimental/inference/pathfinder.py
@@ -95,15 +95,18 @@ def fit_pathfinder(
"""
# Temporarily helper
if version.parse(blackjax.__version__).major < 1:
# test
raise ImportError("fit_pathfinder requires blackjax 1.0 or above")

model = modelcontext(model)

ip = model.initial_point()
ip_map = DictToArrayBijection.map(ip)

new_logprob, new_input = pm.pytensorf.join_nonshared_inputs(
ip, (model.logp(),), model.value_vars, ()
ip,
(model.logp(),),
model.value_vars,
(),
)

logprob_fn_list = get_jaxified_graph([new_input], new_logprob)
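
For context, here is a minimal usage sketch of the function touched in this hunk. The call signature is assumed from the `modelcontext(model)` line above (only the `model` keyword is used), and blackjax >= 1.0 must be installed or the new `ImportError` is raised:

```
# minimal sketch (assumed API): run pathfinder on a toy model;
# requires blackjax >= 1.0, otherwise the ImportError above is raised
import pymc as pm
from pymc_experimental.inference.pathfinder import fit_pathfinder

with pm.Model() as model:
    x = pm.Normal("x", 0.0, 1.0)

idata = fit_pathfinder(model=model)  # assumed to return an InferenceData-style result
```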
38 changes: 38 additions & 0 deletions pymc_experimental/model/transforms/autoreparam.py
@@ -246,6 +246,44 @@ def _(
return vip_rep


@_vip_reparam_node.register
def _(
op: pm.Exponential,
node: Apply,
name: str,
dims: List[Variable],
transform: Optional[Transform],
lam: pt.TensorVariable,
) -> ModelDeterministic:
rng, size, scale = node.inputs
scale_centered = scale**lam
scale_noncentered = scale ** (1 - lam)
vip_rv_ = pm.Exponential.dist(
scale=scale_centered,
size=size,
rng=rng,
)
vip_rv_value_ = vip_rv_.clone()
vip_rv_.name = f"{name}::tau_"
if transform is not None:
vip_rv_value_.name = f"{vip_rv_.name}_{transform.name}__"
else:
vip_rv_value_.name = vip_rv_.name
vip_rv = model_free_rv(
vip_rv_,
vip_rv_value_,
transform,
*dims,
)

vip_rep_ = scale_noncentered * vip_rv

vip_rep_.name = name

vip_rep = model_deterministic(vip_rep_, *dims)
return vip_rep


def vip_reparametrize(
model: pm.Model,
var_names: Sequence[str],
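
As an illustration of what this new registration enables, below is a usage sketch modeled on the tests added later in this PR: an `Exponential` variable can now be passed to `vip_reparametrize`, with its centering controlled through the returned `vip` handle (the import path is assumed from this file's location):

```
# sketch based on the tests in this PR: reparametrize an Exponential variable
# with the variationally inferred parametrization (VIP)
import pymc as pm
from pymc_experimental.model.transforms.autoreparam import vip_reparametrize

with pm.Model() as model:
    s = pm.LogNormal("s")
    pm.Exponential("e", scale=s, shape=7)

model_v, vip = vip_reparametrize(model, ["e"])
vip.set_all_lambda(0.5)  # lambda interpolates between the centered and non-centered forms
```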
File renamed without changes.
File renamed without changes.
@@ -22,7 +22,7 @@
is_conditional_dependent,
marginalize,
)
from pymc_experimental.tests.utils import equal_computations_up_to_root
from tests.utils import equal_computations_up_to_root


@pytest.fixture
@@ -11,6 +11,7 @@ def model_c():
m = pm.Normal("m")
s = pm.LogNormal("s")
pm.Normal("g", m, s, shape=5)
pm.Exponential("e", scale=s, shape=7)
return mod


@@ -20,31 +21,34 @@ def model_nc():
m = pm.Normal("m")
s = pm.LogNormal("s")
pm.Deterministic("g", pm.Normal("z", shape=5) * s + m)
pm.Deterministic("e", pm.Exponential("z_e", 1, shape=7) * s)
return mod


def test_reparametrize_created(model_c: pm.Model):
model_reparam, vip = vip_reparametrize(model_c, ["g"])
assert "g" in vip.get_lambda()
assert "g::lam_logit__" in model_reparam.named_vars
assert "g::tau_" in model_reparam.named_vars
@pytest.mark.parametrize("var", ["g", "e"])
def test_reparametrize_created(model_c: pm.Model, var):
model_reparam, vip = vip_reparametrize(model_c, [var])
assert f"{var}" in vip.get_lambda()
assert f"{var}::lam_logit__" in model_reparam.named_vars
assert f"{var}::tau_" in model_reparam.named_vars
vip.set_all_lambda(1)
assert ~np.isfinite(model_reparam["g::lam_logit__"].get_value()).any()
assert ~np.isfinite(model_reparam[f"{var}::lam_logit__"].get_value()).any()


def test_random_draw(model_c: pm.Model, model_nc):
@pytest.mark.parametrize("var", ["g", "e"])
def test_random_draw(model_c: pm.Model, model_nc, var):
model_c = pm.do(model_c, {"m": 3, "s": 2})
model_nc = pm.do(model_nc, {"m": 3, "s": 2})
model_v, vip = vip_reparametrize(model_c, ["g"])
assert "g" in [v.name for v in model_v.deterministics]
c = pm.draw(model_c["g"], random_seed=42, draws=1000)
nc = pm.draw(model_nc["g"], random_seed=42, draws=1000)
model_v, vip = vip_reparametrize(model_c, [var])
assert var in [v.name for v in model_v.deterministics]
c = pm.draw(model_c[var], random_seed=42, draws=1000)
nc = pm.draw(model_nc[var], random_seed=42, draws=1000)
vip.set_all_lambda(1)
v_1 = pm.draw(model_v["g"], random_seed=42, draws=1000)
v_1 = pm.draw(model_v[var], random_seed=42, draws=1000)
vip.set_all_lambda(0)
v_0 = pm.draw(model_v["g"], random_seed=42, draws=1000)
v_0 = pm.draw(model_v[var], random_seed=42, draws=1000)
vip.set_all_lambda(0.5)
v_05 = pm.draw(model_v["g"], random_seed=42, draws=1000)
v_05 = pm.draw(model_v[var], random_seed=42, draws=1000)
np.testing.assert_allclose(c.mean(), nc.mean())
np.testing.assert_allclose(c.mean(), v_0.mean())
np.testing.assert_allclose(v_05.mean(), v_1.mean())
@@ -57,10 +61,12 @@ def test_random_draw(model_c: pm.Model, model_nc):


def test_reparam_fit(model_c):
model_v, vip = vip_reparametrize(model_c, ["g"])
vars = ["g", "e"]
model_v, vip = vip_reparametrize(model_c, ["g", "e"])
with model_v:
vip.fit(random_seed=42)
np.testing.assert_allclose(vip.get_lambda()["g"], 0, atol=0.01)
vip.fit(50000, random_seed=42)
for var in vars:
np.testing.assert_allclose(vip.get_lambda()[var], 0, atol=0.01)


def test_multilevel():
@@ -17,10 +17,10 @@
SARIMAX_STATE_STRUCTURES,
SHORT_NAME_TO_LONG,
)
from pymc_experimental.tests.statespace.utilities.shared_fixtures import ( # pylint: disable=unused-import
from tests.statespace.utilities.shared_fixtures import ( # pylint: disable=unused-import
rng,
)
from pymc_experimental.tests.statespace.utilities.test_helpers import (
from tests.statespace.utilities.test_helpers import (
load_nile_test_data,
make_stationary_params,
simulate_from_numpy_model,
@@ -11,7 +11,7 @@

from pymc_experimental.statespace import BayesianVARMAX
from pymc_experimental.statespace.utils.constants import SHORT_NAME_TO_LONG
from pymc_experimental.tests.statespace.utilities.shared_fixtures import ( # pylint: disable=unused-import
from tests.statespace.utilities.shared_fixtures import ( # pylint: disable=unused-import
rng,
)

@@ -25,7 +25,7 @@
@pytest.fixture(scope="session")
def data():
df = pd.read_csv(
"pymc_experimental/tests/statespace/test_data/statsmodels_macrodata_processed.csv",
"tests/statespace/test_data/statsmodels_macrodata_processed.csv",
index_col=0,
parse_dates=True,
).astype(floatX)
@@ -18,9 +18,7 @@
NO_FREQ_INFO_WARNING,
NO_TIME_INDEX_WARNING,
)
from pymc_experimental.tests.statespace.utilities.test_helpers import (
load_nile_test_data,
)
from tests.statespace.utilities.test_helpers import load_nile_test_data

function_names = ["pandas_date_freq", "pandas_date_nofreq", "pandas_nodate", "numpy", "pytensor"]
expected_warning = [
@@ -17,10 +17,10 @@
OBS_STATE_DIM,
TIME_DIM,
)
from pymc_experimental.tests.statespace.utilities.shared_fixtures import ( # pylint: disable=unused-import
from tests.statespace.utilities.shared_fixtures import ( # pylint: disable=unused-import
rng,
)
from pymc_experimental.tests.statespace.utilities.test_helpers import (
from tests.statespace.utilities.test_helpers import (
delete_rvs_from_model,
fast_eval,
load_nile_test_data,
@@ -13,10 +13,10 @@
UnivariateFilter,
)
from pymc_experimental.statespace.filters.kalman_filter import BaseFilter
from pymc_experimental.tests.statespace.utilities.shared_fixtures import ( # pylint: disable=unused-import
from tests.statespace.utilities.shared_fixtures import ( # pylint: disable=unused-import
rng,
)
from pymc_experimental.tests.statespace.utilities.test_helpers import (
from tests.statespace.utilities.test_helpers import (
get_expected_shape,
get_sm_state_from_output_name,
initialize_filter,
@@ -6,11 +6,8 @@
from numpy.testing import assert_allclose

from pymc_experimental.statespace.core.representation import PytensorRepresentation
from pymc_experimental.tests.statespace.utilities.shared_fixtures import TEST_SEED
from pymc_experimental.tests.statespace.utilities.test_helpers import (
fast_eval,
make_test_inputs,
)
from tests.statespace.utilities.shared_fixtures import TEST_SEED
from tests.statespace.utilities.test_helpers import fast_eval, make_test_inputs

floatX = pytensor.config.floatX
atol = 1e-12 if floatX == "float64" else 1e-6
@@ -13,10 +13,10 @@
MATRIX_NAMES,
SMOOTHER_OUTPUT_NAMES,
)
from pymc_experimental.tests.statespace.utilities.shared_fixtures import ( # pylint: disable=unused-import
from tests.statespace.utilities.shared_fixtures import ( # pylint: disable=unused-import
rng,
)
from pymc_experimental.tests.statespace.utilities.test_helpers import (
from tests.statespace.utilities.test_helpers import (
fast_eval,
load_nile_test_data,
make_test_inputs,