diff --git a/.gitignore b/.gitignore index f4a12cfd..c25d86fe 100644 --- a/.gitignore +++ b/.gitignore @@ -88,5 +88,6 @@ ENV/ # Rope project settings .ropeproject -# other +# pixi environments .pixi +*.egg-info diff --git a/README.md b/README.md index 73024374..e837d580 100644 --- a/README.md +++ b/README.md @@ -541,16 +541,17 @@ See [example](example/) for more information or check the output of `unidep -h` ```bash usage: unidep [-h] - {merge,install,install-all,conda-lock,pip-compile,pip,conda,version} ... + {merge,install,install-all,conda-lock,pixi-lock,pip-compile,pip,conda,version} ... Unified Conda and Pip requirements management. positional arguments: - {merge,install,install-all,conda-lock,pip-compile,pip,conda,version} + {merge,install,install-all,conda-lock,pixi-lock,pip-compile,pip,conda,version} Subcommands merge Combine multiple (or a single) `requirements.yaml` or `pyproject.toml` files into a single Conda installable - `environment.yaml` file. + `environment.yaml` file or Pixi installable + `pixi.toml` file. install Automatically install all dependencies from one or more `requirements.yaml` or `pyproject.toml` files. This command first installs dependencies with Conda, @@ -569,6 +570,11 @@ positional arguments: lock.yml` files for each `requirements.yaml` or `pyproject.toml` file consistent with the global lock file. + pixi-lock Generate a global `pixi.lock` file for a collection of + `requirements.yaml` or `pyproject.toml` files. + Additionally, create individual `pixi.lock` files for + each `requirements.yaml` or `pyproject.toml` file + consistent with the global lock file. pip-compile Generate a fully pinned `requirements.txt` file from one or more `requirements.yaml` or `pyproject.toml` files using `pip-compile` from `pip-tools`. This @@ -603,23 +609,24 @@ See `unidep merge -h` for more information: ```bash usage: unidep merge [-h] [-o OUTPUT] [-n NAME] [--stdout] - [--selector {sel,comment}] [-d DIRECTORY] [--depth DEPTH] - [-v] + [--selector {sel,comment}] [--pixi] [-d DIRECTORY] + [--depth DEPTH] [-v] [-p {linux-64,linux-aarch64,linux-ppc64le,osx-64,osx-arm64,win-64}] [--skip-dependency SKIP_DEPENDENCY] [--ignore-pin IGNORE_PIN] [--overwrite-pin OVERWRITE_PIN] Combine multiple (or a single) `requirements.yaml` or `pyproject.toml` files -into a single Conda installable `environment.yaml` file. Example usage: -`unidep merge --directory . --depth 1 --output environment.yaml` to search for -`requirements.yaml` or `pyproject.toml` files in the current directory and its -subdirectories and create `environment.yaml`. These are the defaults, so you -can also just run `unidep merge`. +into a single Conda installable `environment.yaml` file or Pixi installable +`pixi.toml` file. Example usage: `unidep merge --directory . --depth 1 +--output environment.yaml` to search for `requirements.yaml` or +`pyproject.toml` files in the current directory and its subdirectories and +create `environment.yaml`. These are the defaults, so you can also just run +`unidep merge`. 
options: -h, --help show this help message and exit -o, --output OUTPUT Output file for the conda environment, by default - `environment.yaml` + `environment.yaml` or `pixi.toml` if `--pixi` is used -n, --name NAME Name of the conda environment, by default `myenv` --stdout Output to stdout instead of a file --selector {sel,comment} @@ -627,6 +634,8 @@ options: `sel` then `- numpy # [linux]` becomes `sel(linux): numpy`, if `comment` then it remains `- numpy # [linux]`, by default `sel` + --pixi Generate a `pixi.toml` file instead of + `environment.yaml` -d, --directory DIRECTORY Base directory to scan for `requirements.yaml` or `pyproject.toml` file(s), by default `.` diff --git a/example/environment.yaml b/example/environment.yaml index 9b417266..c314d7e7 100644 --- a/example/environment.yaml +++ b/example/environment.yaml @@ -18,7 +18,8 @@ dependencies: - pytest - pytest-cov - pip: - - unidep + - unidep; sys_platform == 'linux' and platform_machine == 'x86_64' + - unidep; sys_platform == 'darwin' - markdown-code-runner - numthreads - yaml2bib; sys_platform == 'linux' and platform_machine == 'x86_64' diff --git a/example/hatch_project/requirements.yaml b/example/hatch_project/requirements.yaml index 60d99708..b0e988df 100644 --- a/example/hatch_project/requirements.yaml +++ b/example/hatch_project/requirements.yaml @@ -3,7 +3,7 @@ channels: - conda-forge dependencies: - conda: adaptive-scheduler # [linux64] - - pip: unidep + - pip: unidep # [linux64] - numpy >=1.21 - hpc05 # [linux64] - pandas >=1,<3 diff --git a/pixi_create_sub_lock_file.py b/pixi_create_sub_lock_file.py new file mode 100644 index 00000000..b4d0a4be --- /dev/null +++ b/pixi_create_sub_lock_file.py @@ -0,0 +1,154 @@ +"""Create a subset of a lock file with a subset of packages.""" + +from __future__ import annotations + +import asyncio +import json +import os +import tempfile +from collections import defaultdict + +from rattler import ( + Environment, + GenericVirtualPackage, + LockFile, + Platform, + Version, + solve_with_sparse_repodata, +) +from rattler.channel import Channel, ChannelConfig +from rattler.match_spec import MatchSpec +from rattler.repo_data import SparseRepoData + + +def create_repodata_from_pixi_lock(lock_file_path: str) -> dict[str, dict]: + """Create repodata from a pixi lock file.""" + lock_file = LockFile.from_path(lock_file_path) + env = lock_file.default_environment() + repodata = {} + for platform in env.platforms(): + subdir = str(platform) + packages = env.conda_repodata_records_for_platform(platform) + if not packages: + continue + + repodata[subdir] = { + "info": { + "subdir": subdir, + "base_url": f"https://conda.anaconda.org/conda-forge/{subdir}", + }, + "packages": { + f"{pkg.name.normalized}-{pkg.version}-{pkg.build}.conda": { + "build": pkg.build, + "build_number": pkg.build_number, + "depends": pkg.depends, + "constrains": pkg.constrains, + "license": pkg.license, + "license_family": pkg.license_family, + "md5": pkg.md5.hex() if pkg.md5 else None, + "name": pkg.name.normalized, + "sha256": pkg.sha256.hex() if pkg.sha256 else None, + "size": pkg.size, + "subdir": pkg.subdir, + "timestamp": int(pkg.timestamp.timestamp() * 1000) + if pkg.timestamp + else None, + "version": str(pkg.version), + } + for pkg in packages + }, + "repodata_version": 2, + } + return repodata + + +def _version_requirement_to_lowest_version(version: str | None) -> str | None: + if version is None: + return None + if version.startswith(">="): + version = version[2:] + if version.startswith("=="): + version = 
version[2:] + version = version.split(",")[0] + return version # noqa: RET504 + + +def all_virtual_packages(env: Environment) -> dict[Platform, set[str]]: + """Get all virtual packages from an environment.""" + virtual_packages = defaultdict(set) + for platform, packages in env.packages_by_platform().items(): + for package in packages: + if not package.is_conda: + continue + repo_record = package.as_conda() + for dep in repo_record.depends: + spec = MatchSpec(dep) + if spec.name.normalized.startswith("__"): + version = _version_requirement_to_lowest_version(spec.version) + virtual_package = GenericVirtualPackage( + spec.name, + version=Version(version or "0"), + build_string=spec.build or "*", + ) + virtual_packages[platform].add(virtual_package) + return virtual_packages + + +async def create_subset_lock_file( + original_lock_file_path: str, + required_packages: list[str], + platform: Platform, +) -> LockFile: + """Create a new lock file with a subset of packages from original lock file.""" + original_lock_file = LockFile.from_path(original_lock_file_path) + env = original_lock_file.default_environment() + conda_records = env.conda_repodata_records_for_platform(platform) + if conda_records is None: + msg = f"No conda records found for platform {platform}" + raise ValueError(msg) + repodata = create_repodata_from_pixi_lock(original_lock_file_path) + platform_repodata = repodata.get(str(platform)) + if platform_repodata is None: + msg = f"No repodata found for platform {platform}" + raise ValueError(msg) + + with tempfile.NamedTemporaryFile( + mode="w", + delete=False, + suffix=".json", + ) as temp_file: + json.dump(platform_repodata, temp_file) + temp_file_path = temp_file.name + print(f"Temporary repodata file: {temp_file_path}") + dummy_channel = Channel("dummy", ChannelConfig()) + sparse_repo_data = SparseRepoData(dummy_channel, str(platform), temp_file_path) + specs = [MatchSpec(pkg) for pkg in required_packages] + virtual_packages = all_virtual_packages(env)[platform] + + solved_records = await solve_with_sparse_repodata( + specs=specs, + sparse_repodata=[sparse_repo_data], + locked_packages=conda_records, + virtual_packages=virtual_packages, + ) + new_env = Environment("new_env", {platform: solved_records}) + new_lock_file = LockFile({"new_env": new_env}) + os.unlink(temp_file_path) # noqa: PTH108 + return new_lock_file + + +async def main() -> None: + """Example usage of create_subset_lock_file.""" + original_lock_file_path = "pixi.lock" + required_packages = ["tornado", "scipy", "ipykernel", "adaptive"] + platform = Platform("linux-64") + new_lock_file = await create_subset_lock_file( + original_lock_file_path, + required_packages, + platform, + ) + new_lock_file.to_path("new_lock_file.lock") + + +if __name__ == "__main__": + asyncio.run(main()) diff --git a/tests/simple_monorepo/common-requirements.yaml b/tests/simple_monorepo/common-requirements.yaml index 35bc8687..aec7b789 100644 --- a/tests/simple_monorepo/common-requirements.yaml +++ b/tests/simple_monorepo/common-requirements.yaml @@ -5,3 +5,6 @@ channels: - conda-forge dependencies: - conda: python_abi +platforms: + - osx-64 + - osx-arm64 diff --git a/tests/simple_monorepo/pixi.lock b/tests/simple_monorepo/pixi.lock new file mode 100644 index 00000000..6b535804 --- /dev/null +++ b/tests/simple_monorepo/pixi.lock @@ -0,0 +1,97 @@ +# This file is created and managed by `unidep` 0.67.3. 
+# For details see https://github.com/basnijholt/unidep +# File generated with: `unidep pixi-lock` +# +# This environment can be installed with +# `pixi install` +# This file is a `pixi.lock` file generated via `unidep`. +# For details see https://pixi.sh/ + +version: 6 +environments: + default: + channels: + - url: https://conda.anaconda.org/conda-forge/ + packages: + osx-64: + - conda: https://conda.anaconda.org/conda-forge/osx-64/bzip2-1.0.8-hfdf4475_7.conda + - conda: https://conda.anaconda.org/conda-forge/osx-64/python_abi-3.13-5_cp313t.conda + osx-arm64: + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/bzip2-1.0.8-h99b78c6_7.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/python_abi-3.13-5_cp313t.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/tzdata-2025a-h78e105d_0.conda + project1: + channels: + - url: https://conda.anaconda.org/conda-forge/ + packages: + osx-64: + - conda: https://conda.anaconda.org/conda-forge/osx-64/bzip2-1.0.8-hfdf4475_7.conda + osx-arm64: + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/bzip2-1.0.8-h99b78c6_7.conda + project2: + channels: + - url: https://conda.anaconda.org/conda-forge/ + packages: + osx-arm64: + - conda: https://conda.anaconda.org/conda-forge/noarch/tzdata-2025a-h78e105d_0.conda + simple-monorepo: + channels: + - url: https://conda.anaconda.org/conda-forge/ + packages: + osx-64: + - conda: https://conda.anaconda.org/conda-forge/osx-64/python_abi-3.13-5_cp313t.conda + osx-arm64: + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/python_abi-3.13-5_cp313t.conda +packages: +- conda: https://conda.anaconda.org/conda-forge/osx-64/bzip2-1.0.8-hfdf4475_7.conda + sha256: cad153608b81fb24fc8c509357daa9ae4e49dfc535b2cb49b91e23dbd68fc3c5 + md5: 7ed4301d437b59045be7e051a0308211 + depends: + - __osx >=10.13 + arch: x86_64 + platform: osx + license: bzip2-1.0.6 + license_family: BSD + size: 134188 + timestamp: 1720974491916 +- conda: https://conda.anaconda.org/conda-forge/osx-arm64/bzip2-1.0.8-h99b78c6_7.conda + sha256: adfa71f158cbd872a36394c56c3568e6034aa55c623634b37a4836bd036e6b91 + md5: fc6948412dbbbe9a4c9ddbbcfe0a79ab + depends: + - __osx >=11.0 + arch: arm64 + platform: osx + license: bzip2-1.0.6 + license_family: BSD + size: 122909 + timestamp: 1720974522888 +- conda: https://conda.anaconda.org/conda-forge/osx-64/python_abi-3.13-5_cp313t.conda + build_number: 5 + sha256: a96553de64be6441400e88c2c6ad7123d91cbcea4898b5966a653163f30d9f55 + md5: 32ba8fc57ccb0b48dd6006974f65c525 + constrains: + - python 3.13.* *_cp313t + arch: x86_64 + platform: osx + license: BSD-3-Clause + license_family: BSD + size: 6300 + timestamp: 1723823108577 +- conda: https://conda.anaconda.org/conda-forge/osx-arm64/python_abi-3.13-5_cp313t.conda + build_number: 5 + sha256: 2165466ff175e1890b66d079d64449a1b6dd9873fb0f5e977839ccc4639b813b + md5: 24a9a05eba65586da53ad7b56a06dc02 + constrains: + - python 3.13.* *_cp313t + arch: arm64 + platform: osx + license: BSD-3-Clause + license_family: BSD + size: 6317 + timestamp: 1723823118660 +- conda: https://conda.anaconda.org/conda-forge/noarch/tzdata-2025a-h78e105d_0.conda + sha256: c4b1ae8a2931fe9b274c44af29c5475a85b37693999f8c792dad0f8c6734b1de + md5: dbcace4706afdfb7eb891f7b37d07c04 + license: LicenseRef-Public-Domain + size: 122921 + timestamp: 1737119101255 diff --git a/tests/simple_monorepo/pixi.toml b/tests/simple_monorepo/pixi.toml new file mode 100644 index 00000000..7b3d2dee --- /dev/null +++ b/tests/simple_monorepo/pixi.toml @@ -0,0 +1,40 @@ +[project] +name = 
"myenv" +platforms = [ + "osx-64", + "osx-arm64", +] +channels = [ + "conda-forge", +] + +[dependencies] + +[pypi-dependencies] + +[target] + +[feature.project1.dependencies] +bzip2 = "*" + +[feature.project2.target.osx-arm64.dependencies] +tzdata = "*" + +[feature.simple_monorepo.dependencies] +python_abi = "*" + +[environments] +default = [ + "project1", + "project2", + "simple_monorepo", +] +project1 = [ + "project1", +] +project2 = [ + "project2", +] +simple-monorepo = [ + "simple_monorepo", +] diff --git a/tests/simple_monorepo/project1/pixi.lock b/tests/simple_monorepo/project1/pixi.lock new file mode 100644 index 00000000..b5b303ec --- /dev/null +++ b/tests/simple_monorepo/project1/pixi.lock @@ -0,0 +1,33 @@ +version: 6 +environments: + default: + channels: + - url: https://conda.anaconda.org/conda-forge/ + packages: + osx-64: + - conda: https://conda.anaconda.org/conda-forge/osx-64/bzip2-1.0.8-hfdf4475_7.conda + osx-arm64: + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/bzip2-1.0.8-h99b78c6_7.conda +packages: +- conda: https://conda.anaconda.org/conda-forge/osx-64/bzip2-1.0.8-hfdf4475_7.conda + sha256: cad153608b81fb24fc8c509357daa9ae4e49dfc535b2cb49b91e23dbd68fc3c5 + md5: 7ed4301d437b59045be7e051a0308211 + depends: + - __osx >=10.13 + arch: x86_64 + platform: osx + license: bzip2-1.0.6 + license_family: BSD + size: 134188 + timestamp: 1720974491916 +- conda: https://conda.anaconda.org/conda-forge/osx-arm64/bzip2-1.0.8-h99b78c6_7.conda + sha256: adfa71f158cbd872a36394c56c3568e6034aa55c623634b37a4836bd036e6b91 + md5: fc6948412dbbbe9a4c9ddbbcfe0a79ab + depends: + - __osx >=11.0 + arch: arm64 + platform: osx + license: bzip2-1.0.6 + license_family: BSD + size: 122909 + timestamp: 1720974522888 diff --git a/tests/simple_monorepo/project2/pixi.lock b/tests/simple_monorepo/project2/pixi.lock new file mode 100644 index 00000000..0576373b --- /dev/null +++ b/tests/simple_monorepo/project2/pixi.lock @@ -0,0 +1,15 @@ +version: 6 +environments: + default: + channels: + - url: https://conda.anaconda.org/conda-forge/ + packages: + osx-arm64: + - conda: https://conda.anaconda.org/conda-forge/noarch/tzdata-2025a-h78e105d_0.conda +packages: +- conda: https://conda.anaconda.org/conda-forge/noarch/tzdata-2025a-h78e105d_0.conda + sha256: c4b1ae8a2931fe9b274c44af29c5475a85b37693999f8c792dad0f8c6734b1de + md5: dbcace4706afdfb7eb891f7b37d07c04 + license: LicenseRef-Public-Domain + size: 122921 + timestamp: 1737119101255 diff --git a/tests/test_pixi.py b/tests/test_pixi.py new file mode 100644 index 00000000..acf12ff6 --- /dev/null +++ b/tests/test_pixi.py @@ -0,0 +1,80 @@ +"""unidep tests.""" + +from __future__ import annotations + +import textwrap +from typing import TYPE_CHECKING + +import pytest + +from unidep import ( + parse_requirements, + resolve_conflicts, +) +from unidep._dependencies_parsing import yaml_to_toml +from unidep._pixi import generate_pixi_toml + +if TYPE_CHECKING: + import sys + from pathlib import Path + + if sys.version_info >= (3, 8): + from typing import Literal + else: # pragma: no cover + from typing_extensions import Literal + + +def maybe_as_toml(toml_or_yaml: Literal["toml", "yaml"], p: Path) -> Path: + if toml_or_yaml == "toml": + toml = yaml_to_toml(p) + p.unlink() + p = p.with_name("pyproject.toml") + p.write_text(toml) + return p + + +@pytest.mark.parametrize("toml_or_yaml", ["toml", "yaml"]) +def test_filter_python_dependencies_with_platforms( + toml_or_yaml: Literal["toml", "yaml"], + tmp_path: Path, +) -> None: + p = tmp_path / "requirements.yaml" + 
p.write_text( + textwrap.dedent( + """\ + channels: + - conda-forge + dependencies: + - foo # [unix] + platforms: + - linux-64 + """, + ), + ) + p = maybe_as_toml(toml_or_yaml, p) + requirements = parse_requirements(p, verbose=False) + resolved = resolve_conflicts(requirements.requirements, ["linux-64"]) + output_file = tmp_path / "pixi.toml" + generate_pixi_toml( + resolved, + project_name=None, + channels=requirements.channels, + platforms=requirements.platforms, + output_file=output_file, + verbose=False, + ) + assert output_file.read_text() == textwrap.dedent( + """\ + [project] + name = "unidep" + platforms = [ + "linux-64", + ] + channels = [ + "conda-forge", + ] + + [target.linux-64.dependencies] + foo = "*" + """, + ) diff --git a/tests/test_pixi_lock.py b/tests/test_pixi_lock.py new file mode 100644 index 00000000..f819c85f --- /dev/null +++ b/tests/test_pixi_lock.py @@ -0,0 +1,70 @@ +"""unidep pixi-lock tests.""" + +from __future__ import annotations + +import shutil +from pathlib import Path +from unittest.mock import patch + +from ruamel.yaml import YAML + +from unidep._pixi_lock import pixi_lock_command + + +def test_pixi_lock_command(tmp_path: Path) -> None: + folder = tmp_path / "simple_monorepo" + shutil.copytree(Path(__file__).parent / "simple_monorepo", folder) + with patch("unidep._pixi_lock._run_pixi_lock", return_value=None): + pixi_lock_command( + depth=1, + directory=folder, + files=None, + platforms=["osx-64", "osx-arm64"], + verbose=True, + only_global=False, + ignore_pins=[], + overwrite_pins=[], + skip_dependencies=[], + extra_flags=["--", "--micromamba"], + ) + with YAML(typ="safe") as yaml: + with (folder / "project1" / "pixi.lock").open() as f: + lock1 = yaml.load(f) + with (folder / "project2" / "pixi.lock").open() as f: + lock2 = yaml.load(f) + assert lock1["environments"]["default"]["packages"] == { + "osx-64": [ + { + "conda": "https://conda.anaconda.org/conda-forge/osx-64/bzip2-1.0.8-hfdf4475_7.conda", + }, + { + "conda": "https://conda.anaconda.org/conda-forge/osx-64/python_abi-3.13-5_cp313t.conda", + }, + ], + "osx-arm64": [ + { + "conda": "https://conda.anaconda.org/conda-forge/osx-arm64/bzip2-1.0.8-h99b78c6_7.conda", + }, + { + "conda": "https://conda.anaconda.org/conda-forge/noarch/tzdata-2025a-h78e105d_0.conda", + }, + { + "conda": "https://conda.anaconda.org/conda-forge/osx-arm64/python_abi-3.13-5_cp313t.conda", + }, + ], + } + assert lock2["environments"]["default"]["packages"] == { + "osx-64": [ + { + "conda": "https://conda.anaconda.org/conda-forge/osx-64/python_abi-3.13-5_cp313t.conda", + }, + ], + "osx-arm64": [ + { + "conda": "https://conda.anaconda.org/conda-forge/noarch/tzdata-2025a-h78e105d_0.conda", + }, + { + "conda": "https://conda.anaconda.org/conda-forge/osx-arm64/python_abi-3.13-5_cp313t.conda", + }, + ], + } diff --git a/tests/test_project_dependency_handling.py b/tests/test_project_dependency_handling.py index 574b8eab..0f65b708 100644 --- a/tests/test_project_dependency_handling.py +++ b/tests/test_project_dependency_handling.py @@ -103,30 +103,69 @@ def test_project_dependency_handling_in_pyproject_toml( expected = { "python-graphviz": [ - Spec(name="python-graphviz", which="conda", identifier="17e5d607"), + Spec( + name="python-graphviz", + which="conda", + identifier="17e5d607", + origin=(p,), + ), ], "graphviz": [ - Spec(name="graphviz", which="pip", identifier="17e5d607"), - Spec(name="graphviz", which="conda", identifier="5eb93b8c"), + Spec(name="graphviz", which="pip", identifier="17e5d607", origin=(p,)), + 
Spec(name="graphviz", which="conda", identifier="5eb93b8c", origin=(p,)), ], } if project_dependency_handling == "pip-only": expected.update( { - "requests": [Spec(name="requests", which="pip", identifier="08fd8713")], - "pandas": [Spec(name="pandas", which="pip", identifier="9e467fa1")], + "requests": [ + Spec( + name="requests", + which="pip", + identifier="08fd8713", + origin=(p,), + ), + ], + "pandas": [ + Spec( + name="pandas", + which="pip", + identifier="9e467fa1", + origin=(p,), + ), + ], }, ) elif project_dependency_handling == "same-name": expected.update( { "requests": [ - Spec(name="requests", which="conda", identifier="08fd8713"), - Spec(name="requests", which="pip", identifier="08fd8713"), + Spec( + name="requests", + which="conda", + identifier="08fd8713", + origin=(p,), + ), + Spec( + name="requests", + which="pip", + identifier="08fd8713", + origin=(p,), + ), ], "pandas": [ - Spec(name="pandas", which="conda", identifier="9e467fa1"), - Spec(name="pandas", which="pip", identifier="9e467fa1"), + Spec( + name="pandas", + which="conda", + identifier="9e467fa1", + origin=(p,), + ), + Spec( + name="pandas", + which="pip", + identifier="9e467fa1", + origin=(p,), + ), ], }, ) diff --git a/tests/test_unidep.py b/tests/test_unidep.py index 9a0b34b8..71b5c52d 100644 --- a/tests/test_unidep.py +++ b/tests/test_unidep.py @@ -130,6 +130,7 @@ def test_parse_requirements( selector="linux64", pin=">1", identifier="c292b98a", + origin=(p,), ), Spec( name="foo", @@ -137,18 +138,21 @@ def test_parse_requirements( selector="linux64", pin=">1", identifier="c292b98a", + origin=(p,), ), Spec( name="foo", which="conda", selector="unix", identifier="530d9eaa", + origin=(p,), ), Spec( name="foo", which="pip", selector="unix", identifier="530d9eaa", + origin=(p,), ), ], "bar": [ @@ -157,22 +161,26 @@ def test_parse_requirements( which="conda", pin=">1", identifier="08fd8713", + origin=(p,), ), Spec( name="bar", which="pip", pin=">1", identifier="08fd8713", + origin=(p,), ), Spec( name="bar", which="conda", identifier="9e467fa1", + origin=(p,), ), Spec( name="bar", which="pip", identifier="9e467fa1", + origin=(p,), ), ], } @@ -421,12 +429,14 @@ def test_surrounding_comments( which="conda", selector="osx", identifier="8b0c4c31", + origin=(p,), ), Spec( name="yolo", which="pip", selector="osx", identifier="8b0c4c31", + origin=(p,), ), ], "foo": [ @@ -435,12 +445,14 @@ def test_surrounding_comments( which="conda", selector="linux", identifier="ecd4baa6", + origin=(p,), ), Spec( name="foo", which="pip", selector="linux", identifier="ecd4baa6", + origin=(p,), ), ], "bar": [ @@ -449,12 +461,14 @@ def test_surrounding_comments( which="conda", selector="win", identifier="8528de75", + origin=(p,), ), Spec( name="bar", which="pip", selector="win", identifier="8528de75", + origin=(p,), ), ], "baz": [ @@ -462,14 +476,21 @@ def test_surrounding_comments( name="baz", which="conda", identifier="9e467fa1", + origin=(p,), + ), + Spec( + name="baz", + which="pip", + identifier="9e467fa1", + origin=(p,), ), - Spec(name="baz", which="pip", identifier="9e467fa1"), ], "pip-package": [ Spec( name="pip-package", which="pip", identifier="5813b64a", + origin=(p,), ), ], "pip-package2": [ @@ -478,6 +499,7 @@ def test_surrounding_comments( which="pip", selector="osx", identifier="1c0fa4c4", + origin=(p,), ), ], } @@ -513,6 +535,7 @@ def test_filter_pip_and_conda( which="conda", selector="linux64", identifier="c292b98a", + origin=(p,), ), ], "package2": [ @@ -521,6 +544,7 @@ def test_filter_pip_and_conda( which="conda", 
selector="osx64", identifier="b2ac468f", + origin=(p,), ), ], "package3": [ @@ -528,6 +552,7 @@ def test_filter_pip_and_conda( name="package3", which="pip", identifier="08fd8713", + origin=(p,), ), ], "package4": [ @@ -536,6 +561,7 @@ def test_filter_pip_and_conda( which="pip", selector="unix", identifier="1d5d7757", + origin=(p,), ), ], "common_package": [ @@ -544,12 +570,14 @@ def test_filter_pip_and_conda( which="conda", selector="unix", identifier="f78244dc", + origin=(p,), ), Spec( name="common_package", which="pip", selector="unix", identifier="f78244dc", + origin=(p,), ), ], "shared_package": [ @@ -558,12 +586,14 @@ def test_filter_pip_and_conda( which="conda", selector="linux64", identifier="1599d575", + origin=(p,), ), Spec( name="shared_package", which="pip", selector="win64", identifier="46630b59", + origin=(p,), ), ], } @@ -580,6 +610,7 @@ def test_filter_pip_and_conda( which="conda", selector="linux64", identifier="c292b98a", + origin=(p,), ), }, }, @@ -590,6 +621,7 @@ def test_filter_pip_and_conda( which="conda", selector="osx64", identifier="b2ac468f", + origin=(p,), ), }, }, @@ -599,6 +631,7 @@ def test_filter_pip_and_conda( name="package3", which="pip", identifier="08fd8713", + origin=(p,), ), }, }, @@ -609,6 +642,7 @@ def test_filter_pip_and_conda( which="pip", selector="unix", identifier="1d5d7757", + origin=(p,), ), }, "linux-aarch64": { @@ -617,6 +651,7 @@ def test_filter_pip_and_conda( which="pip", selector="unix", identifier="1d5d7757", + origin=(p,), ), }, "linux-ppc64le": { @@ -625,6 +660,7 @@ def test_filter_pip_and_conda( which="pip", selector="unix", identifier="1d5d7757", + origin=(p,), ), }, "osx-64": { @@ -633,6 +669,7 @@ def test_filter_pip_and_conda( which="pip", selector="unix", identifier="1d5d7757", + origin=(p,), ), }, "osx-arm64": { @@ -641,6 +678,7 @@ def test_filter_pip_and_conda( which="pip", selector="unix", identifier="1d5d7757", + origin=(p,), ), }, }, @@ -651,12 +689,14 @@ def test_filter_pip_and_conda( which="conda", selector="unix", identifier="f78244dc", + origin=(p,), ), "pip": Spec( name="common_package", which="pip", selector="unix", identifier="f78244dc", + origin=(p,), ), }, "linux-aarch64": { @@ -665,12 +705,14 @@ def test_filter_pip_and_conda( which="conda", selector="unix", identifier="f78244dc", + origin=(p,), ), "pip": Spec( name="common_package", which="pip", selector="unix", identifier="f78244dc", + origin=(p,), ), }, "linux-ppc64le": { @@ -679,12 +721,14 @@ def test_filter_pip_and_conda( which="conda", selector="unix", identifier="f78244dc", + origin=(p,), ), "pip": Spec( name="common_package", which="pip", selector="unix", identifier="f78244dc", + origin=(p,), ), }, "osx-64": { @@ -693,12 +737,14 @@ def test_filter_pip_and_conda( which="conda", selector="unix", identifier="f78244dc", + origin=(p,), ), "pip": Spec( name="common_package", which="pip", selector="unix", identifier="f78244dc", + origin=(p,), ), }, "osx-arm64": { @@ -707,12 +753,14 @@ def test_filter_pip_and_conda( which="conda", selector="unix", identifier="f78244dc", + origin=(p,), ), "pip": Spec( name="common_package", which="pip", selector="unix", identifier="f78244dc", + origin=(p,), ), }, }, @@ -723,6 +771,7 @@ def test_filter_pip_and_conda( which="conda", selector="linux64", identifier="1599d575", + origin=(p,), ), }, "win-64": { @@ -731,6 +780,7 @@ def test_filter_pip_and_conda( which="pip", selector="win64", identifier="46630b59", + origin=(p,), ), }, }, @@ -796,6 +846,7 @@ def test_duplicates_with_version( selector="linux64", pin=">1", identifier="c292b98a", 
+ origin=(p,), ), Spec( name="foo", @@ -803,18 +854,21 @@ def test_duplicates_with_version( selector="linux64", pin=">1", identifier="c292b98a", + origin=(p,), ), Spec( name="foo", which="conda", selector="linux64", identifier="dd6a8aaf", + origin=(p,), ), Spec( name="foo", which="pip", selector="linux64", identifier="dd6a8aaf", + origin=(p,), ), ], "bar": [ @@ -822,11 +876,13 @@ def test_duplicates_with_version( name="bar", which="conda", identifier="08fd8713", + origin=(p,), ), Spec( name="bar", which="pip", identifier="08fd8713", + origin=(p,), ), ], } @@ -840,6 +896,7 @@ def test_duplicates_with_version( selector="linux64", pin=">1", identifier="c292b98a", + origin=(p,), ), "pip": Spec( name="foo", @@ -847,6 +904,7 @@ def test_duplicates_with_version( selector="linux64", pin=">1", identifier="c292b98a", + origin=(p,), ), }, }, @@ -856,11 +914,13 @@ def test_duplicates_with_version( name="bar", which="conda", identifier="08fd8713", + origin=(p,), ), "pip": Spec( name="bar", which="pip", identifier="08fd8713", + origin=(p,), ), }, }, @@ -905,6 +965,7 @@ def test_duplicates_different_platforms( selector="linux64", pin=">1", identifier="c292b98a", + origin=(p,), ), Spec( name="foo", @@ -912,6 +973,7 @@ def test_duplicates_different_platforms( selector="linux64", pin=">1", identifier="c292b98a", + origin=(p,), ), Spec( name="foo", @@ -919,6 +981,7 @@ def test_duplicates_different_platforms( selector="linux", pin="<=2", identifier="ecd4baa6", + origin=(p,), ), Spec( name="foo", @@ -926,6 +989,7 @@ def test_duplicates_different_platforms( selector="linux", pin="<=2", identifier="ecd4baa6", + origin=(p,), ), ], } @@ -938,12 +1002,14 @@ def test_duplicates_different_platforms( which="conda", pin=">1,<=2", identifier="c292b98a", + origin=(p,), ), "pip": Spec( name="foo", which="pip", pin=">1,<=2", identifier="c292b98a", + origin=(p,), ), }, "linux-aarch64": { @@ -953,6 +1019,7 @@ def test_duplicates_different_platforms( selector="linux", pin="<=2", identifier="ecd4baa6", + origin=(p,), ), "pip": Spec( name="foo", @@ -960,6 +1027,7 @@ def test_duplicates_different_platforms( selector="linux", pin="<=2", identifier="ecd4baa6", + origin=(p,), ), }, "linux-ppc64le": { @@ -969,6 +1037,7 @@ def test_duplicates_different_platforms( selector="linux", pin="<=2", identifier="ecd4baa6", + origin=(p,), ), "pip": Spec( name="foo", @@ -976,6 +1045,7 @@ def test_duplicates_different_platforms( selector="linux", pin="<=2", identifier="ecd4baa6", + origin=(p,), ), }, }, @@ -1032,6 +1102,7 @@ def test_expand_none_with_different_platforms( selector="linux64", pin=">1", identifier="c292b98a", + origin=(p,), ), Spec( name="foo", @@ -1039,18 +1110,21 @@ def test_expand_none_with_different_platforms( selector="linux64", pin=">1", identifier="c292b98a", + origin=(p,), ), Spec( name="foo", which="conda", pin="<3", identifier="5eb93b8c", + origin=(p,), ), Spec( name="foo", which="pip", pin="<3", identifier="5eb93b8c", + origin=(p,), ), ], } @@ -1063,12 +1137,14 @@ def test_expand_none_with_different_platforms( which="conda", pin=">1,<3", identifier="c292b98a", + origin=(p,), ), "pip": Spec( name="foo", which="pip", pin=">1,<3", identifier="c292b98a", + origin=(p,), ), }, "linux-aarch64": { @@ -1077,12 +1153,14 @@ def test_expand_none_with_different_platforms( which="conda", pin="<3", identifier="5eb93b8c", + origin=(p,), ), "pip": Spec( name="foo", which="pip", pin="<3", identifier="5eb93b8c", + origin=(p,), ), }, "linux-ppc64le": { @@ -1091,12 +1169,14 @@ def test_expand_none_with_different_platforms( which="conda", 
pin="<3", identifier="5eb93b8c", + origin=(p,), ), "pip": Spec( name="foo", which="pip", pin="<3", identifier="5eb93b8c", + origin=(p,), ), }, "osx-64": { @@ -1105,12 +1185,14 @@ def test_expand_none_with_different_platforms( which="conda", pin="<3", identifier="5eb93b8c", + origin=(p,), ), "pip": Spec( name="foo", which="pip", pin="<3", identifier="5eb93b8c", + origin=(p,), ), }, "osx-arm64": { @@ -1119,12 +1201,14 @@ def test_expand_none_with_different_platforms( which="conda", pin="<3", identifier="5eb93b8c", + origin=(p,), ), "pip": Spec( name="foo", which="pip", pin="<3", identifier="5eb93b8c", + origin=(p,), ), }, "win-64": { @@ -1133,12 +1217,14 @@ def test_expand_none_with_different_platforms( which="conda", pin="<3", identifier="5eb93b8c", + origin=(p,), ), "pip": Spec( name="foo", which="pip", pin="<3", identifier="5eb93b8c", + origin=(p,), ), }, }, @@ -1191,12 +1277,14 @@ def test_different_pins_on_conda_and_pip( which="conda", pin="<1", identifier="17e5d607", + origin=(p,), ), Spec( name="foo", which="pip", pin=">1", identifier="17e5d607", + origin=(p,), ), ], } @@ -1210,12 +1298,14 @@ def test_different_pins_on_conda_and_pip( which="conda", pin="<1", identifier="17e5d607", + origin=(p,), ), "pip": Spec( name="foo", which="pip", pin=">1", identifier="17e5d607", + origin=(p,), ), }, }, @@ -1620,6 +1710,7 @@ def test_pip_and_conda_different_name_on_linux64( which="conda", selector="linux64", identifier="c292b98a", + origin=(p,), ), ], "cuquantum": [ @@ -1628,6 +1719,7 @@ def test_pip_and_conda_different_name_on_linux64( which="pip", selector="linux64", identifier="c292b98a", + origin=(p,), ), ], } @@ -1641,6 +1733,7 @@ def test_pip_and_conda_different_name_on_linux64( which="conda", selector="linux64", identifier="c292b98a", + origin=(p,), ), }, }, @@ -1651,6 +1744,7 @@ def test_pip_and_conda_different_name_on_linux64( which="pip", selector="linux64", identifier="c292b98a", + origin=(p,), ), }, }, @@ -1687,11 +1781,13 @@ def test_parse_requirements_with_ignore_pin( name="foo", which="conda", identifier="17e5d607", + origin=(p,), ), Spec( name="foo", which="pip", identifier="17e5d607", + origin=(p,), ), ], } @@ -1725,11 +1821,13 @@ def test_parse_requirements_with_skip_dependency( name="baz", which="conda", identifier="08fd8713", + origin=(p,), ), Spec( name="baz", which="pip", identifier="08fd8713", + origin=(p,), ), ], } @@ -1757,6 +1855,7 @@ def test_pin_star_cuda(toml_or_yaml: Literal["toml", "yaml"], tmp_path: Path) -> selector="linux64", pin="* cuda*", identifier="c292b98a", + origin=(p,), ), Spec( name="qsimcirq", @@ -1764,6 +1863,7 @@ def test_pin_star_cuda(toml_or_yaml: Literal["toml", "yaml"], tmp_path: Path) -> selector="arm64", pin="* cpu*", identifier="489f33e0", + origin=(p,), ), ], } @@ -1797,12 +1897,14 @@ def test_parse_requirements_with_overwrite_pins( which="conda", pin="=1", identifier="17e5d607", + origin=(p,), ), Spec( name="foo", which="pip", pin="=1", identifier="17e5d607", + origin=(p,), ), ], "bar": [ @@ -1811,6 +1913,7 @@ def test_parse_requirements_with_overwrite_pins( which="conda", pin="* cpu*", identifier="5eb93b8c", + origin=(p,), ), ], } @@ -1845,12 +1948,14 @@ def test_duplicate_names_different_platforms( which="pip", selector="arm64", identifier="1b26c5b2", + origin=(p,), ), Spec( name="ray", which="pip", selector="linux64", identifier="dd6a8aaf", + origin=(p,), ), ], "ray-core": [ @@ -1859,6 +1964,7 @@ def test_duplicate_names_different_platforms( which="conda", selector="linux64", identifier="dd6a8aaf", + origin=(p,), ), ], } @@ -1991,6 +2097,7 
@@ def test_pip_with_pinning_special_case_wildcard( which="pip", pin="* cuda*", identifier="17e5d607", + origin=(p1,), ), }, }, @@ -2047,6 +2154,7 @@ def test_pip_with_pinning_special_case_git_repo( which="pip", pin="@ git+https://github.com/python-adaptive/adaptive.git@main", identifier="17e5d607", + origin=(p1,), ), }, }, @@ -2082,12 +2190,14 @@ def test_not_equal( which="conda", pin="!=1.0.0,<2", identifier="17e5d607", + origin=(p1,), ), "pip": Spec( name="adaptive", which="pip", pin="!=1.0.0,<2", identifier="17e5d607", + origin=(p1,), ), }, }, @@ -2114,8 +2224,13 @@ def test_dot_in_package_name( requirements = parse_requirements(p1, verbose=False) assert requirements.requirements == { "ruamel.yaml": [ - Spec(name="ruamel.yaml", which="conda", identifier="17e5d607"), - Spec(name="ruamel.yaml", which="pip", identifier="17e5d607"), + Spec( + name="ruamel.yaml", + which="conda", + identifier="17e5d607", + origin=(p1,), + ), + Spec(name="ruamel.yaml", which="pip", identifier="17e5d607", origin=(p1,)), ], } @@ -2249,6 +2364,7 @@ def test_pip_dep_with_extras( pin=None, identifier="17e5d607", selector=None, + origin=(p,), ), }, }, @@ -2260,6 +2376,7 @@ def test_pip_dep_with_extras( pin=None, identifier="17e5d607", selector=None, + origin=(p,), ), }, }, @@ -2464,3 +2581,130 @@ def test_optional_dependencies_with_version_specifier( ) assert resolved.keys() == {"adaptive"} assert resolved["adaptive"][None]["conda"].pin == "=0.13.2" + + +@pytest.mark.parametrize("toml_or_yaml", ["toml", "yaml"]) +def test_origin_in_spec( + tmp_path: Path, + toml_or_yaml: Literal["toml", "yaml"], +) -> None: + d1 = tmp_path / "dir1" + d1.mkdir() + f1 = d1 / "requirements.yaml" + f1.write_text("dependencies:\n - numpy\n - conda: mumps") + + d2 = tmp_path / "dir2" + d2.mkdir() + f2 = d2 / "requirements.yaml" + f2.write_text("dependencies:\n - pip: pandas\n - numpy") + f1 = maybe_as_toml(toml_or_yaml, f1) + f2 = maybe_as_toml(toml_or_yaml, f2) + + requirements = parse_requirements(f1, f2, verbose=False) + assert requirements.requirements == { + "numpy": [ + Spec( + name="numpy", + which="conda", + pin=None, + identifier="17e5d607", + selector=None, + origin=(f1,), + ), + Spec( + name="numpy", + which="pip", + pin=None, + identifier="17e5d607", + selector=None, + origin=(f1,), + ), + Spec( + name="numpy", + which="conda", + pin=None, + identifier="9e467fa1", + selector=None, + origin=(f2,), + ), + Spec( + name="numpy", + which="pip", + pin=None, + identifier="9e467fa1", + selector=None, + origin=(f2,), + ), + ], + "mumps": [ + Spec( + name="mumps", + which="conda", + pin=None, + identifier="5eb93b8c", + selector=None, + origin=(f1,), + ), + ], + "pandas": [ + Spec( + name="pandas", + which="pip", + pin=None, + identifier="08fd8713", + selector=None, + origin=(f2,), + ), + ], + } + + resolved = resolve_conflicts( + requirements.requirements, + requirements.platforms, + ) + assert resolved == { + "numpy": { + None: { + "conda": Spec( + name="numpy", + which="conda", + pin=None, + identifier="17e5d607", + selector=None, + origin=(f1, f2), + ), + "pip": Spec( + name="numpy", + which="pip", + pin=None, + identifier="17e5d607", + selector=None, + origin=(f1, f2), + ), + }, + }, + "mumps": { + None: { + "conda": Spec( + name="mumps", + which="conda", + pin=None, + identifier="5eb93b8c", + selector=None, + origin=(f1,), + ), + }, + }, + "pandas": { + None: { + "pip": Spec( + name="pandas", + which="pip", + pin=None, + identifier="08fd8713", + selector=None, + origin=(f2,), + ), + }, + }, + } diff --git a/unidep/_cli.py 
b/unidep/_cli.py index e820d6e1..49aee1a3 100755 --- a/unidep/_cli.py +++ b/unidep/_cli.py @@ -30,6 +30,8 @@ parse_local_dependencies, parse_requirements, ) +from unidep._pixi import generate_pixi_toml +from unidep._pixi_lock import pixi_lock_command from unidep._setuptools_integration import ( filter_python_dependencies, get_python_dependencies, @@ -65,7 +67,7 @@ def _get_help_string(self, action: argparse.Action) -> str | None: from argparse import HelpFormatter as _HelpFormatter # type: ignore[assignment] _DEP_FILES = "`requirements.yaml` or `pyproject.toml`" -CondaExecutable = Literal["conda", "mamba", "micromamba"] +CondaExecutable = Literal["conda", "mamba", "micromamba", "pixi"] def _add_common_args( # noqa: PLR0912, C901 @@ -275,7 +277,7 @@ def _add_extra_flags( ) -def _parse_args() -> argparse.Namespace: +def _parse_args() -> argparse.Namespace: # noqa: PLR0915 parser = argparse.ArgumentParser( description="Unified Conda and Pip requirements management.", formatter_class=_HelpFormatter, @@ -286,7 +288,8 @@ def _parse_args() -> argparse.Namespace: merge_help = ( f"Combine multiple (or a single) {_DEP_FILES}" " files into a" - " single Conda installable `environment.yaml` file." + " single Conda installable `environment.yaml` file" + " or Pixi installable `pixi.toml` file." ) merge_example = ( " Example usage: `unidep merge --directory . --depth 1 --output environment.yaml`" # noqa: E501 @@ -304,9 +307,9 @@ def _parse_args() -> argparse.Namespace: parser_merge.add_argument( "-o", "--output", - type=Path, - default="environment.yaml", - help="Output file for the conda environment, by default `environment.yaml`", + default=None, + help="Output file for the conda environment, by default `environment.yaml`" + " or `pixi.toml` if `--pixi` is used", ) parser_merge.add_argument( "-n", @@ -329,6 +332,11 @@ def _parse_args() -> argparse.Namespace: " `- numpy # [linux]` becomes `sel(linux): numpy`, if `comment` then" " it remains `- numpy # [linux]`, by default `sel`", ) + parser_merge.add_argument( + "--pixi", + action="store_true", + help="Generate a `pixi.toml` file instead of `environment.yaml`", + ) _add_common_args( parser_merge, { @@ -492,6 +500,56 @@ def _parse_args() -> argparse.Namespace: ) _add_extra_flags(parser_lock, "conda-lock lock", "conda-lock", "--micromamba") + # Subparser for the 'pixi-lock' command + pixi_lock_help = ( + "Generate a global `pixi.lock` file for a collection of" + f" {_DEP_FILES}" + " files. Additionally, create individual" + f" `pixi.lock` files for each {_DEP_FILES} file" + " consistent with the global lock file." + ) + pixi_lock_example = ( + " Example usage: `unidep pixi-lock --directory ./projects` to generate" + f" pixi lock files for all {_DEP_FILES}" + " files in the `./projects`" + " directory. Use `--only-global` to generate only the global lock file." + ) + + parser_pixi_lock = subparsers.add_parser( + "pixi-lock", + help=pixi_lock_help, + description=pixi_lock_help + pixi_lock_example, + formatter_class=_HelpFormatter, + ) + + parser_pixi_lock.add_argument( + "--only-global", + action="store_true", + help="Only generate the global lock file", + ) + parser_pixi_lock.add_argument( + "--lockfile", + type=Path, + default="pixi.lock", + help="Specify a path for the global lockfile (default: `pixi.lock`" + " in current directory). 
Path should be relative, e.g.," + " `--lockfile ./locks/pixi.lock`.", + ) + _add_common_args( + parser_pixi_lock, + { + "directory", + "file-alt", + "verbose", + "platform", + "depth", + "ignore-pin", + "skip-dependency", + "overwrite-pin", + }, + ) + _add_extra_flags(parser_pixi_lock, "pixi lock", "pixi-lock", "--platform") + # Subparser for the 'pip-compile' command pip_compile_help = ( "Generate a fully pinned `requirements.txt` file from one or more" @@ -933,7 +991,7 @@ def _pip_install_local( subprocess.run(pip_command, check=True) -def _install_command( # noqa: PLR0912, PLR0915 +def _install_command( # noqa: C901, PLR0912, PLR0915 *files: Path, conda_executable: CondaExecutable | None, conda_env_name: str | None, @@ -991,7 +1049,26 @@ def _install_command( # noqa: PLR0912, PLR0915 skip_pip = True skip_conda = True - if env_spec.conda and not skip_conda: + if skip_conda: + pass + elif conda_executable == "pixi": + print("🔮 Installing conda dependencies with `pixi`") + generate_pixi_toml( + resolved, + project_name=None, + channels=requirements.channels, + platforms=platforms, + output_file="pixi.toml", + verbose=verbose, + ) + # Install dependencies using pixi + if not dry_run: + subprocess.run(["pixi", "install"], check=True) # noqa: S607 + # Optionally, handle local packages + # if not skip_local: + # _install_local_packages_with_pixi(...) + return # Exit after handling pixi + elif env_spec.conda: assert conda_executable is not None channel_args = ["--override-channels"] if env_spec.channels else [] for channel in env_spec.channels: @@ -1268,18 +1345,23 @@ def _merge_command( directory: Path, files: list[Path] | None, name: str, - output: Path, + output: str | Path | None, stdout: bool, selector: Literal["sel", "comment"], platforms: list[Platform], ignore_pins: list[str], skip_dependencies: list[str], overwrite_pins: list[str], + pixi: bool, verbose: bool, ) -> None: # pragma: no cover # When using stdout, suppress verbose output verbose = verbose and not stdout + if output is None: + output = "environment.yaml" if not pixi else "pixi.toml" + output = Path(output) + if files: # ignores depth and directory! 
found_files = files else: @@ -1306,13 +1388,23 @@ def _merge_command( platforms, optional_dependencies=requirements.optional_dependencies, ) + output_file = None if stdout else output + if pixi: + generate_pixi_toml( + resolved, + project_name=name, + channels=requirements.channels, + platforms=requirements.platforms, + output_file=output_file, + verbose=verbose, + ) + return env_spec = create_conda_env_specification( resolved, requirements.channels, platforms, selector=selector, ) - output_file = None if stdout else output write_conda_environment_file(env_spec, output_file, name, verbose=verbose) if output_file: found_files_str = ", ".join(f"`{f}`" for f in found_files) @@ -1480,7 +1572,7 @@ def _pip_subcommand( return escape_unicode(separator).join(pip_dependencies) -def main() -> None: +def main() -> None: # noqa: PLR0912 """Main entry point for the command-line tool.""" args = _parse_args() @@ -1497,6 +1589,7 @@ def main() -> None: ignore_pins=args.ignore_pin, skip_dependencies=args.skip_dependency, overwrite_pins=args.overwrite_pin, + pixi=args.pixi, verbose=args.verbose, ) elif args.command == "pip": # pragma: no cover @@ -1596,6 +1689,19 @@ def main() -> None: extra_flags=args.extra_flags, lockfile=args.lockfile, ) + elif args.command == "pixi-lock": + pixi_lock_command( + depth=args.depth, + directory=args.directory, + files=args.file or None, + platforms=args.platform, + verbose=args.verbose, + only_global=args.only_global, + ignore_pins=args.ignore_pin, + skip_dependencies=args.skip_dependency, + overwrite_pins=args.overwrite_pin, + extra_flags=args.extra_flags, + ) elif args.command == "pip-compile": # pragma: no cover if args.platform and len(args.platform) > 1: print( diff --git a/unidep/_conda_env.py b/unidep/_conda_env.py index 6f08448b..4239f247 100644 --- a/unidep/_conda_env.py +++ b/unidep/_conda_env.py @@ -15,7 +15,7 @@ from unidep._conflicts import ( VersionConflictError, - _maybe_new_spec_with_combined_pinnings, + _maybe_new_spec_with_combined_pinnings_and_origins, ) from unidep.platform_definitions import ( PLATFORM_SELECTOR_MAP, @@ -111,7 +111,7 @@ def _resolve_multiple_platform_conflicts( specs, (first_platform, *_) = zip(*spec_to_platforms.items()) first, *others = specs try: - spec = _maybe_new_spec_with_combined_pinnings(specs) # type: ignore[arg-type] + spec = _maybe_new_spec_with_combined_pinnings_and_origins(specs) # type: ignore[arg-type] except VersionConflictError: # We have a conflict, select the first one. 
msg = ( diff --git a/unidep/_conda_lock.py b/unidep/_conda_lock.py index 07c52a84..ead2b1e3 100644 --- a/unidep/_conda_lock.py +++ b/unidep/_conda_lock.py @@ -116,8 +116,9 @@ def _conda_lock_global( selector="comment", platforms=platforms, ignore_pins=ignore_pins, - overwrite_pins=overwrite_pins, skip_dependencies=skip_dependencies, + overwrite_pins=overwrite_pins, + pixi=False, verbose=verbose, ) _run_conda_lock( diff --git a/unidep/_conflicts.py b/unidep/_conflicts.py index 3ad8c749..9d529e13 100644 --- a/unidep/_conflicts.py +++ b/unidep/_conflicts.py @@ -78,12 +78,18 @@ def _pop_unused_platforms_and_maybe_expand_none( platform_data.pop(_platform) -def _maybe_new_spec_with_combined_pinnings( +def _maybe_new_spec_with_combined_pinnings_and_origins( specs: list[Spec], ) -> Spec: pinned_specs = [m for m in specs if m.pin is not None] + combined_origin = tuple(sorted({p for s in specs for p in s.origin})) if len(pinned_specs) == 1: - return pinned_specs[0] + if len(combined_origin) == 1: + return pinned_specs[0] + # If there is only one pinned spec, but the origins are different, + # we need to create a new spec with the combined origin. + return pinned_specs[0]._replace(origin=combined_origin) + if len(pinned_specs) > 1: first = pinned_specs[0] pins = [m.pin for m in pinned_specs] @@ -93,9 +99,15 @@ def _maybe_new_spec_with_combined_pinnings( which=first.which, pin=pin, identifier=first.identifier, # should I create a new one? + origin=combined_origin, ) # Flatten the list + assert len(pinned_specs) == 0 + if len(combined_origin) > 1: + # If there are no pinned specs, but the origins are different, + # we need to create a new spec with the combined origin. + return specs[0]._replace(origin=combined_origin) return specs[0] @@ -106,7 +118,7 @@ def _combine_pinning_within_platform( for _platform, packages in data.items(): reduced_data[_platform] = {} for which, specs in packages.items(): - spec = _maybe_new_spec_with_combined_pinnings(specs) + spec = _maybe_new_spec_with_combined_pinnings_and_origins(specs) reduced_data[_platform][which] = spec return reduced_data diff --git a/unidep/_dependencies_parsing.py b/unidep/_dependencies_parsing.py index 2565b567..6642ceb9 100644 --- a/unidep/_dependencies_parsing.py +++ b/unidep/_dependencies_parsing.py @@ -109,6 +109,7 @@ def _parse_dependency( ignore_pins: list[str], overwrite_pins: dict[str, str | None], skip_dependencies: list[str], + origin: Path, ) -> list[Spec]: name, pin, selector = parse_package_str(dependency) if name in ignore_pins: @@ -127,10 +128,10 @@ def _parse_dependency( identifier_hash = _identifier(identifier, selector) if which == "both": return [ - Spec(name, "conda", pin, identifier_hash, selector), - Spec(name, "pip", pin, identifier_hash, selector), + Spec(name, "conda", pin, identifier_hash, selector, origin=(origin,)), + Spec(name, "pip", pin, identifier_hash, selector, origin=(origin,)), ] - return [Spec(name, which, pin, identifier_hash, selector)] + return [Spec(name, which, pin, identifier_hash, selector, origin=(origin,))] class ParsedRequirements(NamedTuple): @@ -269,11 +270,13 @@ def _update_data_structures( seen: set[PathWithExtras], # modified in place yaml: YAML, is_nested: bool, + origin: Path, verbose: bool = False, ) -> None: if verbose: print(f"📄 Parsing `{path_with_extras.path_with_extras}`") data = _load(path_with_extras.path, yaml) + data["_origin"] = origin datas.append(data) _move_local_optional_dependencies_to_local_dependencies( data=data, # modified in place @@ -307,6 +310,7 @@ def 
_update_data_structures( all_extras=all_extras, # modified in place seen=seen, # modified in place yaml=yaml, + origin=origin, verbose=verbose, ) @@ -383,6 +387,7 @@ def _add_local_dependencies( all_extras: list[list[str]], seen: set[PathWithExtras], yaml: YAML, + origin: Path, verbose: bool = False, ) -> None: try: @@ -414,6 +419,7 @@ def _add_local_dependencies( yaml=yaml, verbose=verbose, is_nested=True, + origin=origin, ) @@ -465,6 +471,7 @@ def parse_requirements( yaml=yaml, verbose=verbose, is_nested=False, + origin=path_with_extras.path, ) assert len(datas) == len(all_extras) @@ -489,6 +496,7 @@ def parse_requirements( ignore_pins, overwrite_pins_map, skip_dependencies, + origin=data["_origin"], ) for opt_name, opt_deps in data.get("optional_dependencies", {}).items(): if opt_name in _extras or "*" in _extras: @@ -500,6 +508,7 @@ def parse_requirements( overwrite_pins_map, skip_dependencies, is_optional=True, + origin=data["_origin"], ) return ParsedRequirements( @@ -536,6 +545,7 @@ def _add_dependencies( skip_dependencies: list[str], *, is_optional: bool = False, + origin: Path, ) -> int: for i, dep in enumerate(dependencies): identifier += 1 @@ -549,6 +559,7 @@ def _add_dependencies( ignore_pins, overwrite_pins_map, skip_dependencies, + origin, ) for spec in specs: _check_allowed_local_dependency(spec.name, is_optional) @@ -566,6 +577,7 @@ def _add_dependencies( ignore_pins, overwrite_pins_map, skip_dependencies, + origin, ) for spec in specs: _check_allowed_local_dependency(spec.name, is_optional) diff --git a/unidep/_pixi.py b/unidep/_pixi.py new file mode 100644 index 00000000..325ad620 --- /dev/null +++ b/unidep/_pixi.py @@ -0,0 +1,205 @@ +from __future__ import annotations + +import sys +from pathlib import Path +from typing import TYPE_CHECKING, Any + +from unidep._conda_env import _extract_conda_pip_dependencies +from unidep.utils import identify_current_platform + +if TYPE_CHECKING: + from unidep.platform_definitions import CondaPip, Platform, Spec + +try: # pragma: no cover + if sys.version_info >= (3, 11): + import tomllib + else: + import tomli as tomllib + HAS_TOML = True +except ImportError: # pragma: no cover + HAS_TOML = False + + +def generate_pixi_toml( + resolved_dependencies: dict[str, dict[Platform | None, dict[CondaPip, Spec]]], + project_name: str | None, + channels: list[str], + platforms: list[Platform], + output_file: str | Path | None = "pixi.toml", + *, + verbose: bool = False, +) -> None: + pixi_data = _initialize_pixi_data(channels, platforms, project_name) + _process_dependencies(pixi_data, resolved_dependencies) + _write_pixi_toml(pixi_data, output_file, verbose=verbose) + + +def _initialize_pixi_data( + channels: list[str], + platforms: list[Platform], + project_name: str | None, +) -> dict[str, dict[str, Any]]: + pixi_data: dict[str, dict[str, Any]] = {} + if not platforms: + platforms = [identify_current_platform()] + # Include extra configurations from pyproject.toml + sections = _parse_pixi_sections_from_pyproject() + pixi_data.update(sections) + + # Set 'project' section + pixi_data.setdefault("project", {}) + pixi_data["project"].setdefault("name", project_name or Path.cwd().name) + pixi_data["project"].setdefault("platforms", platforms) + pixi_data["project"].setdefault("channels", channels) + + # Initialize dependencies sections + pixi_data.setdefault("dependencies", {}) + pixi_data.setdefault("pypi-dependencies", {}) + pixi_data.setdefault("target", {}) # For platform-specific dependencies + + return pixi_data + + +def _format_pin(pin: str) 
-> Any: + parts = pin.split() + if len(parts) == 2: # noqa: PLR2004 + return {"version": parts[0], "build": parts[1]} + return pin + + +def _group_by_origin( + resolved_deps: dict[str, dict[Platform | None, dict[CondaPip, Spec]]], +) -> dict[Path, dict[str, dict[Platform | None, dict[CondaPip, Spec]]]]: + groups: dict[Path, dict[str, dict[Platform | None, dict[CondaPip, Spec]]]] = {} + for pkg_name, platform_map in resolved_deps.items(): + for plat, manager_map in platform_map.items(): + for manager, spec in manager_map.items(): + for origin in spec.origin: + # Normalize origin to a Path object + origin_path = Path(origin) + groups.setdefault(origin_path, {}) + groups[origin_path].setdefault(pkg_name, {}) + groups[origin_path][pkg_name].setdefault(plat, {}) + groups[origin_path][pkg_name][plat][manager] = spec + return groups + + +def _process_dependencies( # noqa: PLR0912 + pixi_data: dict[str, dict[str, Any]], + resolved_dependencies: dict[str, dict[Platform | None, dict[CondaPip, Spec]]], +) -> None: + """Process the resolved dependencies and update the pixi manifest data. + + This function first groups the resolved dependencies by origin (using + _group_by_origin) and then creates a separate feature (under the "feature" + key in pixi_data) for each origin. The feature name is derived using the + parent directory's stem of the origin file. + + After creating the per-origin features, if the manifest does not yet have an + "environments" table, we automatically add one with: + - a "default" environment that includes all features, and + - one environment per feature (with the feature name as the sole member). + """ + # --- Step 1: Group by origin and create per-origin features --- + origin_groups = _group_by_origin(resolved_dependencies) + features = pixi_data.setdefault("feature", {}) + + for origin_path, group_deps in origin_groups.items(): + # Derive a feature name from the parent folder of the origin file. + feature_name = origin_path.resolve().parent.stem + + # Initialize the feature entry. + feature_entry: dict[str, Any] = { + "dependencies": {}, + "pypi-dependencies": {}, + "target": {}, + } + + # Extract conda and pip dependencies from the grouped data. + group_conda, group_pip = _extract_conda_pip_dependencies(group_deps) + + # Process conda dependencies for this feature. + for pkg_name, platform_to_spec in group_conda.items(): + for _platform, spec in platform_to_spec.items(): + pin = spec.pin or "*" + pin = _format_pin(pin) + if _platform is None: + feature_entry["dependencies"][pkg_name] = pin + else: + target = feature_entry["target"].setdefault(_platform, {}) + deps = target.setdefault("dependencies", {}) + deps[pkg_name] = pin + + # Process pip dependencies for this feature. + for pkg_name, platform_to_spec in group_pip.items(): + for _platform, spec in platform_to_spec.items(): + pin = spec.pin or "*" + if _platform is None: + feature_entry["pypi-dependencies"][pkg_name] = pin + else: + target = feature_entry["target"].setdefault(_platform, {}) + deps = target.setdefault("pypi-dependencies", {}) + deps[pkg_name] = pin + + # Remove empty sections. + if not feature_entry["dependencies"]: + del feature_entry["dependencies"] + if not feature_entry["pypi-dependencies"]: + del feature_entry["pypi-dependencies"] + if not feature_entry["target"]: + del feature_entry["target"] + + # Save this feature entry. 
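+ # For example, with the `tests/simple_monorepo` fixtures the `project2` feature + # reduces to {"target": {"osx-arm64": {"dependencies": {"tzdata": "*"}}}}, i.e. the + # `[feature.project2.target.osx-arm64.dependencies]` table seen in `pixi.toml`.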
+ features[feature_name] = feature_entry + + # --- Step 2: Automatically add the environments table if not already defined --- + if "environments" not in pixi_data: + all_features = list(features.keys()) + pixi_data["environments"] = {} + # The "default" environment will include all features. + pixi_data["environments"]["default"] = all_features + # Also create one environment per feature. + for feat in all_features: + # Environment names cannot use _, only lowercase letters, digits, and - + name = feature_name_to_env_name(feat) + pixi_data["environments"][name] = [feat] + + +def feature_name_to_env_name(feature_name: str) -> str: + """Convert a feature name to a valid environment name.""" + return feature_name.replace("_", "-") + + +def _write_pixi_toml( + pixi_data: dict[str, dict[str, Any]], + output_file: str | Path | None, + *, + verbose: bool, +) -> None: + try: + import tomli_w + except ImportError: # pragma: no cover + msg = ( + "❌ `tomli_w` is required to write TOML files." + " Install it with `pip install tomli_w`." + ) + raise ImportError(msg) from None + + # Write pixi.toml file (or stream it to stdout) + if output_file is not None: + with open(output_file, "wb") as f: # noqa: PTH123 + tomli_w.dump(pixi_data, f) + if verbose: + print(f"✅ Generated pixi.toml at {output_file}") + else: + # to stdout + tomli_w.dump(pixi_data, sys.stdout.buffer) + + +def _parse_pixi_sections_from_pyproject() -> dict[str, Any]: + if not HAS_TOML: # pragma: no cover + return {} + pyproject_path = Path("pyproject.toml") + if not pyproject_path.exists(): + return {} + with pyproject_path.open("rb") as f: + pyproject_data = tomllib.load(f) + return pyproject_data.get("tool", {}).get("unidep", {}).get("pixi", {}) diff --git a/unidep/_pixi_lock.py b/unidep/_pixi_lock.py new file mode 100644 index 00000000..0e452f94 --- /dev/null +++ b/unidep/_pixi_lock.py @@ -0,0 +1,307 @@ +from __future__ import annotations + +import shutil +import subprocess +import sys +from typing import TYPE_CHECKING, Any, NamedTuple + +from ruamel.yaml import YAML + +from unidep._dependencies_parsing import find_requirements_files +from unidep._pixi import feature_name_to_env_name +from unidep.utils import add_comment_to_file, change_directory + +if TYPE_CHECKING: + from pathlib import Path + + from unidep.platform_definitions import CondaPip, Platform + + +def _run_pixi_lock( + pixi_toml: Path, + pixi_lock_output: Path, + *, + extra_flags: list[str], +) -> None: + if shutil.which("pixi") is None: + msg = ( + "Cannot find `pixi`." 
+            " Please install it, see the documentation"
+            " at https://pixi.sh/latest/"
+        )
+        raise RuntimeError(msg)
+    if pixi_lock_output.exists():
+        print(f"🗑️ Removing existing `{pixi_lock_output}`")
+        pixi_lock_output.unlink()
+
+    cmd = ["pixi", "lock", "--manifest-path", str(pixi_toml), *extra_flags]
+    print(f"🔒 Locking dependencies with `{' '.join(cmd)}`\n")
+    try:
+        with change_directory(pixi_toml.parent):
+            subprocess.run(cmd, check=True, text=True)
+        # Prepend a comment header to the generated lock file.
+        add_comment_to_file(
+            pixi_lock_output,
+            extra_lines=[
+                "#",
+                "# This environment can be installed with",
+                "# `pixi install`",
+                "# This file is a `pixi.lock` file generated via `unidep`.",
+                "# For details see https://pixi.sh/",
+            ],
+        )
+    except subprocess.CalledProcessError as e:
+        print("❌ Error occurred:\n", e)
+        print("Return code:", e.returncode)
+        print("Output:", e.output)
+        print("Error Output:", e.stderr)
+        sys.exit(1)
+
+
+def _pixi_lock_global(
+    *,
+    depth: int,
+    directory: Path,
+    files: list[Path] | None,
+    platforms: list[Platform],
+    verbose: bool,
+    ignore_pins: list[str],
+    skip_dependencies: list[str],
+    overwrite_pins: list[str],
+    extra_flags: list[str],
+) -> Path:
+    """Generate a `pixi.lock` file for the global dependencies."""
+    from unidep._cli import _merge_command
+
+    if files:
+        directory = files[0].parent
+
+    pixi_toml = directory / "pixi.toml"
+    pixi_lock_output = directory / "pixi.lock"
+    _merge_command(
+        depth=depth,
+        directory=directory,
+        files=files,
+        name="myenv",
+        output=pixi_toml,
+        stdout=False,
+        selector="comment",
+        platforms=platforms,
+        ignore_pins=ignore_pins,
+        skip_dependencies=skip_dependencies,
+        overwrite_pins=overwrite_pins,
+        pixi=True,
+        verbose=verbose,
+    )
+    _run_pixi_lock(
+        pixi_toml,
+        pixi_lock_output,
+        extra_flags=extra_flags,
+    )
+    print("✅ Global dependencies locked successfully in `pixi.lock`.")
+    return pixi_lock_output
+
+
+class PixiLockSpec(NamedTuple):
+    """A specification of the pixi lock file."""
+
+    packages: dict[tuple[CondaPip, Platform, str], list[dict[str, Any]]]
+    dependencies: dict[tuple[CondaPip, Platform, str], set[str]]
+    channels: list[dict[str, str]]
+    indexes: list[str]
+
+
+def _check_consistent_lock_files(
+    global_lock_file: Path,
+    sub_lock_files: list[Path],
+) -> list[str]:
+    yaml = YAML(typ="safe")
+    with global_lock_file.open() as fp:
+        global_data = yaml.load(fp)
+
+    global_packages = set()
+    environments = global_data.get("environments", {})
+    for env_data in environments.values():
+        for packages_list in env_data.get("packages", {}).values():
+            for pkg_entry in packages_list:
+                # Each pkg_entry is a dict like {"conda": <url>} or {"pypi": <url>}.
+                for url in pkg_entry.values():
+                    global_packages.add(url)
+
+    mismatches = []
+    for lock_file in sub_lock_files:
+        with lock_file.open() as fp:
+            data = yaml.load(fp)
+
+        sub_packages = set()
+        environments = data.get("environments", {})
+        for env_data in environments.values():
+            for packages_list in env_data.get("packages", {}).values():
+                for pkg_entry in packages_list:
+                    for url in pkg_entry.values():
+                        sub_packages.add(url)
+
+        if not sub_packages.issubset(global_packages):
+            missing = sub_packages - global_packages
+            mismatches.append(
+                f"Packages {missing} in {lock_file} not found in global lock file.",
+            )
+
+    return mismatches
+
+
+def _generate_sub_lock_file(
+    feature_name: str,
+    global_lock_data: dict[str, Any],
+    yaml_obj: YAML,
+    output_dir: Path,
+) -> Path:
+    """Generate a sub-lock file for a given feature.
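+
+    For example (hypothetical layout): a requirements file in
+    ``hatch_project/`` yields the feature name ``hatch_project``, which maps
+    to the environment ``hatch-project`` in the global lock file.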
+
+    Parameters
+    ----------
+    feature_name
+        The name of the feature (derived from the parent folder's stem).
+    global_lock_data
+        The global lock file data as a dict.
+    yaml_obj
+        A ruamel.yaml YAML instance used for dumping.
+    output_dir
+        The directory where the sub-lock file should be written.
+
+    Returns
+    -------
+    Path
+        The path to the newly written sub-lock file.
+
+    Notes
+    -----
+    The new lock file contains a single environment ("default") whose contents
+    are exactly the environment for the given feature in the global lock file.
+    It includes only the package entries from the global "packages" list that
+    are used by that environment.
+
+    """
+    # Look up the environment for the given feature.
+    envs = global_lock_data.get("environments", {})
+    env_name = feature_name_to_env_name(feature_name)
+    env_data = envs.get(env_name)
+    if env_data is None:
+        msg = f"Feature '{feature_name}' not found in the global lock file."
+        raise ValueError(msg)
+
+    # Create a new lock dictionary with the version and a single environment
+    # renamed to "default".
+    new_lock = {
+        "version": global_lock_data.get("version"),
+        "environments": {"default": env_data},
+    }
+
+    # Collect all URLs from the environment's package list. The environment
+    # data is expected to have a "packages" key mapping each platform to a
+    # list of package entry dicts.
+    used_urls = set()
+    env_packages = env_data.get("packages", {})
+    for pkg_list in env_packages.values():
+        for pkg_entry in pkg_list:
+            # Each pkg_entry is a dict with one key: either "conda" or "pypi".
+            for url in pkg_entry.values():
+                used_urls.add(url)
+
+    # Filter the global packages list to include only entries used in this env.
+    global_packages = global_lock_data.get("packages", [])
+    filtered_packages = [
+        pkg
+        for pkg in global_packages
+        if (pkg.get("conda") in used_urls) or (pkg.get("pypi") in used_urls)
+    ]
+    new_lock["packages"] = filtered_packages
+
+    # Write the new lock file into `output_dir` as "pixi.lock".
+    output_file = output_dir / "pixi.lock"
+    with output_file.open("w") as f:
+        yaml_obj.dump(new_lock, f)
+    return output_file
+
+
+def pixi_lock_command(
+    *,
+    depth: int,
+    directory: Path,
+    files: list[Path] | None,
+    platforms: list[Platform],
+    verbose: bool,
+    only_global: bool,
+    ignore_pins: list[str],
+    skip_dependencies: list[str],
+    overwrite_pins: list[str],
+    extra_flags: list[str],
+) -> None:
+    """Generate a `pixi.lock` file for a collection of dependencies.
+
+    This command first creates a global lock file (via `_pixi_lock_global`).
+    Then, unless `only_global` is set or specific files were passed, it scans
+    for requirements files in subdirectories. For each such file, it derives a
+    feature name from the parent directory's stem and generates a sub-lock
+    file containing a single environment called "default", built from the
+    corresponding environment in the global lock file.
+    """
+    # Process extra flags, which must be prefixed with "--".
+    if extra_flags:
+        assert extra_flags[0] == "--"
+        extra_flags = extra_flags[1:]
+        if verbose:
+            print(f"📝 Extra flags for `pixi lock`: {extra_flags}")
+
+    # Step 1: Generate the global lock file.
+    global_lock_file = _pixi_lock_global(
+        depth=depth,
+        directory=directory,
+        files=files,
+        platforms=platforms,
+        verbose=verbose,
+        ignore_pins=ignore_pins,
+        overwrite_pins=overwrite_pins,
+        skip_dependencies=skip_dependencies,
+        extra_flags=extra_flags,
+    )
+    # If only_global or specific files were provided, do not generate sub-lock files.
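+    # Otherwise, e.g. (hypothetical layout), `hatch_project/requirements.yaml`
+    # gets its own `hatch_project/pixi.lock`, consistent with the global
+    # `pixi.lock` in `directory`.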
+    if only_global or files:
+        return
+
+    # Step 2: Load the global lock file.
+    yaml_obj = YAML(typ="rt")
+    with global_lock_file.open() as fp:
+        global_lock_data = yaml_obj.load(fp)
+
+    # Step 3: Find all requirements files in subdirectories.
+    found_files = find_requirements_files(directory, depth)
+    sub_lock_files = []
+    for req_file in found_files:
+        # Skip files in the root directory.
+        if req_file.parent == directory:
+            continue
+
+        # Derive feature name from the parent directory's stem.
+        feature_name = req_file.resolve().parent.stem
+        if verbose:
+            print(
+                f"🔍 Processing sub-lock for feature '{feature_name}' from file: {req_file}",  # noqa: E501
+            )
+        sub_lock_file = _generate_sub_lock_file(
+            feature_name=feature_name,
+            global_lock_data=global_lock_data,
+            yaml_obj=yaml_obj,
+            output_dir=req_file.parent,
+        )
+
+        print(f"📝 Generated sub-lock file for '{req_file}': {sub_lock_file}")
+        sub_lock_files.append(sub_lock_file)
+
+    # Step 4: Check consistency between the global and the sub-lock files.
+    mismatches = _check_consistent_lock_files(
+        global_lock_file=global_lock_file,
+        sub_lock_files=sub_lock_files,
+    )
+    if not mismatches:
+        print("✅ Analyzed all lock files and found no inconsistencies.")
+    else:
+        print("❌ Mismatches found:")
+        for mismatch in mismatches:
+            print(mismatch)
diff --git a/unidep/platform_definitions.py b/unidep/platform_definitions.py
index a4151c31..31517534 100644
--- a/unidep/platform_definitions.py
+++ b/unidep/platform_definitions.py
@@ -6,7 +6,10 @@
 from __future__ import annotations
 
 import sys
-from typing import NamedTuple, cast
+from typing import TYPE_CHECKING, NamedTuple, cast
+
+if TYPE_CHECKING:
+    from pathlib import Path
 
 if sys.version_info >= (3, 8):
     from typing import Literal, get_args
@@ -120,6 +123,7 @@ class Spec(NamedTuple):
     identifier: str | None = None
     # can be of type `Selector` but also space separated string of `Selector`s
     selector: str | None = None
+    origin: tuple[Path, ...] = ()
 
     def platforms(self) -> list[Platform] | None:
         """Return the platforms for this dependency."""
diff --git a/unidep/utils.py b/unidep/utils.py
index a9ceb2fb..9f4ae19a 100644
--- a/unidep/utils.py
+++ b/unidep/utils.py
@@ -6,13 +6,15 @@
 from __future__ import annotations
 
 import codecs
+import os
 import platform
 import re
 import sys
 import warnings
 from collections import defaultdict
+from contextlib import contextmanager
 from pathlib import Path
-from typing import Any, NamedTuple, cast
+from typing import Any, Generator, NamedTuple, cast
 
 from unidep._version import __version__
 from unidep.platform_definitions import (
@@ -360,3 +362,14 @@ def get_package_version(package_name: str) -> str | None:
         return pkg_resources.get_distribution(package_name).version
     except pkg_resources.DistributionNotFound:
         return None
+
+
+@contextmanager
+def change_directory(new_path: str | Path) -> Generator[None, None, None]:
+    """Temporarily change the current working directory."""
+    original_path = os.getcwd()  # noqa: PTH109
+    try:
+        os.chdir(new_path)
+        yield
+    finally:
+        os.chdir(original_path)
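As a quick illustration (not part of the diff above), here is a minimal sketch of how the new `change_directory` helper in `unidep/utils.py` is expected to behave; the temporary directory is hypothetical scaffolding:

```python
# Minimal usage sketch for `change_directory` (assumes unidep is installed).
import tempfile
from pathlib import Path

from unidep.utils import change_directory

original = Path.cwd()
with tempfile.TemporaryDirectory() as tmp:
    with change_directory(tmp):
        # Inside the block, relative paths resolve against `tmp`.
        assert Path.cwd() == Path(tmp).resolve()
    # The original working directory is restored on exit...
    assert Path.cwd() == original

# ...even when the body raises.
try:
    with change_directory("/"):
        raise RuntimeError("boom")
except RuntimeError:
    pass
assert Path.cwd() == original
```

Restoring the directory in a `finally` block is what makes `_run_pixi_lock` safe: even if `pixi lock` fails inside the `with change_directory(...)` block, the caller's working directory is left untouched.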