From f97da413507870c26b762a596380cd49b470d3ad Mon Sep 17 00:00:00 2001 From: "google-labs-jules[bot]" <161369871+google-labs-jules[bot]@users.noreply.github.com> Date: Fri, 20 Jun 2025 04:10:44 +0000 Subject: [PATCH 1/5] Fix typos and grammar in documentation This commit addresses various typos, grammatical errors, punctuation issues, and inconsistencies throughout the documentation files. The goal is to improve clarity and readability. --- docs/README.md | 16 +-- docs/_includes/py_console_script_binary.md | 22 ++-- docs/coverage.md | 4 +- docs/devguide.md | 28 ++-- docs/environment-variables.md | 30 ++--- docs/extending.md | 12 +- docs/gazelle.md | 4 +- docs/getting-started.md | 10 +- docs/glossary.md | 10 +- docs/index.md | 22 ++-- docs/precompiling.md | 42 +++--- docs/pypi/circular-dependencies.md | 18 +-- docs/pypi/download-workspace.md | 12 +- docs/pypi/download.md | 68 +++++----- docs/pypi/index.md | 6 +- docs/pypi/lock.md | 2 +- docs/pypi/patch.md | 6 +- docs/pypi/use.md | 42 +++--- docs/repl.md | 2 +- docs/support.md | 22 ++-- docs/toolchains.md | 146 ++++++++++----------- 21 files changed, 262 insertions(+), 262 deletions(-) diff --git a/docs/README.md b/docs/README.md index d98be41232..456f1cfd64 100644 --- a/docs/README.md +++ b/docs/README.md @@ -1,14 +1,14 @@ # rules_python Sphinx docs generation The docs for rules_python are generated using a combination of Sphinx, Bazel, -and Readthedocs.org. The Markdown files in source control are unlikely to render +and Read the Docs. The Markdown files in source control are unlikely to render properly without the Sphinx processing step because they rely on Sphinx and MyST-specific Markdown functionality. The actual sources that Sphinx consumes are in this directory, with Stardoc -generating additional sources or Sphinx. +generating additional sources for Sphinx. 
-Manually building the docs isn't necessary -- readthedocs.org will +Manually building the docs isn't necessary -- Read the Docs will automatically build and deploy them when commits are pushed to the repo. ## Generating docs for development @@ -31,8 +31,8 @@ equivalent bazel command if desired. ### Installing ibazel The `ibazel` tool can be used to automatically rebuild the docs as you -development them. See the [ibazel docs](https://github.com/bazelbuild/bazel-watcher) for -how to install it. The quick start for linux is: +develop them. See the [ibazel docs](https://github.com/bazelbuild/bazel-watcher) for +how to install it. The quick start for Linux is: ``` sudo apt install npm @@ -57,9 +57,9 @@ docs/. The Sphinx configuration is `docs/conf.py`. See https://www.sphinx-doc.org/ for details about the configuration file. -## Readthedocs configuration +## Read the Docs configuration -There's two basic parts to the readthedocs configuration: +There are two basic parts to the Read the Docs configuration: * `.readthedocs.yaml`: This configuration file controls most settings, such as the OS version used to build, Python version, dependencies, what Bazel @@ -69,4 +69,4 @@ There's two basic parts to the readthedocs configuration: controls additional settings such as permissions, what versions are published, when to publish changes, etc. -For more readthedocs configuration details, see docs.readthedocs.io. +For more Read the Docs configuration details, see docs.readthedocs.io. diff --git a/docs/_includes/py_console_script_binary.md b/docs/_includes/py_console_script_binary.md index d327091630..08d931773d 100644 --- a/docs/_includes/py_console_script_binary.md +++ b/docs/_includes/py_console_script_binary.md @@ -1,8 +1,8 @@ This rule is to make it easier to generate `console_script` entry points as per Python [specification]. -Generate a `py_binary` target for a particular console_script `entry_point` -from a PyPI package, e.g.
for creating an executable `pylint` target use: +Generate a `py_binary` target for a particular `console_script` entry point +from a PyPI package. For example, to create an executable `pylint` target, use: ```starlark load("@rules_python//python/entry_points:py_console_script_binary.bzl", "py_console_script_binary") py_console_script_binary( name = "pylint", pkg = "@pip//pylint", ) ``` -#### Specifying extra dependencies You can also specify extra dependencies and the -exact script name you want to call. It is useful for tools like `flake8`, `pylint`, -`pytest`, which have plugin discovery methods and discover dependencies from the +#### Specifying extra dependencies You can also specify extra dependencies and the +exact script name you want to call. This is useful for tools like `flake8`, `pylint`, +and `pytest`, which have plugin discovery methods and discover dependencies from the PyPI packages available in the `PYTHONPATH`. ```starlark load("@rules_python//python/entry_points:py_console_script_binary.bzl", "py_console_script_binary") @@ -44,13 +44,13 @@ load("@rules_python//python/entry_points:py_console_script_binary.bzl", "py_cons py_console_script_binary( name = "yamllint", pkg = "@pip//yamllint", - python_version = "3.9" + python_version = "3.9", ) ``` #### Adding a Shebang Line -You can specify a shebang line for the generated binary, useful for Unix-like +You can specify a shebang line for the generated binary. This is useful for Unix-like systems where the shebang line determines which interpreter is used to execute the script, per [PEP441]: Python interpreter is available in the environment. #### Using a specific Python Version directly from a Toolchain :::{deprecated} 1.1.0 -The toolchain specific `py_binary` and `py_test` symbols are aliases to the regular rules. -i.e. Deprecated `load("@python_versions//3.11:defs.bzl", "py_binary")` and `load("@python_versions//3.11:defs.bzl", "py_test")` +The toolchain-specific `py_binary` and `py_test` symbols are aliases to the regular rules.
+For example, `load("@python_versions//3.11:defs.bzl", "py_binary")` and `load("@python_versions//3.11:defs.bzl", "py_test")` are deprecated. -You should instead specify the desired python version with `python_version`; see above example. +You should instead specify the desired Python version with `python_version`; see the example above. ::: -Alternatively, the [`py_console_script_binary.binary_rule`] arg can be passed +Alternatively, the `py_console_script_binary.binary_rule` arg can be passed the version-bound `py_binary` symbol, or any other `py_binary`-compatible rule of your choosing: ```starlark diff --git a/docs/coverage.md b/docs/coverage.md index 3e0e67368c..023173d396 100644 --- a/docs/coverage.md +++ b/docs/coverage.md @@ -9,7 +9,7 @@ when configuring toolchains. ## Enabling `rules_python` coverage support Enabling the coverage support bundled with `rules_python` just requires setting an -argument when registerting toolchains. +argument when registering toolchains. For Bzlmod: @@ -32,7 +32,7 @@ python_register_toolchains( This will implicitly add the version of `coverage` bundled with `rules_python` to the dependencies of `py_test` rules when `bazel coverage` is run. If a target already transitively depends on a different version of -`coverage`, then behavior is undefined -- it is undefined which version comes +`coverage`, then the behavior is undefined -- it is undefined which version comes first in the import path. If you find yourself in this situation, then you'll need to manually configure coverage (see below). ::: diff --git a/docs/devguide.md b/docs/devguide.md index f233611cad..345907b374 100644 --- a/docs/devguide.md +++ b/docs/devguide.md @@ -1,7 +1,7 @@ # Dev Guide -This document covers tips and guidance for working on the rules_python code -base. A primary audience for it is first time contributors. +This document covers tips and guidance for working on the `rules_python` code +base. Its primary audience is first-time contributors.
## Running tests @@ -12,8 +12,8 @@ bazel test //... ``` And it will run all the tests it can find. The first time you do this, it will -probably take long time because various dependencies will need to be downloaded -and setup. Subsequent runs will be faster, but there are many tests, and some of +probably take a long time because various dependencies will need to be downloaded +and set up. Subsequent runs will be faster, but there are many tests, and some of them are slow. If you're working on a particular area of code, you can run just the tests in those directories instead, which can speed up your edit-run cycle. @@ -22,14 +22,14 @@ the tests in those directories instead, which can speed up your edit-run cycle. Most code should have tests of some sort. This helps us have confidence that refactors didn't break anything and that releases won't have regressions. -We don't require 100% test coverage, testing certain Bazel functionality is +We don't require 100% test coverage; testing certain Bazel functionality is difficult, and some edge cases are simply too hard to test or not worth the extra complexity. We try to judiciously decide when not having tests is a good idea. Tests go under `tests/`. They are loosely organized into directories for the particular subsystem or functionality they are testing. If an existing directory -doesn't seem like a good match for the functionality being testing, then it's +doesn't seem like a good match for the functionality being tested, then it's fine to create a new directory. Re-usable test helpers and support code go in `tests/support`. Tests don't need @@ -72,9 +72,9 @@ the rule. To have it support setting a new flag: An integration test is one that runs a separate Bazel instance inside the test. These tests are discouraged unless absolutely necessary because they are slow, -require much memory and CPU, and are generally harder to debug. 
Integration -tests are reserved for things that simple can't be tested otherwise, or for -simple high level verification tests. +require a lot of memory and CPU, and are generally harder to debug. Integration +tests are reserved for things that simply can't be tested otherwise, or for +simple high-level verification tests. Integration tests live in `tests/integration`. When possible, add to an existing integration test. @@ -98,9 +98,9 @@ integration test. ## Updating tool dependencies -It's suggested to routinely update the tool versions within our repo - some of the -tools are using requirement files compiled by `uv` and others use other means. In order -to have everything self-documented, we have a special target - -`//private:requirements.update`, which uses `rules_multirun` to run in sequence all -of the requirement updating scripts in one go. This can be done once per release as +It's suggested to routinely update the tool versions within our repo. Some of the +tools are using requirement files compiled by `uv`, and others use other means. In order +to have everything self-documented, we have a special target, +`//private:requirements.update`, which uses `rules_multirun` to run all +of the requirement-updating scripts in sequence in one go. This can be done once per release as we prepare for releases. diff --git a/docs/environment-variables.md b/docs/environment-variables.md index 8a51bcbfd2..8fa0a82321 100644 --- a/docs/environment-variables.md +++ b/docs/environment-variables.md @@ -5,16 +5,16 @@ This variable allows for additional arguments to be provided to the Python interpreter at bootstrap time when the `bash` bootstrap is used. If `RULES_PYTHON_ADDITIONAL_INTERPRETER_ARGS` were provided as `-Xaaa`, then the command -would be; +would be: ``` python -Xaaa /path/to/file.py ``` This feature is likely to be useful for the integration of debuggers. 
For example, -it would be possible to configure the `RULES_PYTHON_ADDITIONAL_INTERPRETER_ARGS` to -be set to `/path/to/debugger.py --port 12344 --file` resulting -in the command executed being; +it would be possible to configure `RULES_PYTHON_ADDITIONAL_INTERPRETER_ARGS` to +be set to `/path/to/debugger.py --port 12344 --file`, resulting +in the command executed being: ``` python /path/to/debugger.py --port 12345 --file /path/to/file.py @@ -38,18 +38,18 @@ stderr. When `1`, bzlmod extensions will print debug information about what they're doing. This is mostly useful for development to debug errors. -::: +::: :::{envvar} RULES_PYTHON_DEPRECATION_WARNINGS -When `1`, the rules_python will warn users about deprecated functionality that will +When `1`, `rules_python` will warn users about deprecated functionality that will be removed in a subsequent major `rules_python` version. Defaults to `0` if unset. -::: +::: ::::{envvar} RULES_PYTHON_ENABLE_PYSTAR -When `1`, the rules_python Starlark implementation of the core rules is used -instead of the Bazel-builtin rules. Note this requires Bazel 7+. Defaults +When `1`, the `rules_python` Starlark implementation of the core rules is used +instead of the Bazel-builtin rules. Note that this requires Bazel 7+. Defaults to `1`. :::{versionadded} 0.26.0 @@ -62,7 +62,7 @@ The default became `1` if unspecified ::::{envvar} RULES_PYTHON_ENABLE_PIPSTAR -When `1`, the rules_python Starlark implementation of the pypi/pip integration is used +When `1`, the `rules_python` Starlark implementation of the PyPI/pip integration is used instead of the legacy Python scripts. :::{versionadded} 1.5.0 @@ -95,8 +95,8 @@ exit. :::{envvar} RULES_PYTHON_GAZELLE_VERBOSE -When `1`, debug information from gazelle is printed to stderr. -::: +When `1`, debug information from Gazelle is printed to stderr. +::: :::{envvar} RULES_PYTHON_PIP_ISOLATED @@ -125,13 +125,13 @@ Determines the verbosity of logging output for repo rules.
Valid values: :::{envvar} RULES_PYTHON_REPO_TOOLCHAIN_VERSION_OS_ARCH -Determines the python interpreter platform to be used for a particular +Determines the Python interpreter platform to be used for a particular interpreter `(version, os, arch)` triple to be used in repository rules. -Replace the `VERSION_OS_ARCH` part with actual values when using, e.g. +Replace the `VERSION_OS_ARCH` part with actual values when using, e.g., `3_13_0_linux_x86_64`. The version values must have `_` instead of `.` and the os, arch values are the same as the ones mentioned in the `//python:versions.bzl` file. -::: +::: :::{envvar} VERBOSE_COVERAGE diff --git a/docs/extending.md b/docs/extending.md index 387310e6cf..00018fbd74 100644 --- a/docs/extending.md +++ b/docs/extending.md @@ -41,10 +41,10 @@ wrappers around the keyword arguments eventually passed to the `rule()` function. These builder APIs give access to the _entire_ rule definition and allow arbitrary modifications. -This is level of control is powerful, but also volatile. A rule definition +This level of control is powerful but also volatile. A rule definition contains many details that _must_ change as the implementation changes. What is more or less likely to change isn't known in advance, but some general -rules are: +rules of thumb are: * Additive behavior to public attributes will be less prone to breaking. * Internal attributes that directly support a public attribute are likely @@ -55,7 +55,7 @@ rules are: ## Example: validating a source file -In this example, we derive from `py_library` a custom rule that verifies source +In this example, we derive a custom rule from `py_library` that verifies source code contains the word "snakes".
It does this by: * Adding an implicit dependency on a checker program @@ -111,7 +111,7 @@ has_snakes_library = create_has_snakes_rule() ## Example: adding transitions -In this example, we derive from `py_binary` to force building for a particular +In this example, we derive a custom rule from `py_binary` to force building for a particular platform. We do this by: * Adding an additional output to the rule's cfg @@ -136,8 +136,8 @@ def create_rule(): r.cfg.add_output("//command_line_option:platforms") return r.build() -py_linux_binary = create_linux_binary_rule() +py_linux_binary = create_rule() ``` -Users can then use `py_linux_binary` the same as a regular py_binary. It will +Users can then use `py_linux_binary` the same as a regular `py_binary`. It will act as if `--platforms=//my/platforms:linux` was specified when building it. diff --git a/docs/gazelle.md b/docs/gazelle.md index 89f26d67bb..60b46faf2c 100644 --- a/docs/gazelle.md +++ b/docs/gazelle.md @@ -3,7 +3,7 @@ [Gazelle](https://github.com/bazelbuild/bazel-gazelle) is a build file generator for Bazel projects. It can create new `BUILD.bazel` files for a project that follows language conventions and update existing build files to include new sources, dependencies, and options. -Bazel may run Gazelle using the Gazelle rule, or it may be installed and run as a command line tool. +Bazel may run Gazelle using the Gazelle rule, or Gazelle may be installed and run as a command line tool. -See the documentation for Gazelle with rules_python in the {gh-path}`gazelle` +See the documentation for Gazelle with `rules_python` in the {gh-path}`gazelle` directory. diff --git a/docs/getting-started.md b/docs/getting-started.md index 7e7b88aa8a..d81d72f590 100644 --- a/docs/getting-started.md +++ b/docs/getting-started.md @@ -1,14 +1,14 @@ # Getting started -This doc is a simplified guide to help get started quickly. It provides +This document is a simplified guide to help you get started quickly. 
It provides a simplified introduction to having a working Python program for both `bzlmod` and the older way of using `WORKSPACE`. It assumes you have a `requirements.txt` file with your PyPI dependencies. -For more details information about configuring `rules_python`, see: +For more detailed information about configuring `rules_python`, see: * [Configuring the runtime](configuring-toolchains) -* [Configuring third party dependencies (pip/pypi)](./pypi/index) +* [Configuring third-party dependencies (pip/PyPI)](./pypi/index) * [API docs](api/index) ## Including dependencies @@ -32,7 +32,7 @@ use_repo(pip, "pypi") ### Using a WORKSPACE file -Using WORKSPACE is deprecated, but still supported, and a bit more involved than +Using `WORKSPACE` is deprecated but still supported, and it's a bit more involved than using Bzlmod. Here is a simplified setup to download the prebuilt runtimes. ```starlark @@ -72,7 +72,7 @@ pip_parse( ## "Hello World" -Once you've imported the rule set using either Bzlmod or WORKSPACE, you can then +Once you've imported the rule set using either Bzlmod or `WORKSPACE`, you can then load the core rules in your `BUILD` files with the following: ```starlark diff --git a/docs/glossary.md b/docs/glossary.md index 9afbcffb92..c9bd03fd0e 100644 --- a/docs/glossary.md +++ b/docs/glossary.md @@ -5,7 +5,7 @@ common attributes : Every rule has a set of common attributes. See Bazel's [Common attributes](https://bazel.build/reference/be/common-definitions#common-attributes) - for a complete listing + for a complete listing. in-build runtime : An in-build runtime is one where the Python runtime, and all its files, are @@ -21,9 +21,9 @@ which can be a significant number of files. platform runtime : A platform runtime is a Python runtime that is assumed to be installed on the -system where a Python binary runs, whereever that may be. For example, using `/usr/bin/python3` +system where a Python binary runs, wherever that may be. 
For example, using `/usr/bin/python3` as the interpreter is a platform runtime -- it assumes that, wherever the binary -runs (your local machine, a remote worker, within a container, etc), that path +runs (your local machine, a remote worker, within a container, etc.), that path is available. Such runtimes are _not_ part of a binary's runfiles. The main advantage of platform runtimes is they are lightweight insofar as @@ -42,8 +42,8 @@ rule callable accepted; refer to the respective API accepting this type. simple label -: A `str` or `Label` object but not a _direct_ `select` object. These usually - mean a string manipulation is occuring, which can't be done on `select` +: A `str` or `Label` object but not a _direct_ `select` object. This usually + means a string manipulation is occurring, which can't be done on `select` objects. Such attributes are usually still configurable if an alias is used, and a reference to the alias is passed instead. diff --git a/docs/index.md b/docs/index.md index 82023f3ad8..25b423c6c3 100644 --- a/docs/index.md +++ b/docs/index.md @@ -1,6 +1,6 @@ # Python Rules for Bazel -`rules_python` is the home for 4 major components with varying maturity levels. +`rules_python` is the home for four major components with varying maturity levels. :::{topic} Core rules The core Python rules -- `py_library`, `py_binary`, `py_test`, support in Bazel. When using Bazel 6 (or earlier), the core rules are bundled into the Bazel binary, and the symbols -in this repository are simple aliases. On Bazel 7 and above `rules_python` uses -a separate Starlark implementation, +in this repository are simple aliases. On Bazel 7 and above, `rules_python` uses +a separate Starlark implementation; see {ref}`Migrating from the Bundled Rules` below. This repository follows @@ -21,12 +21,12 @@ outlined in the [support](support) page.
:::{topic} PyPI integration -Package installation rules for integrating with PyPI and other SimpleAPI +Package installation rules for integrating with PyPI and other Simple API compatible indexes. These rules work and can be used in production, but the cross-platform building that supports pulling PyPI dependencies for a target platform that is different -from the host platform is still in beta and the APIs that are subject to potential +from the host platform is still in beta, and the APIs that are subject to potential change are marked as `experimental`. ::: @@ -36,9 +36,9 @@ change are marked as `experimental`. `sphinxdocs` rules allow users to generate documentation using Sphinx powered by Bazel, with additional functionality for documenting Starlark and Bazel code. -The functionality is exposed because other projects find it useful, but -it is available as is and **the semantic versioning and -compatibility policy used by `rules_python` does not apply**. +The functionality is exposed because other projects find it useful, but +it is available "as is", and **the semantic versioning and +compatibility policy used by `rules_python` does not apply**. ::: @@ -47,7 +47,7 @@ compatibility policy used by `rules_python` does not apply**. `gazelle` plugin for generating `BUILD.bazel` files based on Python source code. -This is available as is and the semantic versioning used by `rules_python` does +This is available "as is", and the semantic versioning used by `rules_python` does not apply. ::: @@ -78,7 +78,7 @@ appropriate `load()` statements and rewrite uses of `native.py_*`. buildifier --lint=fix --warnings=native-py ``` -Currently, the `WORKSPACE` file needs to be updated manually as per +Currently, the `WORKSPACE` file needs to be updated manually as per [Getting started](getting-started). Note that Starlark-defined bundled symbols underneath @@ -87,7 +87,7 @@ by buildifier.
## Migrating to bzlmod -See {gh-path}`Bzlmod support ` for any behaviour differences between +See {gh-path}`Bzlmod support ` for any behavioral differences between `bzlmod` and `WORKSPACE`. diff --git a/docs/precompiling.md b/docs/precompiling.md index a46608f77e..8fbee46c17 100644 --- a/docs/precompiling.md +++ b/docs/precompiling.md @@ -1,6 +1,6 @@ # Precompiling -Precompiling is compiling Python source files (`.py` files) into byte code +Precompiling is compiling Python source files (`.py` files) into bytecode (`.pyc` files) at build time instead of runtime. Doing it at build time can improve performance by skipping that work at runtime. @@ -13,14 +13,14 @@ While precompiling helps runtime performance, it has two main costs: 1. Increasing the size (count and disk usage) of runfiles. It approximately doubles the count of the runfiles because for every `.py` file, there is also a `.pyc` file. Compiled files are generally around the same size as the - source files, so it approximately doubles the disk usage. + source files, so it approximately doubles disk usage. 2. Precompiling requires running an extra action at build time. While - compiling itself isn't that expensive, the overhead can become noticable + compiling itself isn't that expensive, the overhead can become noticeable as more files need to be compiled. ## Binary-level opt-in -Binary-level opt-in allows enabling precompiling on a per-target basic. This is +Binary-level opt-in allows enabling precompiling on a per-target basis. This is useful for situations such as: * Globally enabling precompiling in your `.bazelrc` isn't feasible. This may @@ -41,7 +41,7 @@ can use an opt-in or opt-out approach by setting its value: ## Pyc-only builds -A pyc-only build (aka "source less" builds) is when only `.pyc` files are +A pyc-only build (aka "sourceless" builds) is when only `.pyc` files are included; the source `.py` files are not included.
To enable this, set @@ -55,8 +55,8 @@ The advantage of pyc-only builds are: The disadvantages are: * Error messages will be less precise because the precise line and offset - information isn't in an pyc file. -* pyc files are Python major-version specific. + information isn't in a pyc file. +* pyc files are Python major-version-specific. :::{note} pyc files are not a form of hiding source code. They are trivial to uncompile, @@ -75,11 +75,11 @@ mechanisms are available: the {bzl:attr}`precompiler` attribute. Arbitrary binaries are supported. * The execution requirements can be customized using `--@rules_python//tools/precompiler:execution_requirements`. This is a list - flag that can be repeated. Each entry is a key=value that is added to the + flag that can be repeated. Each entry is a `key=value` pair that is added to the execution requirements of the `PyCompile` action. Note that this flag - is specific to the rules_python precompiler. If a custom binary is used, + is specific to the `rules_python` precompiler. If a custom binary is used, this flag will have to be propagated from the custom binary using the - `testing.ExecutionInfo` provider; refer to the `py_interpreter_program` an + `testing.ExecutionInfo` provider; refer to the `py_interpreter_program` example. The default precompiler implementation is an asynchronous/concurrent implementation. If you find it has bugs or hangs, please report them. In the @@ -90,18 +90,18 @@ as well, but is less likely to have issues. The `execution_requirements` keys of most relevance are: * `supports-workers`: 1 or 0, to indicate if a regular persistent worker is desired. -* `supports-multiplex-workers`: 1 o 0, to indicate if a multiplexed persistent +* `supports-multiplex-workers`: `1` or `0`, to indicate if a multiplexed persistent worker is desired. -* `requires-worker-protocol`: json or proto; the rules_python precompiler - currently only supports json. 
-* `supports-multiplex-sandboxing`: 1 or 0, to indicate if sanboxing is of the +* `requires-worker-protocol`: `json` or `proto`; the `rules_python` precompiler + currently only supports `json`. +* `supports-multiplex-sandboxing`: `1` or `0`, to indicate if sandboxing of the worker is supported. -* `supports-worker-cancellation`: 1 or 1, to indicate if requests to the worker +* `supports-worker-cancellation`: `1` or `0`, to indicate if requests to the worker can be cancelled. Note that any execution requirements values can be specified in the flag. -## Known issues, caveats, and idiosyncracies +## Known issues, caveats, and idiosyncrasies * Precompiling requires Bazel 7+ with the Pystar rule implementation enabled. * Mixing rules_python PyInfo with Bazel builtin PyInfo will result in pyc files @@ -111,14 +111,14 @@ Note that any execution requirements values can be specified in the flag. causes the module to be found in the workspace source directory instead of within the binary's runfiles directory (where the pyc files are). This can usually be worked around by removing `sys.path[0]` (or otherwise ensuring the - runfiles directory comes before the repos source directory in `sys.path`). -* The pyc filename does not include the optimization level (e.g. - `foo.cpython-39.opt-2.pyc`). This works fine (it's all byte code), but also + runfiles directory comes before the repo's source directory in `sys.path`). +* The pyc filename does not include the optimization level (e.g., + `foo.cpython-39.opt-2.pyc`). This works fine (it's all bytecode), but also means the interpreter `-O` argument can't be used -- doing so will cause the interpreter to look for the non-existent `opt-N` named files. -* Targets with the same source files and different exec properites will result +* Targets with the same source files and different exec properties will result in action conflicts. This most commonly occurs when a `py_binary` and - `py_library` have the same source files. 
To fix, modify both targets so + a `py_library` have the same source files. To fix this, modify both targets so they have the same exec properties. If this is difficult because unsupported exec groups end up being passed to the Python rules, please file an issue to have those exec groups added to the Python rules. diff --git a/docs/pypi/circular-dependencies.md b/docs/pypi/circular-dependencies.md index d22f5b36a7..797b880562 100644 --- a/docs/pypi/circular-dependencies.md +++ b/docs/pypi/circular-dependencies.md @@ -3,10 +3,10 @@ # Circular dependencies -Sometimes PyPi packages contain dependency cycles -- for instance a particular -version `sphinx` (this is no longer the case in the latest version as of +Sometimes PyPI packages contain dependency cycles. For instance, a particular +version of `sphinx` (this is no longer the case in the latest version as of 2024-06-02) depends on `sphinxcontrib-serializinghtml`. When using them as -`requirement()`s, ala +`requirement()`s, like so: ```starlark py_binary( @@ -47,10 +47,10 @@ simultaneously. ) ``` -`pip_parse` supports fixing multiple cycles simultaneously, however cycles must -be distinct. `apache-airflow` for instance has dependency cycles with a number +`pip_parse` supports fixing multiple cycles simultaneously; however, cycles must +be distinct. `apache-airflow`, for instance, has dependency cycles with a number of its optional dependencies, which means those optional dependencies must all -be a part of the `airflow` cycle. For instance -- +be a part of the `airflow` cycle. For instance: ```starlark ... @@ -67,9 +67,9 @@ be a part of the `airflow` cycle. For instance -- Alternatively, one could resolve the cycle by removing one leg of it. -For example while `apache-airflow-providers-sqlite` is "baked into" the Airflow +For example, while `apache-airflow-providers-sqlite` is "baked into" the Airflow package, `apache-airflow-providers-postgres` is not and is an optional feature. 
-Rather than listing `apache-airflow[postgres]` in your `requirements.txt` which +Rather than listing `apache-airflow[postgres]` in your `requirements.txt`, which would expose a cycle via the extra, one could either _manually_ depend on `apache-airflow` and `apache-airflow-providers-postgres` separately as requirements. Bazel rules which need only `apache-airflow` can take it as a @@ -77,6 +77,6 @@ dependency, and rules which explicitly want to mix in `apache-airflow-providers-postgres` now can. Alternatively, one could use `rules_python`'s patching features to remove one -leg of the dependency manually. For instance by making +leg of the dependency manually, for instance, by making `apache-airflow-providers-postgres` not explicitly depend on `apache-airflow` or perhaps `apache-airflow-providers-common-sql`. diff --git a/docs/pypi/download-workspace.md b/docs/pypi/download-workspace.md index 48710095a4..5dfb0f257a 100644 --- a/docs/pypi/download-workspace.md +++ b/docs/pypi/download-workspace.md @@ -3,7 +3,7 @@ # Download (WORKSPACE) -This documentation page covers how to download the PyPI dependencies in the legacy `WORKSPACE` setup. +This documentation page covers how to download PyPI dependencies in the legacy `WORKSPACE` setup. To add pip dependencies to your `WORKSPACE`, load the `pip_parse` function and call it to create the central external repo and individual wheel external repos. @@ -27,7 +27,7 @@ install_deps() ## Interpreter selection -Note that pip parse runs before the Bazel before decides which Python toolchain to use, it cannot +Note that because `pip_parse` runs before Bazel decides which Python toolchain to use, it cannot enforce that the interpreter used to invoke `pip` matches the interpreter used to run `py_binary` targets. By default, `pip_parse` uses the system command `"python3"`. To override this, pass in the {attr}`pip_parse.python_interpreter` attribute or {attr}`pip_parse.python_interpreter_target`. 
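The interpreter-selection override described in the hunk above can be sketched as follows; this is an illustrative fragment, and the `@python_3_11_host//:python` label is an assumption to be replaced with the host-interpreter target that your `python_register_toolchains` call actually creates:

```starlark
load("@rules_python//python:pip.bzl", "pip_parse")

# Resolve PyPI dependencies with a hermetic interpreter instead of the
# default system "python3", so the interpreter pip_parse invokes matches
# the registered toolchain. The interpreter label below is hypothetical.
pip_parse(
    name = "pip_deps",
    requirements_lock = "//:requirements_lock.txt",
    python_interpreter_target = "@python_3_11_host//:python",
)
```

With this in place, re-running `bazel sync` re-resolves the lock file with the hermetic interpreter rather than whatever `python3` happens to be on `PATH`.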
@@ -44,9 +44,9 @@ your system `python` interpreter), you can force it to re-execute by running (per-os-arch-requirements)= ## Requirements for a specific OS/Architecture -In some cases you may need to use different requirements files for different OS, Arch combinations. +In some cases, you may need to use different requirements files for different OS and architecture combinations. This is enabled via the {attr}`pip_parse.requirements_by_platform` attribute. The keys of the -dictionary are labels to the file and the values are a list of comma separated target (os, arch) +dictionary are labels to the file, and the values are a list of comma-separated target (os, arch) tuples. For example: @@ -63,8 +63,8 @@ For example: requirements_lock = "requirements_lock.txt", ``` -In case of duplicate platforms, `rules_python` will raise an error as there has -to be unambiguous mapping of the requirement files to the (os, arch) tuples. +In case of duplicate platforms, `rules_python` will raise an error, as there has +to be an unambiguous mapping of the requirement files to the (os, arch) tuples. An alternative way is to use per-OS requirement attributes. ```starlark diff --git a/docs/pypi/download.md b/docs/pypi/download.md index 18d6699ab3..7f4e205d84 100644 --- a/docs/pypi/download.md +++ b/docs/pypi/download.md @@ -8,8 +8,8 @@ For WORKSPACE instructions see [here](./download-workspace). ::: To add PyPI dependencies to your `MODULE.bazel` file, use the `pip.parse` -extension, and call it to create the central external repo and individual wheel -external repos. Include in the `MODULE.bazel` the toolchain extension as shown +extension and call it to create the central external repo and individual wheel +external repos. Include the toolchain extension in the `MODULE.bazel` file as shown in the first bzlmod example above. 
```starlark @@ -24,7 +24,7 @@ pip.parse( use_repo(pip, "my_deps") ``` -For more documentation, see the bzlmod examples under the {gh-path}`examples` folder or the documentation +For more documentation, see the Bzlmod examples under the {gh-path}`examples` folder or the documentation for the {obj}`@rules_python//python/extensions:pip.bzl` extension. :::note} @@ -42,7 +42,7 @@ difference. ## Interpreter selection -The {obj}`pip.parse` `bzlmod` extension by default uses the hermetic python toolchain for the host +The {obj}`pip.parse` `bzlmod` extension by default uses the hermetic Python toolchain for the host platform, but you can customize the interpreter using {attr}`pip.parse.python_interpreter` and {attr}`pip.parse.python_interpreter_target`. @@ -58,10 +58,10 @@ name]`. (per-os-arch-requirements)= ## Requirements for a specific OS/Architecture -In some cases you may need to use different requirements files for different OS, Arch combinations. -This is enabled via the `requirements_by_platform` attribute in `pip.parse` extension and the -{obj}`pip.parse` tag class. The keys of the dictionary are labels to the file and the values are a -list of comma separated target (os, arch) tuples. +In some cases, you may need to use different requirements files for different OS and architecture combinations. +This is enabled via the `requirements_by_platform` attribute in the `pip.parse` extension and the +{obj}`pip.parse` tag class. The keys of the dictionary are labels to the file, and the values are a +list of comma-separated target (os, arch) tuples. For example: ```starlark @@ -77,8 +77,8 @@ For example: requirements_lock = "requirements_lock.txt", ``` -In case of duplicate platforms, `rules_python` will raise an error as there has -to be unambiguous mapping of the requirement files to the (os, arch) tuples. +In case of duplicate platforms, `rules_python` will raise an error, as there has +to be an unambiguous mapping of the requirement files to the (os, arch) tuples. 
An alternative way is to use per-OS requirement attributes. ```starlark @@ -98,24 +98,24 @@ the lock file will be evaluated against, consider using the aforementioned ## Multi-platform support -Historically the {obj}`pip_parse` and {obj}`pip.parse` have been only downloading/building +Historically, the {obj}`pip_parse` and {obj}`pip.parse` have only been downloading/building Python dependencies for the host platform that the `bazel` commands are executed on. Over -the years people started needing support for building containers and usually that involves -fetching dependencies for a particular target platform that may be other than the host +the years, people started needing support for building containers, and usually, that involves +fetching dependencies for a particular target platform that may be different from the host platform. -Multi-platform support of cross-building the wheels can be done in two ways: +Multi-platform support for cross-building the wheels can be done in two ways: 1. using {attr}`experimental_index_url` for the {bzl:obj}`pip.parse` bzlmod tag class -2. using {attr}`pip.parse.download_only` setting. +2. using the {attr}`pip.parse.download_only` setting. :::{warning} -This will not for sdists with C extensions, but pure Python sdists may still work using the first +This will not work for sdists with C extensions, but pure Python sdists may still work using the first approach. 
:::
 
 ### Using `download_only` attribute
 
-Let's say you have 2 requirements files:
+Let's say you have two requirements files:
 
 ```
 # requirements.linux_x86_64.txt
 --platform=manylinux_2_17_x86_64
@@ -151,9 +151,9 @@ pip.parse(
 )
 ```
 
-With this, the `pip.parse` will create a hub repository that is going to
-support only two platforms - `cp39_osx_aarch64` and `cp39_linux_x86_64` and it
-will only use `wheels` and ignore any sdists that it may find on the PyPI
+With this, `pip.parse` will create a hub repository that is going to
+support only two platforms - `cp39_osx_aarch64` and `cp39_linux_x86_64` - and it
+will only use `wheels` and ignore any sdists that it may find on the PyPI
 compatible indexes.
 
 :::{warning}
@@ -162,7 +162,7 @@ multiple times.
 :::
 
 :::{note}
-This will only work for wheel-only setups, i.e. all of your dependencies need to have wheels
+This will only work for wheel-only setups, i.e., all of your dependencies need to have wheels
 available on the PyPI index that you use.
 :::
 
@@ -173,9 +173,9 @@ Currently this is disabled by default, but you can turn it on using
 {envvar}`RULES_PYTHON_ENABLE_PIPSTAR` environment variable.
 :::
 
-In order to understand what dependencies to pull for a particular package
+In order to understand what dependencies to pull for a particular package,
 `rules_python` parses the `whl` file [`METADATA`][metadata].
-Packages can express dependencies via `Requires-Dist` and they can add conditions using
+Packages can express dependencies via `Requires-Dist`, and they can add conditions using
 "environment markers", which represent the Python version, OS, etc.
 
 While the PyPI integration provides reasonable defaults to support most
@@ -198,8 +198,8 @@ additional keys, which become available during dependency evaluation.
 
 ### Bazel downloader and multi-platform wheel hub repository.
 
:::{warning} -This is currently still experimental and whilst it has been proven to work in quite a few -environments, the APIs are still being finalized and there may be changes to the APIs for this +This is currently still experimental, and whilst it has been proven to work in quite a few +environments, the APIs are still being finalized, and there may be changes to the APIs for this feature without much notice. The issues that you can subscribe to for updates are: @@ -207,7 +207,7 @@ The issues that you can subscribe to for updates are: * {gh-issue}`1357` ::: -The {obj}`pip` extension supports pulling information from `PyPI` (or a compatible mirror) and it +The {obj}`pip` extension supports pulling information from `PyPI` (or a compatible mirror), and it will ensure that the [bazel downloader][bazel_downloader] is used for downloading the wheels. This provides the following benefits: @@ -222,7 +222,7 @@ To enable the feature specify {attr}`pip.parse.experimental_index_url` as shown the {gh-path}`examples/bzlmod/MODULE.bazel` example. Similar to [uv](https://docs.astral.sh/uv/configuration/indexes/), one can override the -index that is used for a single package. By default we first search in the index specified by +index that is used for a single package. By default, we first search in the index specified by {attr}`pip.parse.experimental_index_url`, then we iterate through the {attr}`pip.parse.experimental_extra_index_urls` unless there are overrides specified via {attr}`pip.parse.experimental_index_url_overrides`. @@ -235,12 +235,12 @@ Loading: 0 packages loaded ``` -This does not mean that `rules_python` is fetching the wheels eagerly, but it -rather means that it is calling the PyPI server to get the Simple API response +This does not mean that `rules_python` is fetching the wheels eagerly; rather, +it means that it is calling the PyPI server to get the Simple API response to get the list of all available source and wheel distributions. 
Once it has -got all of the available distributions, it will select the right ones depending +gotten all of the available distributions, it will select the right ones depending on the `sha256` values in your `requirements_lock.txt` file. If `sha256` hashes -are not present in the requirements file, we will fallback to matching by version +are not present in the requirements file, we will fall back to matching by version specified in the lock file. Fetching the distribution information from the PyPI allows `rules_python` to @@ -264,10 +264,10 @@ available flags: The [Bazel downloader](#bazel-downloader) usage allows for the Bazel [Credential Helper][cred-helper-design]. -Your python artifact registry may provide a credential helper for you. +Your Python artifact registry may provide a credential helper for you. Refer to your index's docs to see if one is provided. -The simplest form of a credential helper is a bash script that accepts an arg and spits out JSON to +The simplest form of a credential helper is a bash script that accepts an argument and spits out JSON to stdout. For a service like Google Artifact Registry that uses ['Basic' HTTP Auth][rfc7617] and does not provide a credential helper that conforms to the [spec][cred-helper-spec], the script might look like: @@ -285,7 +285,7 @@ echo ' }' echo '}' ``` -Configure Bazel to use this credential helper for your python index `example.com`: +Configure Bazel to use this credential helper for your Python index `example.com`: ``` # .bazelrc diff --git a/docs/pypi/index.md b/docs/pypi/index.md index c300124398..c32bafc609 100644 --- a/docs/pypi/index.md +++ b/docs/pypi/index.md @@ -3,11 +3,11 @@ # Using PyPI -Using PyPI packages (aka "pip install") involves the following main steps. +Using PyPI packages (aka "pip install") involves the following main steps: 1. [Generating requirements file](./lock) -2. Installing third party packages in [bzlmod](./download) or [WORKSPACE](./download-workspace). -3. 
[Using third party packages as dependencies](./use)
+2. Installing third-party packages in [bzlmod](./download) or [WORKSPACE](./download-workspace).
+3. [Using third-party packages as dependencies](./use)
 
 With the advanced topics covered separately:
 * Dealing with [circular dependencies](./circular-dependencies).
diff --git a/docs/pypi/lock.md b/docs/pypi/lock.md
index c9376036fb..ebb63e3b76 100644
--- a/docs/pypi/lock.md
+++ b/docs/pypi/lock.md
@@ -13,7 +13,7 @@ Currently `rules_python` only supports `requirements.txt` format.
 
 Generally, when working on a Python project, you'll have some dependencies that themselves have other dependencies. You might also specify dependency bounds instead of specific versions. So you'll need to generate a full list of all transitive dependencies and pinned versions for every dependency.
 
-Typically, you'd have your project dependencies specified in `pyproject.toml` or `requirements.in` and generate the full pinned list of dependencies in `requirements_lock.txt`, which you can manage with the {obj}`compile_pip_requirements`:
+Typically, you'd have your project dependencies specified in `pyproject.toml` or `requirements.in` and generate the full pinned list of dependencies in `requirements_lock.txt`, which you can manage with {obj}`compile_pip_requirements`:
 
 ```starlark
 load("@rules_python//python:pip.bzl", "compile_pip_requirements")
diff --git a/docs/pypi/patch.md b/docs/pypi/patch.md
index f341bd1091..7ed2bb9c60 100644
--- a/docs/pypi/patch.md
+++ b/docs/pypi/patch.md
@@ -3,8 +3,8 @@
 
 # Patching wheels
 
-Sometimes the wheels have to be patched to:
-* Workaround the lack of a standard `site-packages` layout ({gh-issue}`2156`)
-* Include certain PRs of your choice on top of wheels and avoid building from sdist,
+Sometimes, wheels have to be patched to:
+* Work around the lack of a standard `site-packages` layout ({gh-issue}`2156`).
+* Include certain PRs of your choice on top of wheels and avoid building from sdist.
 
You can patch the wheels by using the {attr}`pip.override.patches` attribute. diff --git a/docs/pypi/use.md b/docs/pypi/use.md index 7a16b7d9e9..a4bbd076bc 100644 --- a/docs/pypi/use.md +++ b/docs/pypi/use.md @@ -3,10 +3,10 @@ # Use in BUILD.bazel files -Once you have setup the dependencies, you are ready to start using them in your `BUILD.bazel` -files. If you haven't done so yet, set it up by following the following docs: +Once you have set up the dependencies, you are ready to start using them in your `BUILD.bazel` +files. If you haven't done so yet, set it up by following these docs: 1. [WORKSPACE](./download-workspace) -1. [bzlmod](./download) +2. [bzlmod](./download) To refer to targets in a hub repo `pypi`, you can do one of two things: ```starlark @@ -29,19 +29,19 @@ py_library( ) ``` -Note, that the usage of the `requirement` helper is not advised and can be problematic. See the +Note that the usage of the `requirement` helper is not advised and can be problematic. See the [notes below](#requirement-helper). -Note, that the hub repo contains the following targets for each package: -* `@pypi//numpy` which is a shorthand for `@pypi//numpy:numpy`. This is an {obj}`alias` to +Note that the hub repo contains the following targets for each package: +* `@pypi//numpy`, which is shorthand for `@pypi//numpy:numpy`. This is an {obj}`alias` to `@pypi//numpy:pkg`. * `@pypi//numpy:pkg` - the {obj}`py_library` target automatically generated by the repository rules. -* `@pypi//numpy:data` - the {obj}`filegroup` that is for all of the extra files that are included +* `@pypi//numpy:data` - the {obj}`filegroup` for all of the extra files that are included as data in the `pkg` target. -* `@pypi//numpy:dist_info` - the {obj}`filegroup` that is for all of the files in the `.distinfo` directory. -* `@pypi//numpy:whl` - the {obj}`filegroup` that is the `.whl` file itself which includes all of - the transitive dependencies via the {attr}`filegroup.data` attribute. 
+* `@pypi//numpy:dist_info` - the {obj}`filegroup` for all of the files in the `.dist-info` directory.
+* `@pypi//numpy:whl` - the {obj}`filegroup` that is the `.whl` file itself, which includes all
+  transitive dependencies via the {attr}`filegroup.data` attribute.
 
 ## Entry points
 
@@ -52,14 +52,14 @@ which can help you create a `py_binary` target for a particular console script e
 
 ## 'Extras' dependencies
 
-Any 'extras' specified in the requirements lock file will be automatically added
+Any "extras" specified in the requirements lock file will be automatically added
 as transitive dependencies of the package. In the example above, you'd just put
 `requirement("useful_dep")` or `@pypi//useful_dep`.
 
 ## Consuming Wheel Dists Directly
 
-If you need to depend on the wheel dists themselves, for instance, to pass them
-to some other packaging tool, you can get a handle to them with the
+If you need to depend on the wheel dists themselves (for instance, to pass them
+to some other packaging tool), you can get a handle to them with the
 `whl_requirement` macro. For example:
 
 ```starlark
@@ -77,7 +77,7 @@ filegroup(
 
 ## Creating a filegroup of files within a whl
 
 The rule {obj}`whl_filegroup` exists as an easy way to extract the necessary files
-from a whl file without the need to modify the `BUILD.bazel` contents of the
+from a whl file without needing to modify the `BUILD.bazel` contents of the
 whl repositories generated via `pip_repository`. Use it similarly to the `filegroup` above.
 See the API docs for more information.
 
@@ -104,16 +104,16 @@ py_library(
 )
 ```
 
-The reason `requirement()` exists is to insulate from
+The reason `requirement()` exists is to insulate users from
 changes to the underlying repository and label strings. However, those
-labels have become directly used, so aren't able to easily change regardless.
+labels have become directly used, so they can't easily be changed anyway.
-On the other hand, using `requirement()` helper has several drawbacks: +On the other hand, using the `requirement()` helper has several drawbacks: -- It doesn't work with `buildifier` -- It doesn't work with `buildozer` -- It adds extra layer on top of normal mechanisms to refer to targets. -- It does not scale well as each type of target needs a new macro to be loaded and imported. +- It doesn't work with `buildifier`. +- It doesn't work with `buildozer`. +- It adds an extra layer on top of normal mechanisms to refer to targets. +- It does not scale well, as each type of target needs a new macro to be loaded and imported. If you don't want to use `requirement()`, you can use the library labels directly instead. For `pip_parse`, the labels are of the following form: diff --git a/docs/repl.md b/docs/repl.md index edcf37e811..1434097fdf 100644 --- a/docs/repl.md +++ b/docs/repl.md @@ -1,6 +1,6 @@ # Getting a REPL or Interactive Shell -rules_python provides a REPL to help with debugging and developing. The goal of +`rules_python` provides a REPL to help with debugging and developing. The goal of the REPL is to present an environment identical to what a {bzl:obj}`py_binary` creates for your code. diff --git a/docs/support.md b/docs/support.md index 5e6de57fcb..ad943b3845 100644 --- a/docs/support.md +++ b/docs/support.md @@ -8,7 +8,7 @@ page for information on our development workflow. ## Supported rules_python Versions In general, only the latest version is supported. Backporting changes is -done on a best effort basis based on severity, risk of regressions, and +done on a best-effort basis based on severity, risk of regressions, and the willingness of volunteers. If you want or need particular functionality backported, then the best way @@ -33,24 +33,24 @@ for what versions are the rolling, active, and prior releases. ## Supported Python versions -As a general rule we test all released non-EOL Python versions. 
Different +As a general rule, we test all released non-EOL Python versions. Different interpreter versions may work but are not guaranteed. We are interested in staying compatible with upcoming unreleased versions, so if you see that things stop working, please create tickets or, more preferably, pull requests. ## Supported Platforms -We only support the platforms that our continuous integration jobs run, which -is Linux, Mac, and Windows. +We only support the platforms that our continuous integration jobs run on, which +are Linux, Mac, and Windows. -In order to better describe different support levels, the below acts as a rough +In order to better describe different support levels, the following acts as a rough guideline for different platform tiers: -* Tier 0 - The platforms that our CI runs: `linux_x86_64`, `osx_x86_64`, `RBE linux_x86_64`. -* Tier 1 - The platforms that are similar enough to what the CI runs: `linux_aarch64`, `osx_arm64`. - What is more, `windows_x86_64` is in this list as we run tests in CI but - developing for Windows is more challenging and features may come later to +* Tier 0 - The platforms that our CI runs on: `linux_x86_64`, `osx_x86_64`, `RBE linux_x86_64`. +* Tier 1 - The platforms that are similar enough to what the CI runs on: `linux_aarch64`, `osx_arm64`. + What is more, `windows_x86_64` is in this list, as we run tests in CI, but + developing for Windows is more challenging, and features may come later to this platform. -* Tier 2 - The rest of the platforms that may have varying level of support, e.g. +* Tier 2 - The rest of the platforms that may have a varying level of support, e.g., `linux_s390x`, `linux_ppc64le`, `windows_arm64`. :::{note} @@ -75,7 +75,7 @@ a series of releases to so users can still incrementally upgrade. 
See the ## Experimental Features -An experimental features is functionality that may not be ready for general +An experimental feature is functionality that may not be ready for general use and may change quickly and/or significantly. Such features are denoted in their name or API docs as "experimental". They may have breaking changes made at any time. diff --git a/docs/toolchains.md b/docs/toolchains.md index 668a458156..368c92e14b 100644 --- a/docs/toolchains.md +++ b/docs/toolchains.md @@ -4,24 +4,24 @@ (configuring-toolchains)= # Configuring Python toolchains and runtimes -This documents how to configure the Python toolchain and runtimes for different +This document explains how to configure the Python toolchain and runtimes for different use cases. ## Bzlmod MODULE configuration -How to configure `rules_python` in your MODULE.bazel file depends on how and why -you're using Python. There are 4 basic use cases: +How to configure `rules_python` in your `MODULE.bazel` file depends on how and why +you're using Python. There are four basic use cases: 1. A root module that always uses Python. For example, you're building a Python application. 2. A library module with dev-only uses of Python. For example, a Java project that only uses Python as part of testing itself. 3. A library module without version constraints. For example, a rule set with - Python build tools, but defers to the user as to what Python version is used + Python build tools, but it defers to the user as to what Python version is used for the tools. 4. A library module with version constraints. For example, a rule set with Python build tools, and the module requires a specific version of Python - be used with its tools. + to be used with its tools. 
### Root modules @@ -51,7 +51,7 @@ python.toolchain(python_version = "3.12") ### Library modules A library module is a module that can show up in arbitrary locations in the -bzlmod module graph -- it's unknown where in the breadth-first search order the +Bzlmod module graph -- it's unknown where in the breadth-first search order the module will be relative to other modules. For example, `rules_python` is a library module. @@ -84,9 +84,9 @@ used for the Python programs it runs isn't chosen by the module itself. Instead, it's up to the root module to pick an appropriate version of Python. For this case, configuration is simple: just depend on `rules_python` and use -the normal `//python:py_binary.bzl` et al rules. There is no need to call -`python.toolchain` -- rules_python ensures _some_ Python version is available, -but more often the root module will specify some version. +the normal `//python:py_binary.bzl` et al. rules. There is no need to call +`python.toolchain` -- `rules_python` ensures _some_ Python version is available, +but more often, the root module will specify some version. ``` # MODULE.bazel @@ -108,7 +108,7 @@ specific Python version be used with its tools. This has some pros/cons: * It has higher build overhead because additional runtimes and libraries need to be downloaded, and Bazel has to keep additional configuration state. -To configure this, request the Python versions needed in MODULE.bazel and use +To configure this, request the Python versions needed in `MODULE.bazel` and use the version-aware rules for `py_binary`. ``` @@ -132,7 +132,7 @@ is most useful for two cases: 1. For submodules to ensure they run with the appropriate Python version 2. To allow incremental, per-target, upgrading to newer Python versions, - typically in a mono-repo situation. + typically in a monorepo situation. 
To configure a submodule with the version-aware rules, request the particular version you need when defining the toolchain: @@ -147,7 +147,7 @@ python.toolchain( use_repo(python) ``` -Then use the `@rules_python` repo in your BUILD file to explicity pin the Python version when calling the rule: +Then use the `@rules_python` repo in your `BUILD` file to explicitly pin the Python version when calling the rule: ```starlark # BUILD.bazel @@ -202,29 +202,29 @@ The `python.toolchain()` call makes its contents available under a repo named `python_X_Y`, where X and Y are the major and minor versions. For example, `python.toolchain(python_version="3.11")` creates the repo `@python_3_11`. Remember to call `use_repo()` to make repos visible to your module: -`use_repo(python, "python_3_11")` +`use_repo(python, "python_3_11")`. :::{deprecated} 1.1.0 -The toolchain specific `py_binary` and `py_test` symbols are aliases to the regular rules. -i.e. Deprecated `load("@python_versions//3.11:defs.bzl", "py_binary")` & `load("@python_versions//3.11:defs.bzl", "py_test")` +The toolchain-specific `py_binary` and `py_test` symbols are aliases to the regular rules. +For example, `load("@python_versions//3.11:defs.bzl", "py_binary")` & `load("@python_versions//3.11:defs.bzl", "py_test")` are deprecated. -Usages of them should be changed to load the regular rules directly; -i.e. Use `load("@rules_python//python:py_binary.bzl", "py_binary")` & `load("@rules_python//python:py_test.bzl", "py_test")` and then specify the `python_version` when using the rules corresponding to the python version you defined in your toolchain. {ref}`Library modules with version constraints` +Usages of them should be changed to load the regular rules directly. +For example, use `load("@rules_python//python:py_binary.bzl", "py_binary")` & `load("@rules_python//python:py_test.bzl", "py_test")` and then specify the `python_version` when using the rules corresponding to the Python version you defined in your toolchain. 
{ref}`Library modules with version constraints` ::: #### Toolchain usage in other rules -Python toolchains can be utilized in other bazel rules, such as `genrule()`, by +Python toolchains can be utilized in other Bazel rules, such as `genrule()`, by adding the `toolchains=["@rules_python//python:current_py_toolchain"]` attribute. You can obtain the path to the Python interpreter using the `$(PYTHON2)` and `$(PYTHON3)` ["Make" Variables](https://bazel.build/reference/be/make-variables). See the {gh-path}`test_current_py_toolchain ` target -for an example. We also make available `$(PYTHON2_ROOTPATH)` and `$(PYTHON3_ROOTPATH)` +for an example. We also make available `$(PYTHON2_ROOTPATH)` and `$(PYTHON3_ROOTPATH)`, which are Make Variable equivalents of `$(PYTHON2)` and `$(PYTHON3)` but for runfiles -locations. These will be helpful if you need to set env vars of binary/test rules +locations. These will be helpful if you need to set environment variables of binary/test rules while using [`--nolegacy_external_runfiles`](https://bazel.build/reference/command-line-reference#flag--legacy_external_runfiles). The original make variables still work in exec contexts such as genrules. @@ -246,9 +246,9 @@ existing attributes: ### Registering custom runtimes Because the python-build-standalone project has _thousands_ of prebuilt runtimes -available, rules_python only includes popular runtimes in its built in +available, `rules_python` only includes popular runtimes in its built-in configurations. If you want to use a runtime that isn't already known to -rules_python then {obj}`single_version_platform_override()` can be used to do +`rules_python`, then {obj}`single_version_platform_override()` can be used to do so. In short, it allows specifying an arbitrary URL and using custom flags to control when a runtime is used. 
@@ -287,21 +287,21 @@ config_setting( ``` Notes: -- While any URL and archive can be used, it's assumed their content looks how - a python-build-standalone archive looks. -- A "version aware" toolchain is registered, which means the Python version flag - must also match (e.g. `--@rules_python//python/config_settings:python_version=3.13.3` +- While any URL and archive can be used, it's assumed their content looks like + a python-build-standalone archive. +- A "version-aware" toolchain is registered, which means the Python version flag + must also match (e.g., `--@rules_python//python/config_settings:python_version=3.13.3` must be set -- see `minor_mapping` and `is_default` for controls and docs about version matching and selection). - The `target_compatible_with` attribute can be used to entirely specify the - arg of the same name the toolchain uses. + argument of the same name that the toolchain uses. - The labels in `target_settings` must be absolute; `@@` refers to the main repo. - The `target_settings` are `config_setting` targets, which means you can customize how matching occurs. :::{seealso} -See {obj}`//python/config_settings` for flags rules_python already defines -that can be used with `target_settings`. Some particular ones of note are: +See {obj}`//python/config_settings` for flags `rules_python` already defines +that can be used with `target_settings`. Some particular ones of note are {flag}`--py_linux_libc` and {flag}`--py_freethreaded`, among others. ::: @@ -312,7 +312,7 @@ Added support for custom platform names, `target_compatible_with`, and ### Using defined toolchains from WORKSPACE -It is possible to use toolchains defined in `MODULE.bazel` in `WORKSPACE`. For example +It is possible to use toolchains defined in `MODULE.bazel` in `WORKSPACE`. 
For example, the following `MODULE.bazel` and `WORKSPACE` provides a working {bzl:obj}`pip_parse` setup: ```starlark # File: WORKSPACE @@ -343,16 +343,16 @@ python.toolchain(python_version = "3.10") use_repo(python, "python_3_10", "python_3_10_host") ``` -Note, the user has to import the `*_host` repository to use the python interpreter in the -{bzl:obj}`pip_parse` and `whl_library` repository rules and once that is done +Note, the user has to import the `*_host` repository to use the Python interpreter in the +{bzl:obj}`pip_parse` and `whl_library` repository rules, and once that is done, users should be able to ensure the setting of the default toolchain even during the transition period when some of the code is still defined in `WORKSPACE`. ## Workspace configuration -To import rules_python in your project, you first need to add it to your +To import `rules_python` in your project, you first need to add it to your `WORKSPACE` file, using the snippet provided in the -[release you choose](https://github.com/bazel-contrib/rules_python/releases) +[release you choose](https://github.com/bazel-contrib/rules_python/releases). To depend on a particular unreleased version, you can do the following: @@ -403,15 +403,15 @@ pip_parse( ``` After registration, your Python targets will use the toolchain's interpreter during execution, but a system-installed interpreter -is still used to 'bootstrap' Python targets (see https://github.com/bazel-contrib/rules_python/issues/691). +is still used to "bootstrap" Python targets (see https://github.com/bazel-contrib/rules_python/issues/691). You may also find some quirks while using this toolchain. Please refer to [python-build-standalone documentation's _Quirks_ section](https://gregoryszorc.com/docs/python-build-standalone/main/quirks.html). ## Local toolchain It's possible to use a locally installed Python runtime instead of the regular prebuilt, remotely downloaded ones. 
A local toolchain contains the Python -runtime metadata (Python version, headers, ABI flags, etc) that the regular -remotely downloaded runtimes contain, which makes it possible to build e.g. C +runtime metadata (Python version, headers, ABI flags, etc.) that the regular +remotely downloaded runtimes contain, which makes it possible to build, e.g., C extensions (unlike the autodetecting and runtime environment toolchains). For simple cases, the {obj}`local_runtime_repo` and @@ -420,10 +420,10 @@ Python installation and create an appropriate Bazel definition from it. To do this, three pieces need to be wired together: 1. Specify a path or command to a Python interpreter (multiple can be defined). -2. Create toolchains for the runtimes in (1) -3. Register the toolchains created by (2) +2. Create toolchains for the runtimes in (1). +3. Register the toolchains created by (2). -The below is an example that will use `python3` from PATH to find the +The following is an example that will use `python3` from `PATH` to find the interpreter, then introspect its installation to generate a full toolchain. ```starlark @@ -474,7 +474,7 @@ Python versions and/or platforms to be configured in a single `MODULE.bazel`. Note that `register_toolchains` will insert the local toolchain earlier in the toolchain ordering, so it will take precedence over other registered toolchains. To better control when the toolchain is used, see [Conditionally using local -toolchains] +toolchains]. ### Conditionally using local toolchains @@ -483,22 +483,22 @@ ordering, which means it will usually be used no matter what. This can be problematic for CI (where it shouldn't be used), expensive for CI (CI must initialize/download the repository to determine its Python version), and annoying for iterative development (enabling/disabling it requires modifying -MODULE.bazel). +`MODULE.bazel`). 
These behaviors can be mitigated, but it requires additional configuration -to avoid triggering the local toolchain repository to initialize (i.e. run +to avoid triggering the local toolchain repository to initialize (i.e., run local commands and perform downloads). The two settings to change are {obj}`local_runtime_toolchains_repo.target_compatible_with` and {obj}`local_runtime_toolchains_repo.target_settings`, which control how Bazel decides if a toolchain should match. By default, they point to targets *within* -the local runtime repository (trigger repo initialization). We have to override +the local runtime repository (triggering repo initialization). We have to override them to *not* reference the local runtime repository at all. In the example below, we reconfigure the local toolchains so they are only activated if the custom flag `--//:py=local` is set and the target platform -matches the Bazel host platform. The net effect is CI won't use the local +matches the Bazel host platform. The net effect is that CI won't use the local toolchain (nor initialize its repository), and developers can easily enable/disable the local toolchain with a command line flag. @@ -545,9 +545,9 @@ information about Python at build time. In particular, this means it is not able to build C extensions -- doing so requires knowing, at build time, what Python headers to use. -In effect, all it does is generate a small wrapper script that simply calls e.g. +In effect, all it does is generate a small wrapper script that simply calls, e.g., `/usr/bin/env python3` to run a program. This makes it easy to change what -Python is used to run a program, but also makes it easy to use a Python version +Python is used to run a program but also makes it easy to use a Python version that isn't compatible with build-time assumptions. ``` @@ -565,26 +565,26 @@ locally installed Python. ### Autodetecting toolchain The autodetecting toolchain is a deprecated toolchain that is built into Bazel. 
-**It's name is a bit misleading: it doesn't autodetect anything**. All it does is +**Its name is a bit misleading: it doesn't autodetect anything.** All it does is use `python3` from the environment a binary runs within. This provides extremely limited functionality to the rules (at build time, nothing is knowable about the Python runtime). Bazel itself automatically registers `@bazel_tools//tools/python:autodetecting_toolchain` -as the lowest priority toolchain. For WORKSPACE builds, if no other toolchain -is registered, that toolchain will be used. For bzlmod builds, rules_python +as the lowest priority toolchain. For `WORKSPACE` builds, if no other toolchain +is registered, that toolchain will be used. For Bzlmod builds, `rules_python` automatically registers a higher-priority toolchain; it won't be used unless there is a toolchain misconfiguration somewhere. -To aid migration off the Bazel-builtin toolchain, rules_python provides +To aid migration off the Bazel-builtin toolchain, `rules_python` provides {bzl:obj}`@rules_python//python/runtime_env_toolchains:all`. This is an equivalent -toolchain, but is implemented using rules_python's objects. +toolchain but is implemented using `rules_python`'s objects. ## Custom toolchains -While rules_python provides toolchains by default, it is not required to use +While `rules_python` provides toolchains by default, it is not required to use them, and you can define your own toolchains to use instead. This section -gives an introduction for how to define them yourself. +gives an introduction to how to define them yourself. :::{note} * Defining your own toolchains is an advanced feature. @@ -599,7 +599,7 @@ toolchains a "toolchain suite". One of the underlying design goals of the toolchains is to support complex and bespoke environments. 
Such environments may use an arbitrary combination of {bzl:obj}`RBE`, cross-platform building, multiple Python versions, -building Python from source, embeding Python (as opposed to building separate +building Python from source, embedding Python (as opposed to building separate interpreters), using prebuilt binaries, or using binaries built from source. To that end, many of the attributes they accept, and fields they provide, are optional. @@ -610,7 +610,7 @@ The target toolchain type is {obj}`//python:toolchain_type`, and it is for _target configuration_ runtime information, e.g., the Python version and interpreter binary that a program will use. -The is typically implemented using {obj}`py_runtime()`, which +This is typically implemented using {obj}`py_runtime()`, which provides the {obj}`PyRuntimeInfo` provider. For historical reasons from the Python 2 transition, `py_runtime` is wrapped in {obj}`py_runtime_pair`, which provides {obj}`ToolchainInfo` with the field `py3_runtime`, which is an @@ -625,7 +625,7 @@ set {external:bzl:obj}`toolchain.exec_compatible_with`. ### Python C toolchain type The Python C toolchain type ("py cc") is {obj}`//python/cc:toolchain_type`, and -it has C/C++ information for the _target configuration_, e.g. the C headers that +it has C/C++ information for the _target configuration_, e.g., the C headers that provide `Python.h`. This is typically implemented using {obj}`py_cc_toolchain()`, which provides @@ -642,7 +642,7 @@ set {external:bzl:obj}`toolchain.exec_compatible_with`. ### Exec tools toolchain type The exec tools toolchain type is {obj}`//python:exec_tools_toolchain_type`, -and it is for supporting tools for _building_ programs, e.g. the binary to +and it is for supporting tools for _building_ programs, e.g., the binary to precompile code at build time. This toolchain type is intended to hold only _exec configuration_ values -- @@ -661,7 +661,7 @@ target configuration (e.g. 
Python version), then for one to be chosen based on finding one compatible with the available host platforms to run the tool on. However, what `target_compatible_with`/`target_settings` and -`exec_compatible_with` values to use depend on details of the tools being used. +`exec_compatible_with` values to use depends on the details of the tools being used. For example: * If you had a precompiler that supported any version of Python, then putting the Python version in `target_settings` is unnecessary. @@ -672,9 +672,9 @@ This can work because, when the rules invoke these build tools, they pass along all necessary information so that the tool can be entirely independent of the target configuration being built for. -Alternatively, if you had a precompiler that only ran on linux, and only -produced valid output for programs intended to run on linux, then _both_ -`exec_compatible_with` and `target_compatible_with` must be set to linux. +Alternatively, if you had a precompiler that only ran on Linux and only +produced valid output for programs intended to run on Linux, then _both_ +`exec_compatible_with` and `target_compatible_with` must be set to Linux. 
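The contrast above can be sketched as two `toolchain()` registrations. This is a hedged illustration only: the target names and the `:my_precompiler_impl` label are hypothetical, not part of `rules_python`; the toolchain type label comes from the section above.

```starlark
# Sketch only; ":my_precompiler_impl" is a hypothetical implementation target.

# A precompiler that supports any Python version: no Python-version
# constraint is needed, so target_settings is omitted entirely.
toolchain(
    name = "any_version_precompiler",
    toolchain = ":my_precompiler_impl",
    toolchain_type = "@rules_python//python:exec_tools_toolchain_type",
)

# A precompiler that only runs on Linux and only produces output valid for
# Linux: constrain both where it runs (exec) and what it builds for (target).
toolchain(
    name = "linux_only_precompiler",
    toolchain = ":my_precompiler_impl",
    toolchain_type = "@rules_python//python:exec_tools_toolchain_type",
    exec_compatible_with = ["@platforms//os:linux"],
    target_compatible_with = ["@platforms//os:linux"],
)
```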
### Custom toolchain example

@@ -684,9 +684,9 @@ Here, we show an example for a semi-complicated toolchain suite, one that is:
 * For Python version 3.12.0
 * Using an in-build interpreter built from source
 * That only runs on Linux
-* Using a prebuilt precompiler that only runs on Linux, and only produces byte
-  code valid for 3.12
-* With the exec tools interpreter disabled (unnecessary with a prebuild
+* Using a prebuilt precompiler that only runs on Linux and only produces
+  bytecode valid for 3.12
+* With the exec tools interpreter disabled (unnecessary with a prebuilt
   precompiler)
 * Providing C headers and libraries
 
@@ -748,13 +748,13 @@ toolchain(
     name = "runtime_toolchain",
     toolchain = "//toolchain_impl:runtime_pair",
     toolchain_type = "@rules_python//python:toolchain_type",
-    target_compatible_with = ["@platforms/os:linux"]
+    target_compatible_with = ["@platforms//os:linux"],
 )
 toolchain(
     name = "py_cc_toolchain",
     toolchain = "//toolchain_impl:py_cc_toolchain_impl",
     toolchain_type = "@rules_python//python/cc:toolchain_type",
-    target_compatible_with = ["@platforms/os:linux"]
+    target_compatible_with = ["@platforms//os:linux"],
 )
 
 toolchain(
@@ -764,19 +764,19 @@ toolchain(
     target_settings = [
         "@rules_python//python/config_settings:is_python_3.12",
     ],
-    exec_comaptible_with = ["@platforms/os:linux"]
+    exec_compatible_with = ["@platforms//os:linux"],
 )
 
 # -----------------------------------------------
 # File: MODULE.bazel or WORKSPACE.bazel
-# These toolchains will considered before others.
+# These toolchains will be considered before others.
 # -----------------------------------------------
 register_toolchains("//toolchains:all")
 ```
 
-When registering custom toolchains, be aware of the the [toolchain registration
+When registering custom toolchains, be aware of the [toolchain registration
 order](https://bazel.build/extending/toolchains#toolchain-resolution).
In brief, -toolchain order is the BFS-order of the modules; see the bazel docs for a more +toolchain order is the BFS-order of the modules; see the Bazel docs for a more detailed description. :::{note} @@ -796,7 +796,7 @@ Currently the following flags are used to influence toolchain selection: To run the interpreter that Bazel will use, you can use the `@rules_python//python/bin:python` target. This is a binary target with -the executable pointing at the `python3` binary plus its relevent runfiles. +the executable pointing at the `python3` binary plus its relevant runfiles. ```console $ bazel run @rules_python//python/bin:python @@ -838,7 +838,7 @@ targets on its own. Please file a feature request if this is desired. The `//python/bin:python` target provides access to the underlying interpreter without any hermeticity guarantees. -The [`//python/bin:repl` target](repl) provides an environment indentical to +The [`//python/bin:repl` target](repl) provides an environment identical to what `py_binary` provides. That means it handles things like the [`PYTHONSAFEPATH`](https://docs.python.org/3/using/cmdline.html#envvar-PYTHONSAFEPATH) environment variable automatically. The `//python/bin:python` target will not. From 1ddb5f62be3b5968293ec5c63adf0a3beb9fc8c1 Mon Sep 17 00:00:00 2001 From: Richard Levasseur Date: Thu, 19 Jun 2025 21:19:41 -0700 Subject: [PATCH 2/5] copy edit a bit --- docs/_includes/py_console_script_binary.md | 11 ++++++----- docs/pypi/circular-dependencies.md | 4 ++-- 2 files changed, 8 insertions(+), 7 deletions(-) diff --git a/docs/_includes/py_console_script_binary.md b/docs/_includes/py_console_script_binary.md index 08d931773d..cae9f9f2f5 100644 --- a/docs/_includes/py_console_script_binary.md +++ b/docs/_includes/py_console_script_binary.md @@ -2,7 +2,7 @@ This rule is to make it easier to generate `console_script` entry points as per Python [specification]. 
Generate a `py_binary` target for a particular `console_script` entry_point -from a PyPI package. For example, to create an executable `pylint` target, use: +from a PyPI package, e.g. for creating an executable `pylint` target, use: ```starlark load("@rules_python//python/entry_points:py_console_script_binary.bzl", "py_console_script_binary") @@ -14,9 +14,10 @@ py_console_script_binary( #### Specifying extra dependencies You can also specify extra dependencies and the -exact script name you want to call. This is useful for tools like `flake8`, `pylint`, -and `pytest`, which have plugin discovery methods and discover dependencies from the -PyPI packages available in the `PYTHONPATH`. +exact script name you want to call. This is useful for tools like `flake8`, +`pylint`, and `pytest`, which have plugin discovery methods and discover +dependencies from the PyPI packages available in the `PYTHONPATH`. + ```starlark load("@rules_python//python/entry_points:py_console_script_binary.bzl", "py_console_script_binary") @@ -75,7 +76,7 @@ For example, `load("@python_versions//3.11:defs.bzl", "py_binary")` and `load("@ You should instead specify the desired Python version with `python_version`; see the example above. ::: -Alternatively, the `py_console_script_binary.binary_rule` arg can be passed +Alternatively, the {obj}`py_console_script_binary.binary_rule` arg can be passed the version-bound `py_binary` symbol, or any other `py_binary`-compatible rule of your choosing: ```starlark diff --git a/docs/pypi/circular-dependencies.md b/docs/pypi/circular-dependencies.md index 797b880562..62613f489e 100644 --- a/docs/pypi/circular-dependencies.md +++ b/docs/pypi/circular-dependencies.md @@ -6,7 +6,7 @@ Sometimes PyPI packages contain dependency cycles. For instance, a particular version of `sphinx` (this is no longer the case in the latest version as of 2024-06-02) depends on `sphinxcontrib-serializinghtml`. 
When using them as
-`requirement()`s, like so:
+`requirement()`s, à la

```starlark
py_binary(
@@ -47,7 +47,7 @@ simultaneously.
 )
 ```
 
-`pip_parse` supports fixing multiple cycles simultaneously; however, cycles must
+`pip_parse` supports fixing multiple cycles simultaneously, but cycles must
 be distinct. `apache-airflow`, for instance, has dependency cycles with a
 number of its optional dependencies, which means those optional dependencies must
 all be a part of the `airflow` cycle. For instance:

From 067b1396d4f8ab3ec37dca312eeaaeb50a7c464c Mon Sep 17 00:00:00 2001
From: Richard Levasseur
Date: Thu, 19 Jun 2025 21:50:54 -0700
Subject: [PATCH 3/5] more copy editing

---
 docs/coverage.md              | 2 +-
 docs/environment-variables.md | 6 +++---
 2 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/docs/coverage.md b/docs/coverage.md
index 023173d396..3c7d9e0cfc 100644
--- a/docs/coverage.md
+++ b/docs/coverage.md
@@ -32,7 +32,7 @@ python_register_toolchains(
 This will implicitly add the version of `coverage` bundled with `rules_python`
 to the dependencies of `py_test` rules when `bazel coverage` is run. If a target
 already transitively depends on a different version of
- `coverage`, then the behavior is undefined -- it is undefined which version comes
+`coverage`, then the behavior is undefined -- it is undefined which version comes
 first in the import path. If you find yourself in this situation, then you'll
 need to manually configure coverage (see below).
 :::
diff --git a/docs/environment-variables.md b/docs/environment-variables.md
index 8fa0a82321..9a8c1dfe99 100644
--- a/docs/environment-variables.md
+++ b/docs/environment-variables.md
@@ -38,13 +38,13 @@ stderr.
 When `1`, bzlmod extensions will print debug information about what they're
 doing. This is mostly useful for development to debug errors.
-::::
+:::

:::{envvar} RULES_PYTHON_DEPRECATION_WARNINGS

When `1`, `rules_python` will warn users about deprecated functionality that will
be removed in a subsequent major `rules_python` version. Defaults to `0` if unset.
-::::
+:::

::::{envvar} RULES_PYTHON_ENABLE_PYSTAR

@@ -131,7 +131,7 @@ Replace the `VERSION_OS_ARCH` part with actual values when using, e.g.,
 `3_13_0_linux_x86_64`. The version values must have `_` instead of `.` and the
 os, arch values are the same as the ones mentioned in the `//python:versions.bzl`
 file.
-::::
+:::

:::{envvar} VERBOSE_COVERAGE

From cbb7c9d7799d7e2a28affe36fef5a052640cb5bc Mon Sep 17 00:00:00 2001
From: Richard Levasseur
Date: Thu, 19 Jun 2025 21:51:18 -0700
Subject: [PATCH 4/5] more copy editing

---
 docs/precompiling.md | 2 +-
 docs/pypi/lock.md    | 11 ++++++++---
 docs/pypi/patch.md   | 2 +-
 docs/pypi/use.md     | 2 +-
 4 files changed, 11 insertions(+), 6 deletions(-)

diff --git a/docs/precompiling.md b/docs/precompiling.md
index 8fbee46c17..ea978cddce 100644
--- a/docs/precompiling.md
+++ b/docs/precompiling.md
@@ -13,7 +13,7 @@ While precompiling helps runtime performance, it has two main costs:
 1. Increasing the size (count and disk usage) of runfiles. It approximately
    doubles the count of the runfiles because for every `.py` file, there is also
    a `.pyc` file. Compiled files are generally around the same size as the
-   source files, so it approximately doubles disk usage.
+   source files, so it approximately doubles the disk usage.
 2. Precompiling requires running an extra action at build time. While
    compiling itself isn't that expensive, the overhead can become noticeable
    as more files need to be compiled.
diff --git a/docs/pypi/lock.md b/docs/pypi/lock.md
index ebb63e3b76..db557fe594 100644
--- a/docs/pypi/lock.md
+++ b/docs/pypi/lock.md
@@ -11,9 +11,14 @@ Currently `rules_python` only supports `requirements.txt` format.
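For illustration, entries in a pinned `requirements.txt`-format lock file generally look like the following (the package names, versions, and elided `<...>` hash values are placeholders, not real entries):

```
# requirements_lock.txt (illustrative)
sphinx==7.2.6 \
    --hash=sha256:<hash>
    # via -r requirements.in
sphinxcontrib-serializinghtml==1.1.10 \
    --hash=sha256:<hash>
    # via sphinx
```

Each transitive dependency appears with an exact `==` pin; pip-compile-style tools can also record hashes and the `# via` provenance comments shown here.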
### pip compile -Generally, when working on a Python project, you'll have some dependencies that themselves have other dependencies. You might also specify dependency bounds instead of specific versions. So you'll need to generate a full list of all transitive dependencies and pinned versions for every dependency. - -Typically, you'd have your project dependencies specified in `pyproject.toml` or `requirements.in` and generate the full pinned list of dependencies in `requirements_lock.txt`, which you can manage with {obj}`compile_pip_requirements`: +Generally, when working on a Python project, you'll have some dependencies that themselves have +other dependencies. You might also specify dependency bounds instead of specific versions. +So you'll need to generate a full list of all transitive dependencies and pinned versions +for every dependency. + +Typically, you'd have your project dependencies specified in `pyproject.toml` or `requirements.in` +and generate the full pinned list of dependencies in `requirements_lock.txt`, which you can +manage with {obj}`compile_pip_requirements`: ```starlark load("@rules_python//python:pip.bzl", "compile_pip_requirements") diff --git a/docs/pypi/patch.md b/docs/pypi/patch.md index 7ed2bb9c60..7e3cb41981 100644 --- a/docs/pypi/patch.md +++ b/docs/pypi/patch.md @@ -3,7 +3,7 @@ # Patching wheels -Sometimes, wheels have to be patched to: +Sometimes the wheels have to be patched to: * Workaround the lack of a standard `site-packages` layout ({gh-issue}`2156`). * Include certain PRs of your choice on top of wheels and avoid building from sdist. diff --git a/docs/pypi/use.md b/docs/pypi/use.md index a4bbd076bc..6212097f86 100644 --- a/docs/pypi/use.md +++ b/docs/pypi/use.md @@ -33,7 +33,7 @@ Note that the usage of the `requirement` helper is not advised and can be proble [notes below](#requirement-helper). 
Note that the hub repo contains the following targets for each package:
-* `@pypi//numpy`, which is shorthand for `@pypi//numpy:numpy`. This is an {obj}`alias` to
+* `@pypi//numpy` - shorthand for `@pypi//numpy:numpy`. This is an {obj}`alias` to
 `@pypi//numpy:pkg`.
 * `@pypi//numpy:pkg` - the {obj}`py_library` target automatically generated by the
 repository rules.

From 3b7e832c91bb8e34700fd9fa5399eb54cc2125de Mon Sep 17 00:00:00 2001
From: Richard Levasseur
Date: Thu, 19 Jun 2025 21:51:38 -0700
Subject: [PATCH 5/5] more copy editing

---
 docs/toolchains.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/toolchains.md b/docs/toolchains.md
index 368c92e14b..de819cb515 100644
--- a/docs/toolchains.md
+++ b/docs/toolchains.md
@@ -17,11 +17,11 @@ you're using Python. There are four basic use cases:
 2. A library module with dev-only uses of Python. For example, a Java project
    that only uses Python as part of testing itself.
 3. A library module without version constraints. For example, a rule set with
-   Python build tools, but it defers to the user as to what Python version is used
+   Python build tools, but one that defers to the user as to what Python version is used
    for the tools.
 4. A library module with version constraints. For example, a rule set with
    Python build tools, and the module requires a specific version of Python
-   to be used with its tools.
+   be used with its tools.
 
 ### Root modules