16 changes: 8 additions & 8 deletions docs/README.md
@@ -1,14 +1,14 @@
# rules_python Sphinx docs generation

The docs for rules_python are generated using a combination of Sphinx, Bazel,
and Readthedocs.org. The Markdown files in source control are unlikely to render
and Read the Docs. The Markdown files in source control are unlikely to render
properly without the Sphinx processing step because they rely on Sphinx and
MyST-specific Markdown functionality.

The actual sources that Sphinx consumes are in this directory, with Stardoc
generating additional sources or Sphinx.
generating additional sources for Sphinx.

Manually building the docs isn't necessary -- readthedocs.org will
Manually building the docs isn't necessary -- Read the Docs will
automatically build and deploy them when commits are pushed to the repo.

## Generating docs for development
@@ -31,8 +31,8 @@ equivalent bazel command if desired.
### Installing ibazel

The `ibazel` tool can be used to automatically rebuild the docs as you
development them. See the [ibazel docs](https://github.com/bazelbuild/bazel-watcher) for
how to install it. The quick start for linux is:
develop them. See the [ibazel docs](https://github.com/bazelbuild/bazel-watcher) for
how to install it. The quick start for Linux is:

```
sudo apt install npm
@@ -57,9 +57,9 @@ docs/.
The Sphinx configuration is `docs/conf.py`. See
https://www.sphinx-doc.org/ for details about the configuration file.

## Readthedocs configuration
## Read the Docs configuration

There's two basic parts to the readthedocs configuration:
There's two basic parts to the Read the Docs configuration:

* `.readthedocs.yaml`: This configuration file controls most settings, such as
the OS version used to build, Python version, dependencies, what Bazel
@@ -69,4 +69,4 @@ There's two basic parts to the readthedocs configuration:
controls additional settings such as permissions, what versions are
published, when to publish changes, etc.

For more readthedocs configuration details, see docs.readthedocs.io.
For more Read the Docs configuration details, see docs.readthedocs.io.
25 changes: 13 additions & 12 deletions docs/_includes/py_console_script_binary.md
@@ -1,8 +1,8 @@
This rule is to make it easier to generate `console_script` entry points
as per Python [specification].

Generate a `py_binary` target for a particular console_script `entry_point`
from a PyPI package, e.g. for creating an executable `pylint` target use:
Generate a `py_binary` target for a particular `console_script` entry_point
from a PyPI package, e.g. for creating an executable `pylint` target, use:
```starlark
load("@rules_python//python/entry_points:py_console_script_binary.bzl", "py_console_script_binary")

@@ -12,11 +12,12 @@ py_console_script_binary(
)
```

#### Specifying extra dependencies
#### Specifying extra dependencies
You can also specify extra dependencies and the
exact script name you want to call. It is useful for tools like `flake8`, `pylint`,
`pytest`, which have plugin discovery methods and discover dependencies from the
PyPI packages available in the `PYTHONPATH`.
exact script name you want to call. This is useful for tools like `flake8`,
`pylint`, and `pytest`, which have plugin discovery methods and discover
dependencies from the PyPI packages available in the `PYTHONPATH`.

```starlark
load("@rules_python//python/entry_points:py_console_script_binary.bzl", "py_console_script_binary")

@@ -44,13 +45,13 @@ load("@rules_python//python/entry_points:py_console_script_binary.bzl", "py_cons
py_console_script_binary(
name = "yamllint",
pkg = "@pip//yamllint",
python_version = "3.9"
python_version = "3.9",
)
```

#### Adding a Shebang Line

You can specify a shebang line for the generated binary, useful for Unix-like
You can specify a shebang line for the generated binary. This is useful for Unix-like
systems where the shebang line determines which interpreter is used to execute
the script, per [PEP441]:
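
The example that normally follows is elided in this diff. As a rough, hypothetical sketch (the `shebang` attribute name is an assumption, not taken from this excerpt):

```starlark
load("@rules_python//python/entry_points:py_console_script_binary.bzl", "py_console_script_binary")

# Hypothetical sketch: generate a binary whose stub script starts with the
# given shebang line. The `shebang` attribute name is assumed here.
py_console_script_binary(
    name = "black",
    pkg = "@pip//black",
    shebang = "#!/usr/bin/env python3",
)
```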

@@ -70,12 +71,12 @@ Python interpreter is available in the environment.

#### Using a specific Python Version directly from a Toolchain
:::{deprecated} 1.1.0
The toolchain specific `py_binary` and `py_test` symbols are aliases to the regular rules.
i.e. Deprecated `load("@python_versions//3.11:defs.bzl", "py_binary")` and `load("@python_versions//3.11:defs.bzl", "py_test")`
The toolchain-specific `py_binary` and `py_test` symbols are aliases to the regular rules.
For example, `load("@python_versions//3.11:defs.bzl", "py_binary")` and `load("@python_versions//3.11:defs.bzl", "py_test")` are deprecated.

You should instead specify the desired python version with `python_version`; see above example.
You should instead specify the desired Python version with `python_version`; see the example above.
:::
Alternatively, the [`py_console_script_binary.binary_rule`] arg can be passed
Alternatively, the {obj}`py_console_script_binary.binary_rule` arg can be passed
the version-bound `py_binary` symbol, or any other `py_binary`-compatible rule
of your choosing:
```starlark
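# The original example is truncated in this diff. A hypothetical sketch of
# passing a version-bound `py_binary` through the `binary_rule` arg (load
# paths and names below are illustrative, not taken from this excerpt):
load("@python_versions//3.10:defs.bzl", "py_binary")
load("@rules_python//python/entry_points:py_console_script_binary.bzl", "py_console_script_binary")

py_console_script_binary(
    name = "yamllint",
    pkg = "@pip//yamllint",
    binary_rule = py_binary,
)
```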
4 changes: 2 additions & 2 deletions docs/coverage.md
@@ -9,7 +9,7 @@ when configuring toolchains.
## Enabling `rules_python` coverage support

Enabling the coverage support bundled with `rules_python` just requires setting an
argument when registerting toolchains.
argument when registering toolchains.

For Bzlmod:
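
The Bzlmod snippet itself is elided in this diff. A minimal, hypothetical sketch of what enabling it looks like (the `configure_coverage_tool` attribute name is an assumption, not taken from this excerpt):

```starlark
# MODULE.bazel -- hypothetical sketch
python = use_extension("@rules_python//python/extensions:python.bzl", "python")
python.toolchain(
    python_version = "3.11",
    # Assumed attribute: bundles rules_python's coverage tool with the toolchain.
    configure_coverage_tool = True,
)
```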

@@ -32,7 +32,7 @@ python_register_toolchains(
This will implicitly add the version of `coverage` bundled with
`rules_python` to the dependencies of `py_test` rules when `bazel coverage` is
run. If a target already transitively depends on a different version of
`coverage`, then behavior is undefined -- it is undefined which version comes
`coverage`, then the behavior is undefined -- it is undefined which version comes
first in the import path. If you find yourself in this situation, then you'll
need to manually configure coverage (see below).
:::
28 changes: 14 additions & 14 deletions docs/devguide.md
@@ -1,7 +1,7 @@
# Dev Guide

This document covers tips and guidance for working on the rules_python code
base. A primary audience for it is first time contributors.
This document covers tips and guidance for working on the `rules_python` code
base. Its primary audience is first-time contributors.

## Running tests

@@ -12,8 +12,8 @@ bazel test //...
```

And it will run all the tests it can find. The first time you do this, it will
probably take long time because various dependencies will need to be downloaded
and setup. Subsequent runs will be faster, but there are many tests, and some of
probably take a long time because various dependencies will need to be downloaded
and set up. Subsequent runs will be faster, but there are many tests, and some of
them are slow. If you're working on a particular area of code, you can run just
the tests in those directories instead, which can speed up your edit-run cycle.

@@ -22,14 +22,14 @@ the tests in those directories instead, which can speed up your edit-run cycle.
Most code should have tests of some sort. This helps us have confidence that
refactors didn't break anything and that releases won't have regressions.

We don't require 100% test coverage, testing certain Bazel functionality is
We don't require 100% test coverage; testing certain Bazel functionality is
difficult, and some edge cases are simply too hard to test or not worth the
extra complexity. We try to judiciously decide when not having tests is a good
idea.

Tests go under `tests/`. They are loosely organized into directories for the
particular subsystem or functionality they are testing. If an existing directory
doesn't seem like a good match for the functionality being testing, then it's
doesn't seem like a good match for the functionality being tested, then it's
fine to create a new directory.

Re-usable test helpers and support code go in `tests/support`. Tests don't need
@@ -72,9 +72,9 @@ the rule. To have it support setting a new flag:

An integration test is one that runs a separate Bazel instance inside the test.
These tests are discouraged unless absolutely necessary because they are slow,
require much memory and CPU, and are generally harder to debug. Integration
tests are reserved for things that simple can't be tested otherwise, or for
simple high level verification tests.
require a lot of memory and CPU, and are generally harder to debug. Integration
tests are reserved for things that simply can't be tested otherwise, or for
simple high-level verification tests.

Integration tests live in `tests/integration`. When possible, add to an existing
integration test.
@@ -98,9 +98,9 @@ integration test.

## Updating tool dependencies

It's suggested to routinely update the tool versions within our repo - some of the
tools are using requirement files compiled by `uv` and others use other means. In order
to have everything self-documented, we have a special target -
`//private:requirements.update`, which uses `rules_multirun` to run in sequence all
of the requirement updating scripts in one go. This can be done once per release as
It's suggested to routinely update the tool versions within our repo. Some of the
tools are using requirement files compiled by `uv`, and others use other means. In order
to have everything self-documented, we have a special target,
`//private:requirements.update`, which uses `rules_multirun` to run all
of the requirement-updating scripts in sequence in one go. This can be done once per release as
we prepare for releases.
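
As a usage sketch (assuming the target is invoked directly with `bazel run`; the exact invocation may differ):

```
bazel run //private:requirements.update
```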
24 changes: 12 additions & 12 deletions docs/environment-variables.md
@@ -5,16 +5,16 @@
This variable allows for additional arguments to be provided to the Python interpreter
at bootstrap time when the `bash` bootstrap is used. If
`RULES_PYTHON_ADDITIONAL_INTERPRETER_ARGS` were provided as `-Xaaa`, then the command
would be;
would be:

```
python -Xaaa /path/to/file.py
```

This feature is likely to be useful for the integration of debuggers. For example,
it would be possible to configure the `RULES_PYTHON_ADDITIONAL_INTERPRETER_ARGS` to
be set to `/path/to/debugger.py --port 12344 --file` resulting
in the command executed being;
it would be possible to configure `RULES_PYTHON_ADDITIONAL_INTERPRETER_ARGS` to
be set to `/path/to/debugger.py --port 12344 --file`, resulting
in the command executed being:

```
python /path/to/debugger.py --port 12345 --file /path/to/file.py
@@ -42,14 +42,14 @@ doing. This is mostly useful for development to debug errors.

:::{envvar} RULES_PYTHON_DEPRECATION_WARNINGS

When `1`, the rules_python will warn users about deprecated functionality that will
When `1`, `rules_python` will warn users about deprecated functionality that will
be removed in a subsequent major `rules_python` version. Defaults to `0` if unset.
:::

::::{envvar} RULES_PYTHON_ENABLE_PYSTAR

When `1`, the rules_python Starlark implementation of the core rules is used
instead of the Bazel-builtin rules. Note this requires Bazel 7+. Defaults
When `1`, the `rules_python` Starlark implementation of the core rules is used
instead of the Bazel-builtin rules. Note that this requires Bazel 7+. Defaults
to `1`.

:::{versionadded} 0.26.0
@@ -62,7 +62,7 @@ The default became `1` if unspecified

::::{envvar} RULES_PYTHON_ENABLE_PIPSTAR

When `1`, the rules_python Starlark implementation of the pypi/pip integration is used
When `1`, the `rules_python` Starlark implementation of the PyPI/pip integration is used
instead of the legacy Python scripts.

:::{versionadded} 1.5.0
@@ -95,8 +95,8 @@ exit.

:::{envvar} RULES_PYTHON_GAZELLE_VERBOSE

When `1`, debug information from gazelle is printed to stderr.
:::
When `1`, debug information from Gazelle is printed to stderr.
::::

:::{envvar} RULES_PYTHON_PIP_ISOLATED

@@ -125,9 +125,9 @@ Determines the verbosity of logging output for repo rules. Valid values:

:::{envvar} RULES_PYTHON_REPO_TOOLCHAIN_VERSION_OS_ARCH

Determines the python interpreter platform to be used for a particular
Determines the Python interpreter platform to be used for a particular
interpreter `(version, os, arch)` triple to be used in repository rules.
Replace the `VERSION_OS_ARCH` part with actual values when using, e.g.
Replace the `VERSION_OS_ARCH` part with actual values when using, e.g.,
`3_13_0_linux_x86_64`. The version values must have `_` instead of `.` and the
os, arch values are the same as the ones mentioned in the
`//python:versions.bzl` file.
12 changes: 6 additions & 6 deletions docs/extending.md
@@ -41,10 +41,10 @@ wrappers around the keyword arguments eventually passed to the `rule()`
function. These builder APIs give access to the _entire_ rule definition and
allow arbitrary modifications.

This is level of control is powerful, but also volatile. A rule definition
This level of control is powerful but also volatile. A rule definition
contains many details that _must_ change as the implementation changes. What
is more or less likely to change isn't known in advance, but some general
rules are:
rules of thumb are:

* Additive behavior to public attributes will be less prone to breaking.
* Internal attributes that directly support a public attribute are likely
@@ -55,7 +55,7 @@ rules are:

## Example: validating a source file

In this example, we derive from `py_library` a custom rule that verifies source
In this example, we derive a custom rule from `py_library` that verifies source
code contains the word "snakes". It does this by:

* Adding an implicit dependency on a checker program
@@ -111,7 +111,7 @@ has_snakes_library = create_has_snakes_rule()

## Example: adding transitions

In this example, we derive from `py_binary` to force building for a particular
In this example, we derive a custom rule from `py_binary` to force building for a particular
platform. We do this by:

* Adding an additional output to the rule's cfg
Expand All @@ -136,8 +136,8 @@ def create_rule():
r.cfg.add_output("//command_line_option:platforms")
return r.build()

py_linux_binary = create_linux_binary_rule()
py_linux_binary = create_rule()
```

Users can then use `py_linux_binary` the same as a regular py_binary. It will
Users can then use `py_linux_binary` the same as a regular `py_binary`. It will
act as if `--platforms=//my/platforms:linux` was specified when building it.
4 changes: 2 additions & 2 deletions docs/gazelle.md
@@ -3,7 +3,7 @@
[Gazelle](https://github.com/bazelbuild/bazel-gazelle)
is a build file generator for Bazel projects. It can create new `BUILD.bazel` files for a project that follows language conventions and update existing build files to include new sources, dependencies, and options.

Bazel may run Gazelle using the Gazelle rule, or it may be installed and run as a command line tool.
Bazel may run Gazelle using the Gazelle rule, or Gazelle may be installed and run as a command line tool.

See the documentation for Gazelle with rules_python in the {gh-path}`gazelle`
See the documentation for Gazelle with `rules_python` in the {gh-path}`gazelle`
directory.
10 changes: 5 additions & 5 deletions docs/getting-started.md
@@ -1,14 +1,14 @@
# Getting started

This doc is a simplified guide to help get started quickly. It provides
This document is a simplified guide to help you get started quickly. It provides
a simplified introduction to having a working Python program for both `bzlmod`
and the older way of using `WORKSPACE`.

It assumes you have a `requirements.txt` file with your PyPI dependencies.

For more details information about configuring `rules_python`, see:
For more detailed information about configuring `rules_python`, see:
* [Configuring the runtime](configuring-toolchains)
* [Configuring third party dependencies (pip/pypi)](./pypi/index)
* [Configuring third-party dependencies (pip/PyPI)](./pypi/index)
* [API docs](api/index)

## Including dependencies
@@ -32,7 +32,7 @@ use_repo(pip, "pypi")

### Using a WORKSPACE file

Using WORKSPACE is deprecated, but still supported, and a bit more involved than
Using `WORKSPACE` is deprecated but still supported, and it's a bit more involved than
using Bzlmod. Here is a simplified setup to download the prebuilt runtimes.

```starlark
@@ -72,7 +72,7 @@ pip_parse(

## "Hello World"

Once you've imported the rule set using either Bzlmod or WORKSPACE, you can then
Once you've imported the rule set using either Bzlmod or `WORKSPACE`, you can then
load the core rules in your `BUILD` files with the following:

```starlark
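# The original snippet is truncated in this diff. A hypothetical sketch of a
# minimal BUILD file (the exact load path in the real doc may differ):
load("@rules_python//python:defs.bzl", "py_binary")

py_binary(
    name = "main",
    srcs = ["main.py"],
)
```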
10 changes: 5 additions & 5 deletions docs/glossary.md
@@ -5,7 +5,7 @@
common attributes
: Every rule has a set of common attributes. See Bazel's
[Common attributes](https://bazel.build/reference/be/common-definitions#common-attributes)
for a complete listing
for a complete listing.

in-build runtime
: An in-build runtime is one where the Python runtime, and all its files, are
@@ -21,9 +21,9 @@ which can be a significant number of files.

platform runtime
: A platform runtime is a Python runtime that is assumed to be installed on the
system where a Python binary runs, whereever that may be. For example, using `/usr/bin/python3`
system where a Python binary runs, wherever that may be. For example, using `/usr/bin/python3`
as the interpreter is a platform runtime -- it assumes that, wherever the binary
runs (your local machine, a remote worker, within a container, etc), that path
runs (your local machine, a remote worker, within a container, etc.), that path
is available. Such runtimes are _not_ part of a binary's runfiles.

The main advantage of platform runtimes is they are lightweight insofar as
@@ -42,8 +42,8 @@ rule callable
accepted; refer to the respective API accepting this type.

simple label
: A `str` or `Label` object but not a _direct_ `select` object. These usually
mean a string manipulation is occuring, which can't be done on `select`
A `str` or `Label` object but not a _direct_ `select` object. This usually
means a string manipulation is occurring, which can't be done on `select`
objects. Such attributes are usually still configurable if an alias is used,
and a reference to the alias is passed instead.
