diff --git a/tools/third_party/attrs/.git_archival.txt b/tools/third_party/attrs/.git_archival.txt index 8fb235d7045be0..7c5100942aae48 100644 --- a/tools/third_party/attrs/.git_archival.txt +++ b/tools/third_party/attrs/.git_archival.txt @@ -1,4 +1,3 @@ node: $Format:%H$ node-date: $Format:%cI$ describe-name: $Format:%(describe:tags=true,match=*[0-9]*)$ -ref-names: $Format:%D$ diff --git a/tools/third_party/attrs/.github/CODE_OF_CONDUCT.md b/tools/third_party/attrs/.github/CODE_OF_CONDUCT.md index 1d8ad1833e7e99..348412b12b7ba5 100644 --- a/tools/third_party/attrs/.github/CODE_OF_CONDUCT.md +++ b/tools/third_party/attrs/.github/CODE_OF_CONDUCT.md @@ -6,7 +6,7 @@ We as members, contributors, and leaders pledge to make participation in our community a harassment-free experience for everyone, regardless of age, body size, visible or invisible disability, ethnicity, sex characteristics, gender -identity and expression, level of experience, education, socio-economic status, +identity and expression, level of experience, education, socioeconomic status, nationality, personal appearance, race, caste, color, religion, or sexual identity and orientation. diff --git a/tools/third_party/attrs/.github/CONTRIBUTING.md b/tools/third_party/attrs/.github/CONTRIBUTING.md index a7e5b014d33886..ce0079a5a364a0 100644 --- a/tools/third_party/attrs/.github/CONTRIBUTING.md +++ b/tools/third_party/attrs/.github/CONTRIBUTING.md @@ -1,14 +1,8 @@ # How To Contribute -Thank you for considering contributing to *attrs*! -It's people like *you* who make it such a great tool for everyone. - -This document intends to make contribution more accessible by codifying tribal knowledge and expectations. -Don't be afraid to open half-finished PRs, and ask questions if something is unclear! - -Please note that this project is released with a Contributor [Code of Conduct](https://github.com/python-attrs/attrs/blob/main/.github/CODE_OF_CONDUCT.md). 
-By participating in this project you agree to abide by its terms.
-Please report any harm to [Hynek Schlawack] in any way you find appropriate.
+> [!IMPORTANT]
+> This document is mainly to help you get started by codifying tribal knowledge and expectations, and to make contributing more accessible to everyone.
+> But don't be afraid to open half-finished PRs and ask questions if something is unclear!

## Support

@@ -21,175 +15,118 @@ The official tag is `python-attrs` and helping out in support frees us up to imp

## Workflow

+First off, thank you for considering contributing!
+It's people like *you* who make this project such a great tool for everyone.
+
- No contribution is too small!
  Please submit as many fixes for typos and grammar bloopers as you can!
+
- Try to limit each pull request to *one* change only.
+
- Since we squash on merge, it's up to you how you handle updates to the `main` branch.
  Whether you prefer to rebase on `main` or merge `main` into your branch, do whatever is more comfortable for you.
+
+  Just remember to [not use your own `main` branch for the pull request](https://hynek.me/articles/pull-requests-branch/).
+
- *Always* add tests and docs for your code.
  This is a hard rule; patches with missing tests or documentation won't be merged.
-- Make sure your changes pass our [CI].
+
+- Consider adding a news fragment to [`changelog.d`](../changelog.d/) to reflect the changes as observed by people *using* this library.
+
+- Make sure your changes pass our [CI](https://github.com/python-attrs/attrs/actions).
  You won't get any feedback until it's green unless you ask for it.
-- For the CI to pass, the coverage must be 100%.
+
+  For the CI to pass, the coverage must be 100%.
  If you have problems to test something, open anyway and ask for advice.
  In some situations, we may agree to add an `# pragma: no cover`.
+
- Once you've addressed review feedback, make sure to bump the pull request with a short note, so we know you're done.
-- Don’t break backwards-compatibility. + +- Don't break [backwards-compatibility](SECURITY.md). ## Local Development Environment -You can (and should) run our test suite using [*tox*]. -However, you’ll probably want a more traditional environment as well. +First, **fork** the repository on GitHub and **clone** it using one of the alternatives that you can copy-paste by pressing the big green button labeled `<> Code`. + +You can (and should) run our test suite using [*tox*](https://tox.wiki/). +However, you'll probably want a more traditional environment as well. -First, create a [virtual environment](https://virtualenv.pypa.io/) so you don't break your system-wide Python installation. -We recommend using the Python version from the `.python-version-default` file in project's root directory. +We recommend using the Python version from the `.python-version-default` file in the project's root directory, because that's the one that is used in the CI by default, too. -If you're using [*direnv*](https://direnv.net), you can automate the creation of a virtual environment with the correct Python version by adding the following `.envrc` to the project root after you've cloned it to your computer: +If you're using [*direnv*](https://direnv.net), you can automate the creation of the project virtual environment with the correct Python version by adding the following `.envrc` to the project root: ```bash layout python python$(cat .python-version-default) ``` -If you're using tools that understand `.python-version` files like [*pyenv*](https://github.com/pyenv/pyenv) does, you can make it a link to the `.python-version-default` file. - ---- - -Then, [fork](https://github.com/python-attrs/attrs/fork) the repository on GitHub. 
- -Clone the fork to your computer: - -```console -$ git clone git@github.com:/attrs.git -``` - -Or if you prefer to use Git via HTTPS: - -```console -$ git clone https://github.com//attrs.git -``` - -Then add the *attrs* repository as *upstream* remote: - -```console -$ git remote add -t main -m main --tags upstream https://github.com/python-attrs/attrs.git -``` - -The next step is to sync your local copy with the upstream repository: +or, if you like [*uv*](https://github.com/astral-sh/uv): -```console -$ git fetch upstream +```bash +test -d .venv || uv venv --python python$(cat .python-version-default) +. .venv/bin/activate ``` -This is important to obtain eventually missing tags, which are needed to install the development version later on. -See [#1104](https://github.com/python-attrs/attrs/issues/1104) for more information. +> [!WARNING] +> - **Before** you start working on a new pull request, use the "*Sync fork*" button in GitHub's web UI to ensure your fork is up to date. +> +> - **Always create a new branch off `main` for each new pull request.** +> Yes, you can work on `main` in your fork and submit pull requests. +> But this will *inevitably* lead to you not being able to synchronize your fork with upstream and having to start over. -Change into the newly created directory and after activating a virtual environment install an editable version of *attrs* along with its tests and docs requirements: +Change into the newly created directory and after activating a virtual environment, install an editable version of this project along with its tests requirements: ```console -$ cd attrs -$ python -m pip install --upgrade pip wheel # PLEASE don't skip this step -$ python -m pip install -e '.[dev]' +$ pip install -e .[dev] # or `uv pip install -e .[dev]` ``` -At this point, +Now you can run the test suite: ```console -$ python -m pytest +$ python -Im pytest ``` -should work and pass. 
You can *significantly* speed up the test suite by passing `-n auto` to *pytest* which activates [*pytest-xdist*](https://github.com/pytest-dev/pytest-xdist) and takes advantage of all your CPU cores. -For documentation, you can use: - -```console -$ tox run -e docs-serve -``` - -This will build the documentation, and then watch for changes and rebuild it whenever you save a file. - -To just build the documentation and run doctests, use: - -```console -$ tox run -e docs -``` - -You will find the built documentation in `docs/_build/html`. - - --- -To file a pull request, create a new branch on top of the upstream repository's `main` branch: - -```console -$ git fetch upstream -$ git checkout -b my_topical_branch upstream/main -``` - -Make your changes, push them to your fork (the remote *origin*): +When working on the documentation, use: ```console -$ git push -u origin +$ tox run -e docs-watch ``` -and publish the PR in GitHub's web interface! +This will build the documentation, watch for changes, and rebuild it whenever you save a file. -After your pull request is merged and the branch is no longer needed, delete it: +To just build the documentation and exit immediately use: ```console -$ git checkout main -$ git push --delete origin my_topical_branch && git branch -D my_topical_branch +$ tox run -e docs-build ``` -Before starting to work on your next pull request, run the following command to sync your local repository with the remote *upstream*: - -```console -$ git fetch upstream -u main:main -``` - ---- - -To avoid committing code that violates our style guide, we strongly advise you to install [*pre-commit*] and its hooks: - -```console -$ pre-commit install -``` +You will find the built documentation in `docs/_build/html`. 
-This is not strictly necessary, because our [*tox*] file contains an environment that runs: +To run doctests: ```console -$ pre-commit run --all-files +$ tox run -e docs-doctests ``` -and our CI has integration with [pre-commit.ci](https://pre-commit.ci). -But it's way more comfortable to run it locally and *git* catching avoidable errors. - ## Code -- Obey [PEP 8](https://peps.python.org/pep-0008/) and [PEP 257](https://peps.python.org/pep-0257/). - We use the `"""`-on-separate-lines style for docstrings: +- We follow [PEP 8](https://peps.python.org/pep-0008/) as enforced by [Ruff](https://ruff.rs/) with a line length of 79 characters. - ```python - def func(x): - """ - Do something. +- As long as you run our full *tox* suite before committing, or install our [*pre-commit*](https://pre-commit.com/) hooks, you won't have to spend any time on formatting your code at all. + If you don't, CI will catch it for you -- but that seems like a waste of your time! - :param str x: A very important parameter. +- If you've changed or added public APIs, please update our type stubs (files ending in `.pyi`). - :rtype: str - """ - ``` -- If you add or change public APIs, tag the docstring using `.. versionadded:: 16.0.0 WHAT` or `.. versionchanged:: 16.2.0 WHAT`. -- We use [Ruff](https://github.com/astral-sh/ruff) to sort our imports, and we use [Black](https://github.com/psf/black) with line length of 79 characters to format our code. - As long as you run our full [*tox*] suite before committing, or install our [*pre-commit*] hooks (ideally you'll do both – see [*Local Development Environment*](#local-development-environment) above), you won't have to spend any time on formatting your code at all. - If you don't, [CI] will catch it for you – but that seems like a waste of your time! 
## Tests

-- Write your asserts as `expected == actual` to line them up nicely:
+- Write your asserts as `expected == actual` to line them up nicely, and leave an empty line before them:

   ```python
   x = f()
@@ -198,69 +135,98 @@ But it's way more comfortable to run it locally and *git* catching avoidable err
   assert "foo" == x._a_private_attribute
   ```

-- To run the test suite, all you need is a recent [*tox*].
-  It will ensure the test suite runs with all dependencies against all Python versions just as it will in our [CI].
-  If you lack some Python versions, you can can always limit the environments like `tox run -e py38,py39`, or make it a non-failure using `tox run --skip-missing-interpreters`.
+- You can run the test suite with all dependencies against all supported Python versions -- just as it will in our CI -- by running `tox`.

-  In that case you should look into [*asdf*](https://asdf-vm.com) or [*pyenv*](https://github.com/pyenv/pyenv), which make it very easy to install many different Python versions in parallel.
-- Write [good test docstrings](https://jml.io/pages/test-docstrings.html).
-- To ensure new features work well with the rest of the system, they should be also added to our [*Hypothesis*](https://hypothesis.readthedocs.io/) testing strategy, which can be found in `tests/strategies.py`.
-- If you've changed or added public APIs, please update our type stubs (files ending in `.pyi`).
+- Write [good test docstrings](https://jml.io/test-docstrings/).
+
+- To ensure new features work well with the rest of the system, they should also be added to our [Hypothesis](https://hypothesis.readthedocs.io/) testing strategy, which can be found in `tests/strategies.py`.
## Documentation -- Use [semantic newlines] in [reStructuredText](https://www.sphinx-doc.org/en/stable/usage/restructuredtext/basics.html) and [Markdown](https://docs.github.com/en/get-started/writing-on-github/getting-started-with-writing-and-formatting-on-github/basic-writing-and-formatting-syntax) files (files ending in `.rst` and `.md`): +- Use [semantic newlines] in [reStructuredText](https://www.sphinx-doc.org/en/stable/usage/restructuredtext/basics.html) (`*.rst`) and [Markdown](https://docs.github.com/en/get-started/writing-on-github/getting-started-with-writing-and-formatting-on-github/basic-writing-and-formatting-syntax) (`*.md`) files: - ```rst + ```markdown This is a sentence. This is another sentence. + + This is a new paragraph. ``` -- If you start a new section, add two blank lines before and one blank line after the header, except if two headers follow immediately after each other: +- If you start a new section, add two blank lines before and one blank line after the header except if two headers follow immediately after each other: + + ```markdown + # Main Header - ```rst Last line of previous section. - Header of New Top Section - ------------------------- + ## Header of New Top Section - Header of New Section - ^^^^^^^^^^^^^^^^^^^^^ + ### Header of New Section First line of new section. ``` - If you add a new feature, demonstrate its awesomeness on the [examples page](https://github.com/python-attrs/attrs/blob/main/docs/examples.md)! +- For docstrings, we follow [PEP 257](https://peps.python.org/pep-0257/), use the `"""`-on-separate-lines style, and [Napoleon](https://www.sphinx-doc.org/en/master/usage/extensions/napoleon.html)-style API documentation: + + ```python + def func(x: str, y: int) -> str: + """ + Do something. + + Args: + x: A very important argument. + + y: + Another very important argument, but its description is so long + that it doesn't fit on one line. So, we start the whole block on a + fresh new line to keep the block together. 
+
+        Returns:
+            str: The result of doing something.
+
+        Raises:
+            ValueError: When an invalid value is passed.
+        """
+    ```
+
+  Please note that the API docstrings are still reStructuredText.
+
+- If you add or change public APIs, tag the docstring using `.. versionadded:: 24.1.0 WHAT` or `.. versionchanged:: 24.1.0 WHAT`.
+  We follow CalVer, so the next version will be the current version with the middle number incremented (for example, `24.1.0` -> `24.2.0`).
+

### Changelog

-If your change is noteworthy, there needs to be a changelog entry so our users can learn about it!
+If your change is interesting to end-users, there needs to be a changelog entry so they can learn about it!

-To avoid merge conflicts, we use the [*Towncrier*](https://pypi.org/project/towncrier) package to manage our changelog.
-*towncrier* uses independent *Markdown* files for each pull request – so called *news fragments* – instead of one monolithic changelog file.
-On release, those news fragments are compiled into our [`CHANGELOG.md`](https://github.com/python-attrs/attrs/blob/main/CHANGELOG.md).
+To avoid merge conflicts, we use the [Towncrier](https://pypi.org/project/towncrier) package to manage our changelog.
+*towncrier* uses independent Markdown files for each pull request -- so-called *news fragments* -- instead of one monolithic changelog file.
+On release, those news fragments are compiled into our [`CHANGELOG.md`](../CHANGELOG.md).

-You don't need to install *Towncrier* yourself, you just have to abide by a few simple rules:
+You don't need to install Towncrier yourself, you just have to abide by a few simple rules:

- For each pull request, add a new file into `changelog.d` with a filename adhering to the `pr#.(change|deprecation|breaking).md` schema:
  For example, `changelog.d/42.change.md` for a non-breaking change that is proposed in pull request #42.
+
- As with other docs, please use [semantic newlines] within news fragments.
-- Wrap symbols like modules, functions, or classes into backticks so they are rendered in a `monospace font`. -- Wrap arguments into asterisks like in docstrings: + +- Refer to all symbols by their fully-qualified names. + For example, `attrs.Foo` -- not just `Foo`. + +- Wrap symbols like modules, functions, or classes into backticks, so they are rendered in a `monospace font`. + +- Wrap arguments into asterisks so they are *italicized* like in API documentation: `Added new argument *an_argument*.` -- If you mention functions or other callables, add parentheses at the end of their names: + +- If you mention functions or methods, add parentheses at the end of their names: `attrs.func()` or `attrs.Class.method()`. This makes the changelog a lot more readable. -- Prefer simple past tense or constructions with "now". - For example: - + Added `attrs.validators.func()`. - + `attrs.func()` now doesn't crash the Large Hadron Collider anymore when passed the *foobar* argument. -- If you want to reference multiple issues, copy the news fragment to another filename. - *Towncrier* will merge all news fragments with identical contents into one entry with multiple links to the respective pull requests. +- Prefer simple past tense or constructions with "now". Example entries: @@ -278,6 +244,10 @@ or: --- +If you want to reference multiple issues, copy the news fragment to another filename. +Towncrier will merge all news fragments with identical contents into one entry with multiple links to the respective pull requests. + + `tox run -e changelog` will render the current changelog to the terminal if you have any doubts. 
@@ -288,11 +258,16 @@ If you'd like to join, just get a pull request merged and ask to be added in the

**The simple rule is that everyone is welcome to review/merge pull requests of others but nobody is allowed to merge their own code.**

-[Hynek Schlawack] acts reluctantly as the [BDFL](https://en.wikipedia.org/wiki/Benevolent_dictator_for_life) and has the final say over design decisions.
+[Hynek Schlawack](https://hynek.me/about/) acts reluctantly as the [BDFL](https://en.wikipedia.org/wiki/Benevolent_dictator_for_life) and has the final say over design decisions.
+
+
+## See You on GitHub!
+Again, this whole file is mainly to help you get started by codifying tribal knowledge and expectations to save you time and back-and-forth.
+It is **not** meant to be a barrier to entry, so don't be afraid to open half-finished PRs and ask questions if something is unclear!
+
+Please note that this project is released with a Contributor [Code of Conduct](CODE_OF_CONDUCT.md).
+By participating in this project you agree to abide by its terms.
+Please report any harm to Hynek Schlawack in any way you find appropriate.

-[CI]: https://github.com/python-attrs/attrs/actions?query=workflow%3ACI
-[Hynek Schlawack]: https://hynek.me/about/
-[*pre-commit*]: https://pre-commit.com/
-[*tox*]: https://tox.wiki/
[semantic newlines]: https://rhodesmill.org/brandon/2012/one-sentence-per-line/
diff --git a/tools/third_party/attrs/.github/PULL_REQUEST_TEMPLATE.md b/tools/third_party/attrs/.github/PULL_REQUEST_TEMPLATE.md
index e84b6c86ac84fd..c9bf6574cc463c 100644
--- a/tools/third_party/attrs/.github/PULL_REQUEST_TEMPLATE.md
+++ b/tools/third_party/attrs/.github/PULL_REQUEST_TEMPLATE.md
@@ -16,7 +16,7 @@ If your pull request is a documentation fix or a trivial typo, feel free to dele
- [ ] Do **not** open pull requests from your `main` branch – **use a separate branch**!
  - There's a ton of footguns waiting if you don't heed this warning.
You can still go back to your project, create a branch from your main branch, push it, and open the pull request from the new branch. - - This is not a pre-requisite for your your pull request to be accepted, but **you have been warned**. + - This is not a pre-requisite for your pull request to be accepted, but **you have been warned**. - [ ] Added **tests** for changed code. Our CI fails if coverage is not 100%. - [ ] New features have been added to our [Hypothesis testing strategy](https://github.com/python-attrs/attrs/blob/main/tests/strategies.py). @@ -25,12 +25,13 @@ If your pull request is a documentation fix or a trivial typo, feel free to dele - [ ] If they've been added to `attr/__init__.pyi`, they've *also* been re-imported in `attrs/__init__.pyi`. - [ ] Updated **documentation** for changed code. - [ ] New functions/classes have to be added to `docs/api.rst` by hand. - - [ ] Changes to the signature of `@attr.s()` have to be added by hand too. + - [ ] Changes to the signatures of `@attr.s()` and `@attrs.define()` have to be added by hand too. - [ ] Changed/added classes/methods/functions have appropriate `versionadded`, `versionchanged`, or `deprecated` [directives](http://www.sphinx-doc.org/en/stable/markup/para.html#directive-versionadded). The next version is the second number in the current release + 1. The first number represents the current year. So if the current version on PyPI is 22.2.0, the next version is gonna be 22.3.0. If the next version is the first in the new year, it'll be 23.1.0. + - [ ] If something changed that affects both `attrs.define()` and `attr.s()`, you have to add version directives to both. - [ ] Documentation in `.rst` and `.md` files is written using [semantic newlines](https://rhodesmill.org/brandon/2012/one-sentence-per-line/). - [ ] Changes (and possible deprecations) have news fragments in [`changelog.d`](https://github.com/python-attrs/attrs/blob/main/changelog.d). 
- [ ] Consider granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork), so maintainers can fix minor issues themselves without pestering you. diff --git a/tools/third_party/attrs/.github/SECURITY.md b/tools/third_party/attrs/.github/SECURITY.md index 1b8e14cf1efbe8..039df0790a835e 100644 --- a/tools/third_party/attrs/.github/SECURITY.md +++ b/tools/third_party/attrs/.github/SECURITY.md @@ -2,13 +2,13 @@ ## Supported Versions -We are following [*CalVer*](https://calver.org) with generous backwards-compatibility guarantees. +We are following [Calendar Versioning](https://calver.org) with generous backwards-compatibility guarantees. Therefore we only support the latest version. Put simply, you shouldn't ever be afraid to upgrade as long as you're only using our public APIs. Whenever there is a need to break compatibility, it is announced in the changelog, and raises a `DeprecationWarning` for a year (if possible) before it's finally really broken. -> **Warning** +> [!WARNING] > The structure of the `attrs.Attribute` class is exempt from this rule. > It *will* change in the future, but since it should be considered read-only, that shouldn't matter. 
> diff --git a/tools/third_party/attrs/.github/workflows/build-docset.yml b/tools/third_party/attrs/.github/workflows/build-docset.yml index ec0230d080c793..22e02751016ffe 100644 --- a/tools/third_party/attrs/.github/workflows/build-docset.yml +++ b/tools/third_party/attrs/.github/workflows/build-docset.yml @@ -10,8 +10,8 @@ env: PIP_DISABLE_PIP_VERSION_CHECK: "1" PIP_NO_PYTHON_VERSION_WARNING: "1" -permissions: - contents: read +permissions: {} + jobs: docset: @@ -20,15 +20,15 @@ jobs: - uses: actions/checkout@v4 with: fetch-depth: 0 - - uses: actions/setup-python@v4 + persist-credentials: false + - uses: actions/setup-python@v5 with: python-version: "3.x" + - uses: hynek/setup-cached-uv@v2 - - run: python -Im pip install tox - - - run: python -Im tox run -e docset + - run: uvx --with=tox-uv tox run -e docset - - uses: actions/upload-artifact@v3 + - uses: actions/upload-artifact@v4 with: name: docset path: attrs.tgz diff --git a/tools/third_party/attrs/.github/workflows/ci.yml b/tools/third_party/attrs/.github/workflows/ci.yml index ca816aa601cbea..70f3fadfbcf3ab 100644 --- a/tools/third_party/attrs/.github/workflows/ci.yml +++ b/tools/third_party/attrs/.github/workflows/ci.yml @@ -7,155 +7,205 @@ on: branches: [main] tags: ["*"] pull_request: - branches: [main] workflow_dispatch: env: - FORCE_COLOR: "1" # Make tools pretty. + FORCE_COLOR: "1" PIP_DISABLE_PIP_VERSION_CHECK: "1" PIP_NO_PYTHON_VERSION_WARNING: "1" - # Use oldest version used in doctests / examples. - SETUPTOOLS_SCM_PRETEND_VERSION: "19.2.0" permissions: {} + jobs: + build-package: + name: Build & verify package + runs-on: ubuntu-latest + + steps: + - uses: actions/checkout@v4 + with: + fetch-depth: 0 + persist-credentials: false + + - uses: hynek/build-and-inspect-python-package@v2 + id: baipp + + outputs: + # Used to define the matrix for tests below. The value is based on + # packaging metadata (trove classifiers). 
+ supported-python-versions: ${{ steps.baipp.outputs.supported_python_classifiers_json_array }} + tests: name: Tests & Mypy on ${{ matrix.python-version }} runs-on: ubuntu-latest + needs: build-package strategy: fail-fast: false matrix: - python-version: - - "3.7" - - "3.8" - - "3.9" - - "3.10" - - "3.11" - - "3.12" - # - "pypy-3.7" - - "pypy-3.8" - - "pypy-3.9" - - "pypy-3.10" + # Created by the build-and-inspect-python-package action above. + python-version: ${{ fromJson(needs.build-package.outputs.supported-python-versions) }} steps: - - uses: actions/checkout@v4 - - uses: actions/setup-python@v4 + - name: Download pre-built packages + uses: actions/download-artifact@v4 with: - python-version: ${{ matrix.python-version }} - allow-prereleases: true - cache: pip + name: Packages + path: dist + - run: tar xf dist/*.tar.gz --strip-components=1 + - uses: hynek/setup-cached-uv@v2 - name: Prepare tox + env: + V: ${{ matrix.python-version }} run: | - V=${{ matrix.python-version }} - - if [[ "$V" = pypy-* ]]; then - V=pypy3 - IS_PYPY=1 - else - V=py$(echo $V | tr -d .) - IS_PYPY=0 + DO_MYPY=1 + + if [[ "$V" == "3.8" || "$V" == "3.9" ]]; then + DO_MYPY=0 fi - echo IS_PYPY=$IS_PYPY >>$GITHUB_ENV - echo TOX_PYTHON=$V >>$GITHUB_ENV + echo DO_MYPY=$DO_MYPY >>$GITHUB_ENV + echo TOX_PYTHON=py$(echo $V | tr -d .) 
>>$GITHUB_ENV + + - run: > + uvx --with=tox-uv + tox run + -e $TOX_PYTHON-mypy + if: env.DO_MYPY == '1' - python -Im pip install tox + - name: Remove src to ensure tests run against wheel + run: rm -rf src - - run: python -Im tox run -e ${{ env.TOX_PYTHON }}-tests - - run: python -Im tox run -e ${{ env.TOX_PYTHON }}-mypy - if: env.IS_PYPY == '0' && matrix.python-version != '3.7' + - run: > + uvx --with=tox-uv + tox run + --installpkg dist/*.whl + -e $TOX_PYTHON-tests - name: Upload coverage data - uses: actions/upload-artifact@v3 + uses: actions/upload-artifact@v4 with: - name: coverage-data + name: coverage-data-${{ matrix.python-version }} path: .coverage.* + include-hidden-files: true if-no-files-found: ignore + tests-pypy: + name: Tests on ${{ matrix.python-version }} + runs-on: ubuntu-latest + needs: build-package + + strategy: + fail-fast: false + matrix: + python-version: + - pypy-3.10 + + steps: + - name: Download pre-built packages + uses: actions/download-artifact@v4 + with: + name: Packages + path: dist + - run: | + tar xf dist/*.tar.gz --strip-components=1 + rm -rf src # ensure tests run against wheel + - uses: hynek/setup-cached-uv@v2 + + - run: > + uvx --with=tox-uv + tox run + --installpkg dist/*.whl + -e pypy3-tests + coverage: - name: Combine & check coverage. + name: Combine & check coverage runs-on: ubuntu-latest needs: tests steps: - - uses: actions/checkout@v4 - - uses: actions/setup-python@v4 + - name: Download pre-built packages + uses: actions/download-artifact@v4 with: - python-version-file: .python-version-default - cache: pip + name: Packages + path: dist + - run: tar xf dist/*.tar.gz --strip-components=1 + - uses: hynek/setup-cached-uv@v2 - name: Download coverage data - uses: actions/download-artifact@v3 + uses: actions/download-artifact@v4 with: - name: coverage-data + pattern: coverage-data-* + merge-multiple: true - name: Combine coverage & fail if it's <100%. 
run: | - python -Im pip install coverage[toml] + uv tool install --python $(cat .python-version-default) coverage - python -Im coverage combine - python -Im coverage html --skip-covered --skip-empty + coverage combine + coverage html --skip-covered --skip-empty # Report and write to summary. - python -Im coverage report --format=markdown >> $GITHUB_STEP_SUMMARY + coverage report --format=markdown >> $GITHUB_STEP_SUMMARY # Report again and fail if under 100%. - python -Im coverage report --fail-under=100 + coverage report --fail-under=100 - name: Upload HTML report if check failed. - uses: actions/upload-artifact@v3 + uses: actions/upload-artifact@v4 with: name: html-report path: htmlcov if: ${{ failure() }} docs: - name: Build docs & run doctests + name: Run doctests & render changelog runs-on: ubuntu-latest + needs: build-package steps: - - uses: actions/checkout@v4 - - uses: actions/setup-python@v4 + - name: Download pre-built packages + uses: actions/download-artifact@v4 with: - # Keep in sync with tox/docs and .readthedocs.yaml. 
- python-version: "3.12" - cache: pip + name: Packages + path: dist + - run: tar xf dist/*.tar.gz --strip-components=1 + - uses: hynek/setup-cached-uv@v2 - - run: python -Im pip install tox - - run: python -Im tox run -e docs,changelog + - run: uvx --with=tox-uv tox run -e docs-doctests,changelog pyright: name: Check types using pyright runs-on: ubuntu-latest steps: - uses: actions/checkout@v4 - - uses: actions/setup-python@v4 with: - python-version-file: .python-version-default - cache: pip + persist-credentials: false + - uses: hynek/setup-cached-uv@v2 - - run: python -Im pip install tox - - run: python -Im tox run -e pyright + - run: > + uvx --with=tox-uv + --python $(cat .python-version-default) + tox run -e pyright install-dev: name: Verify dev env - runs-on: ${{ matrix.os }} - strategy: - matrix: - os: [ubuntu-latest, windows-latest] + runs-on: ubuntu-latest steps: - uses: actions/checkout@v4 - - uses: actions/setup-python@v4 with: - python-version-file: .python-version-default - cache: pip + persist-credentials: false + - uses: hynek/setup-cached-uv@v2 + + - run: uv venv --python $(cat .python-version-default) + - run: uv pip install -e .[dev] - - name: Install in dev mode & import + - name: Ensure we can import attr and attrs packages run: | - python -Im pip install -e .[dev] + source .venv/bin/activate + python -Ic 'import attr; print(attr.__version__)' python -Ic 'import attrs; print(attrs.__version__)' @@ -165,6 +215,7 @@ jobs: needs: - coverage + - tests-pypy - docs - install-dev - pyright diff --git a/tools/third_party/attrs/.github/workflows/codeql-analysis.yml b/tools/third_party/attrs/.github/workflows/codeql-analysis.yml index f75fafa5be3438..fd850571f38c6d 100644 --- a/tools/third_party/attrs/.github/workflows/codeql-analysis.yml +++ b/tools/third_party/attrs/.github/workflows/codeql-analysis.yml @@ -25,11 +25,16 @@ jobs: steps: - name: Checkout repository uses: actions/checkout@v4 + with: + persist-credentials: false - name: Initialize CodeQL - 
uses: github/codeql-action/init@v2 + uses: github/codeql-action/init@v3 with: languages: ${{ matrix.language }} + - name: Autobuild + uses: github/codeql-action/autobuild@v3 + - name: Perform CodeQL Analysis - uses: github/codeql-action/analyze@v2 + uses: github/codeql-action/analyze@v3 diff --git a/tools/third_party/attrs/.github/workflows/codspeed.yml b/tools/third_party/attrs/.github/workflows/codspeed.yml new file mode 100644 index 00000000000000..0fc5d59b2b6245 --- /dev/null +++ b/tools/third_party/attrs/.github/workflows/codspeed.yml @@ -0,0 +1,45 @@ +--- +name: CodSpeed Benchmarks + +on: + push: + branches: [main] + tags: ["*"] + paths: + - src/**.py + - bench/** + - .github/workflows/codspeed.yml + pull_request: + paths: + - src/**.py + - bench/** + - .github/workflows/codspeed.yml + workflow_dispatch: + + +env: + FORCE_COLOR: "1" + PIP_DISABLE_PIP_VERSION_CHECK: "1" + PIP_NO_PYTHON_VERSION_WARNING: "1" + +permissions: {} + +jobs: + codspeed: + name: Run CodSpeed benchmarks + runs-on: ubuntu-latest + + steps: + - uses: actions/checkout@v4 + with: + persist-credentials: false + - uses: actions/setup-python@v5 + with: + python-version-file: .python-version-default + - uses: hynek/setup-cached-uv@v2 + + - name: Run CodSpeed benchmarks + uses: CodSpeedHQ/action@v3 + with: + token: ${{ secrets.CODSPEED_TOKEN }} + run: uvx --with tox-uv tox run -e codspeed diff --git a/tools/third_party/attrs/.github/workflows/pypi-package.yml b/tools/third_party/attrs/.github/workflows/pypi-package.yml index 8495480c1bcb5f..48ac87ee716f03 100644 --- a/tools/third_party/attrs/.github/workflows/pypi-package.yml +++ b/tools/third_party/attrs/.github/workflows/pypi-package.yml @@ -1,33 +1,35 @@ --- -name: Build & maybe upload PyPI package +name: Build & upload PyPI package on: push: branches: [main] tags: ["*"] - pull_request: - branches: [main] release: types: - published workflow_dispatch: -permissions: - contents: read - id-token: write jobs: # Always build & lint package. 
build-package: name: Build & verify package runs-on: ubuntu-latest + permissions: + attestations: write + id-token: write steps: - uses: actions/checkout@v4 with: fetch-depth: 0 + persist-credentials: false - uses: hynek/build-and-inspect-python-package@v2 + with: + attest-build-provenance-github: 'true' + # Upload to Test PyPI on every commit on main. release-test-pypi: @@ -37,6 +39,9 @@ jobs: runs-on: ubuntu-latest needs: build-package + permissions: + id-token: write + steps: - name: Download packages built by build-and-inspect-python-package uses: actions/download-artifact@v4 @@ -47,8 +52,10 @@ jobs: - name: Upload package to Test PyPI uses: pypa/gh-action-pypi-publish@release/v1 with: + attestations: true repository-url: https://test.pypi.org/legacy/ + # Upload to real PyPI on GitHub Releases. release-pypi: name: Publish released package to pypi.org @@ -57,6 +64,9 @@ jobs: runs-on: ubuntu-latest needs: build-package + permissions: + id-token: write + steps: - name: Download packages built by build-and-inspect-python-package uses: actions/download-artifact@v4 @@ -66,3 +76,5 @@ jobs: - name: Upload package to PyPI uses: pypa/gh-action-pypi-publish@release/v1 + with: + attestations: true diff --git a/tools/third_party/attrs/.github/workflows/zizmor.yml b/tools/third_party/attrs/.github/workflows/zizmor.yml new file mode 100644 index 00000000000000..cde16a0a4f7a9c --- /dev/null +++ b/tools/third_party/attrs/.github/workflows/zizmor.yml @@ -0,0 +1,39 @@ +# https://github.com/woodruffw/zizmor +name: Zizmor + +on: + push: + branches: ["main"] + pull_request: + branches: ["*"] + +permissions: + contents: read + + +jobs: + zizmor: + name: Zizmor latest via PyPI + runs-on: ubuntu-latest + permissions: + security-events: write + steps: + - name: Checkout repository + uses: actions/checkout@v4 + with: + persist-credentials: false + - uses: hynek/setup-cached-uv@v2 + + - name: Run zizmor 🌈 + run: uvx zizmor --format sarif . 
> results.sarif + env: + GH_TOKEN: ${{ secrets.GITHUB_TOKEN }} + + - name: Upload SARIF file + uses: github/codeql-action/upload-sarif@v3 + with: + # Path to SARIF file relative to the root of the repository + sarif_file: results.sarif + # Optional category for the results + # Used to differentiate multiple results for one commit + category: zizmor diff --git a/tools/third_party/attrs/.pre-commit-config.yaml b/tools/third_party/attrs/.pre-commit-config.yaml index df18314431670c..24d8bdc5cb627a 100644 --- a/tools/third_party/attrs/.pre-commit-config.yaml +++ b/tools/third_party/attrs/.pre-commit-config.yaml @@ -3,25 +3,27 @@ ci: autoupdate_schedule: monthly repos: - - repo: https://github.com/psf/black - rev: 23.12.1 - hooks: - - id: black - - repo: https://github.com/astral-sh/ruff-pre-commit - rev: v0.1.9 + rev: v0.9.10 hooks: - id: ruff args: [--fix, --exit-non-zero-on-fix] + - id: ruff-format - repo: https://github.com/econchick/interrogate - rev: 1.5.0 + rev: 1.7.0 hooks: - id: interrogate args: [tests] + - repo: https://github.com/codespell-project/codespell + rev: v2.4.1 + hooks: + - id: codespell + args: [--exclude-file=tests/test_mypy.yml] + - repo: https://github.com/pre-commit/pre-commit-hooks - rev: v4.5.0 + rev: v5.0.0 hooks: - id: trailing-whitespace - id: end-of-file-fixer diff --git a/tools/third_party/attrs/.python-version-default b/tools/third_party/attrs/.python-version-default index e4fba218358722..24ee5b1be9961e 100644 --- a/tools/third_party/attrs/.python-version-default +++ b/tools/third_party/attrs/.python-version-default @@ -1 +1 @@ -3.12 +3.13 diff --git a/tools/third_party/attrs/.readthedocs.yaml b/tools/third_party/attrs/.readthedocs.yaml index 53bc38f7ee3a10..741e9946e3afaa 100644 --- a/tools/third_party/attrs/.readthedocs.yaml +++ b/tools/third_party/attrs/.readthedocs.yaml @@ -2,14 +2,20 @@ version: 2 build: - os: ubuntu-22.04 + os: ubuntu-lts-latest tools: - # Keep version in sync with tox.ini/docs and ci.yml/docs. 
- python: "3.12" + # Keep version in sync with tox.ini/docs. + python: "3.13" + jobs: + create_environment: + # Need the tags to calculate the version (sometimes). + - git fetch --tags -python: - install: - - method: pip - path: . - extra_requirements: - - docs + - asdf plugin add uv + - asdf install uv latest + - asdf global uv latest + + build: + html: + - uvx --with tox-uv tox run -e docs-sponsors + - uvx --with tox-uv tox run -e docs-build -- $READTHEDOCS_OUTPUT diff --git a/tools/third_party/attrs/CHANGELOG.md b/tools/third_party/attrs/CHANGELOG.md index a768197ae1d815..fc25bf1fe2562b 100644 --- a/tools/third_party/attrs/CHANGELOG.md +++ b/tools/third_party/attrs/CHANGELOG.md @@ -1,17 +1,145 @@ # Changelog -Versions follow [CalVer](https://calver.org) with a strict backwards-compatibility policy. +Versions follow [Calendar Versioning](https://calver.org) with a strict backwards-compatibility policy. The **first number** of the version is the year. The **second number** is incremented with each release, starting at 1 for each year. The **third number** is when we need to start branches for older releases (only for emergencies). -You can find out backwards-compatibility policy [here](https://github.com/python-attrs/attrs/blob/main/.github/SECURITY.md). +You can find our backwards-compatibility policy [here](https://github.com/python-attrs/attrs/blob/main/.github/SECURITY.md). -Changes for the upcoming release can be found in the ["changelog.d" directory](https://github.com/python-attrs/attrs/tree/main/changelog.d) in our repository. +Changes for the upcoming release can be found in the [`changelog.d` directory](https://github.com/python-attrs/attrs/tree/main/changelog.d) in our repository. +## [25.3.0](https://github.com/python-attrs/attrs/tree/25.3.0) - 2025-03-13 + +### Changes + +- Restore support for generator-based `field_transformer`s. 
+ [#1417](https://github.com/python-attrs/attrs/issues/1417) + + +## [25.2.0](https://github.com/python-attrs/attrs/tree/25.2.0) - 2025-03-12 + +### Changes + +- Checking mandatory vs non-mandatory attribute order is now performed after the field transformer, since the field transformer may change attributes and/or their order. + [#1147](https://github.com/python-attrs/attrs/issues/1147) +- `attrs.make_class()` now allows for Unicode class names. + [#1406](https://github.com/python-attrs/attrs/issues/1406) +- Speed up class creation by 30%-50% by compiling methods only once and using a variety of other techniques. + [#1407](https://github.com/python-attrs/attrs/issues/1407) +- The error message if an attribute has both an annotation and a type argument will now disclose _what_ attribute seems to be the problem. + [#1410](https://github.com/python-attrs/attrs/issues/1410) + + +## [25.1.0](https://github.com/python-attrs/attrs/tree/25.1.0) - 2025-01-25 + +### Changes + +- This release only ensures correct PyPI licensing metadata. + [#1386](https://github.com/python-attrs/attrs/issues/1386) + + +## [24.3.0](https://github.com/python-attrs/attrs/tree/24.3.0) - 2024-12-16 + +### Backwards-incompatible Changes + +- Python 3.7 has been dropped. + [#1340](https://github.com/python-attrs/attrs/issues/1340) + + +### Changes + +- Introduce `attrs.NothingType`, for annotating types consistent with `attrs.NOTHING`. + [#1358](https://github.com/python-attrs/attrs/issues/1358) +- Allow mutating `__suppress_context__` and `__notes__` on frozen exceptions. + [#1365](https://github.com/python-attrs/attrs/issues/1365) +- `attrs.converters.optional()` works again when taking `attrs.converters.pipe()` or another Converter as its argument. + [#1372](https://github.com/python-attrs/attrs/issues/1372) +- *attrs* instances now support [`copy.replace()`](https://docs.python.org/3/library/copy.html#copy.replace). 
+ [#1383](https://github.com/python-attrs/attrs/issues/1383) +- `attrs.validators.instance_of()`'s type hints now allow for union types. + For example: `instance_of(str | int)` + [#1385](https://github.com/python-attrs/attrs/issues/1385) + + +## [24.2.0](https://github.com/python-attrs/attrs/tree/24.2.0) - 2024-08-06 + +### Deprecations + +- Given the amount of warnings raised in the broader ecosystem, we've decided to only soft-deprecate the *hash* argument to `@define` / `@attr.s`. + Please don't use it in new code, but we don't intend to remove it anymore. + [#1330](https://github.com/python-attrs/attrs/issues/1330) + + +### Changes + +- `attrs.converters.pipe()` (and its syntactic sugar of passing a list for `attrs.field()`'s / `attr.ib()`'s *converter* argument) works again when passing `attrs.setters.convert` to *on_setattr* (which is default for `attrs.define`). + [#1328](https://github.com/python-attrs/attrs/issues/1328) +- Restored support for PEP [649](https://peps.python.org/pep-0649/) / [749](https://peps.python.org/pep-0749/)-implementing Pythons -- currently 3.14-dev. + [#1329](https://github.com/python-attrs/attrs/issues/1329) + + +## [24.1.0](https://github.com/python-attrs/attrs/tree/24.1.0) - 2024-08-03 + +### Backwards-incompatible Changes + +- `attrs.evolve()` doesn't accept the *inst* argument as a keyword argument anymore. + Pass it as the first positional argument instead. + [#1264](https://github.com/python-attrs/attrs/issues/1264) +- `attrs.validators.provides()` has been removed. + The removed code is available as a [gist](https://gist.github.com/hynek/9eaaaeb659808f3519870dfa16d2b6b2) for convenient copy and pasting. + [#1265](https://github.com/python-attrs/attrs/issues/1265) +- All packaging metadata except from `__version__` and `__version_info__` has been removed from the `attr` and `attrs` modules (for example, `attrs.__url__`). 
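For the metadata-removal entry above, the packaging metadata that used to live in dunders is available from the stdlib instead. A sketch (wrapped in a guard so it degrades gracefully where *attrs* isn't installed):

```python
from importlib.metadata import PackageNotFoundError, metadata, version

try:
    # attrs.__version__ still exists, but everything else now comes
    # from the installed package's metadata:
    print("version:", version("attrs"))
    print("summary:", metadata("attrs")["Summary"])
except PackageNotFoundError:
    print("attrs is not installed in this environment")
```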
+ + Please use [`importlib.metadata`](https://docs.python.org/3/library/importlib.metadata.html) or [*importlib-metadata*](https://pypi.org/project/importlib-metadata/) instead. + [#1268](https://github.com/python-attrs/attrs/issues/1268) +- The generated `__eq__` methods have been sped up significantly by generating a chain of attribute comparisons instead of constructing and comparing tuples. + This change arguably makes the behavior more correct, + but changes it if an attribute compares equal by identity but not value, like `float('nan')`. + [#1310](https://github.com/python-attrs/attrs/issues/1310) + + +### Deprecations + +- The *repr_ns* argument to `attr.s` is now deprecated. + It was a workaround for nested classes in Python 2 and is pointless in Python 3. + [#1263](https://github.com/python-attrs/attrs/issues/1263) +- The *hash* argument to `@attr.s`, `@attrs.define`, and `make_class()` is now deprecated in favor of *unsafe_hash*, as defined by PEP 681. + [#1323](https://github.com/python-attrs/attrs/issues/1323) + + +### Changes + +- Allow original slotted `functools.cached_property` classes to be cleaned by garbage collection. + Allow `super()` calls in slotted cached properties. + [#1221](https://github.com/python-attrs/attrs/issues/1221) +- Our type stubs now use modern type notation and are organized such that VS Code's quick-fix prefers the `attrs` namespace. + [#1234](https://github.com/python-attrs/attrs/issues/1234) +- Preserve `AttributeError` raised by properties of slotted classes with `functools.cached_properties`. + [#1253](https://github.com/python-attrs/attrs/issues/1253) +- It is now possible to wrap a converter into an `attrs.Converter` and get the current instance and/or the current field definition passed into the converter callable. + + Note that this is not supported by any type checker, yet. 
+ [#1267](https://github.com/python-attrs/attrs/issues/1267) +- `attrs.make_class()` now populates the `__annotations__` dict of the generated class, so that `attrs.resolve_types()` can resolve them. + [#1285](https://github.com/python-attrs/attrs/issues/1285) +- Added the `attrs.validators.or_()` validator. + [#1303](https://github.com/python-attrs/attrs/issues/1303) +- The combination of a `__attrs_pre_init__` that takes arguments, a kw-only field, and a default on that field does not crash anymore. + [#1319](https://github.com/python-attrs/attrs/issues/1319) +- `attrs.validators.in_()` now transforms certain unhashable options to tuples to keep the field hashable. + + This allows fields that use this validator to be used with, for example, `attrs.filters.include()`. + [#1320](https://github.com/python-attrs/attrs/issues/1320) +- If a class has an *inherited* method called `__attrs_init_subclass__`, it is now called once the class is done assembling. + + This is a replacement for Python's `__init_subclass__` and useful for registering classes, and similar. + [#1321](https://github.com/python-attrs/attrs/issues/1321) + + ## [23.2.0](https://github.com/python-attrs/attrs/tree/23.2.0) - 2023-12-31 ### Changes @@ -34,6 +162,7 @@ Changes for the upcoming release can be found in the ["changelog.d" directory](h It is, for example, now possible to attach methods. [#1203](https://github.com/python-attrs/attrs/issues/1203) + ## [23.1.0](https://github.com/python-attrs/attrs/tree/23.1.0) - 2023-04-16 ### Backwards-incompatible Changes @@ -46,7 +175,7 @@ Changes for the upcoming release can be found in the ["changelog.d" directory](h - The support for *zope-interface* via the `attrs.validators.provides` validator is now deprecated and will be removed in, or after, April 2024. - The presence of a C-based package in our developement dependencies has caused headaches and we're not under the impression it's used a lot. 
+ The presence of a C-based package in our development dependencies has caused headaches and we're not under the impression it's used a lot. Let us know if you're using it and we might publish it as a separate package. [#1120](https://github.com/python-attrs/attrs/issues/1120) @@ -58,7 +187,7 @@ Changes for the upcoming release can be found in the ["changelog.d" directory](h [#1068](https://github.com/python-attrs/attrs/issues/1068) - `attrs.has()` and `attrs.fields()` now handle generic classes correctly. [#1079](https://github.com/python-attrs/attrs/issues/1079) -- Fix frozen exception classes when raised within e.g. `contextlib.contextmanager`, which mutates their `__traceback__` attributes. +- Fix frozen exception classes when raised within, for example, `contextlib.contextmanager`, which mutates their `__traceback__` attributes. [#1081](https://github.com/python-attrs/attrs/issues/1081) - `@frozen` now works with type checkers that implement [PEP-681](https://peps.python.org/pep-0681/) (ex. [pyright](https://github.com/microsoft/pyright/)). [#1084](https://github.com/python-attrs/attrs/issues/1084) @@ -105,7 +234,7 @@ Changes for the upcoming release can be found in the ["changelog.d" directory](h Get `__init__` signatures matching any taste, peculiar or plain! The [PEP 681 compatible](https://peps.python.org/pep-0681/#field-specifier-parameters) *alias* option can be use to override private attribute name mangling, or add other arbitrary field argument name overrides. [#950](https://github.com/python-attrs/attrs/issues/950) -- `attrs.NOTHING` is now an enum value, making it possible to use with e.g. [`typing.Literal`](https://docs.python.org/3/library/typing.html#typing.Literal). +- `attrs.NOTHING` is now an enum value, making it possible to use with, for example, [`typing.Literal`](https://docs.python.org/3/library/typing.html#typing.Literal). 
[#983](https://github.com/python-attrs/attrs/issues/983) - Added missing re-import of `attr.AttrsInstance` to the `attrs` namespace. [#987](https://github.com/python-attrs/attrs/issues/987) @@ -118,7 +247,7 @@ Changes for the upcoming release can be found in the ["changelog.d" directory](h - `attrs.has()` is now a [`TypeGuard`](https://docs.python.org/3/library/typing.html#typing.TypeGuard) for `AttrsInstance`. That means that type checkers know a class is an instance of an `attrs` class if you check it using `attrs.has()` (or `attr.has()`) first. [#997](https://github.com/python-attrs/attrs/issues/997) -- Made `attrs.AttrsInstance` stub available at runtime and fixed type errors related to the usage of `attrs.AttrsInstance` in *Pyright*. +- Made `attrs.AttrsInstance` stub available at runtime and fixed type errors related to the usage of `attrs.AttrsInstance` in Pyright. [#999](https://github.com/python-attrs/attrs/issues/999) - On Python 3.10 and later, call [`abc.update_abstractmethods()`](https://docs.python.org/3/library/abc.html#abc.update_abstractmethods) on dict classes after creation. This improves the detection of abstractness. @@ -187,7 +316,7 @@ Changes for the upcoming release can be found in the ["changelog.d" directory](h ### Backward-incompatible Changes - When using `@define`, converters are now run by default when setting an attribute on an instance -- additionally to validators. - I.e. the new default is `on_setattr=[attrs.setters.convert, attrs.setters.validate]`. + Meaning: the new default is `on_setattr=[attrs.setters.convert, attrs.setters.validate]`. This is unfortunately a breaking change, but it was an oversight, impossible to raise a `DeprecationWarning` about, and it's better to fix it now while the APIs are very fresh with few users. 
[#835](https://github.com/python-attrs/attrs/issues/835), @@ -486,7 +615,7 @@ Changes for the upcoming release can be found in the ["changelog.d" directory](h - `attrs` can now automatically detect your own implementations and infer `init=False`, `repr=False`, `eq=False`, `order=False`, and `hash=False` if you set `@attr.s(auto_detect=True)`. `attrs` will ignore inherited methods. - If the argument implies more than one method (e.g. `eq=True` creates both `__eq__` and `__ne__`), it's enough for *one* of them to exist and `attrs` will create *neither*. + If the argument implies more than one method (for example, `eq=True` creates both `__eq__` and `__ne__`), it's enough for *one* of them to exist and `attrs` will create *neither*. This feature requires Python 3. [#607](https://github.com/python-attrs/attrs/issues/607) @@ -719,7 +848,7 @@ Changes for the upcoming release can be found in the ["changelog.d" directory](h [#290](https://github.com/python-attrs/attrs/issues/290), [#349](https://github.com/python-attrs/attrs/issues/349) -- The order of attributes that are passed into `attr.make_class()` or the *these* argument of `@attr.s()` is now retained if the dictionary is ordered (i.e. `dict` on Python 3.6 and later, `collections.OrderedDict` otherwise). +- The order of attributes that are passed into `attr.make_class()` or the *these* argument of `@attr.s()` is now retained if the dictionary is ordered (in other words: `dict` on Python 3.6 and later, `collections.OrderedDict` otherwise). Before, the order was always determined by the order in which the attributes have been defined which may not be desirable when creating classes programmatically. @@ -1097,7 +1226,7 @@ To encourage more participation, the project has also been moved into a [dedicat ### Changes: - Added a `convert` argument to `attr.ib`, which allows specifying a function to run on arguments. - This allows for simple type conversions, e.g. with `attr.ib(convert=int)`. 
+ This allows for simple type conversions, for example, with `attr.ib(convert=int)`. [#26](https://github.com/python-attrs/attrs/issues/26) - Speed up object creation when attribute validators are used. [#28](https://github.com/python-attrs/attrs/issues/28) diff --git a/tools/third_party/attrs/README.md b/tools/third_party/attrs/README.md index 6ef8c0204eaca0..5a4479660ae2c7 100644 --- a/tools/third_party/attrs/README.md +++ b/tools/third_party/attrs/README.md @@ -9,7 +9,6 @@

Documentation - License: MIT Downloads per month @@ -29,12 +28,29 @@ Its main goal is to help you to write **concise** and **correct** software witho *attrs* would not be possible without our [amazing sponsors](https://github.com/sponsors/hynek). Especially those generously supporting us at the *The Organization* tier and higher: + +

- [sponsor logo images]
+ [updated sponsor logo images]

Please consider joining them to help make attrs’s maintenance more sustainable!

@@ -91,26 +107,37 @@ After *declaring* your attributes, *attrs* gives you: *without* writing dull boilerplate code again and again and *without* runtime performance penalties. -**Hate type annotations**!? -No problem! -Types are entirely **optional** with *attrs*. -Simply assign `attrs.field()` to the attributes instead of annotating them with types. - --- This example uses *attrs*'s modern APIs that have been introduced in version 20.1.0, and the *attrs* package import name that has been added in version 21.3.0. The classic APIs (`@attr.s`, `attr.ib`, plus their serious-business aliases) and the `attr` package import name will remain **indefinitely**. -Please check out [*On The Core API Names*](https://www.attrs.org/en/latest/names.html) for a more in-depth explanation. +Check out [*On The Core API Names*](https://www.attrs.org/en/latest/names.html) for an in-depth explanation! + + +### Hate Type Annotations!? + +No problem! +Types are entirely **optional** with *attrs*. +Simply assign `attrs.field()` to the attributes instead of annotating them with types: + +```python +from attrs import define, field + +@define +class SomeClass: + a_number = field(default=42) + list_of_numbers = field(factory=list) +``` ## Data Classes On the tin, *attrs* might remind you of `dataclasses` (and indeed, `dataclasses` [are a descendant](https://hynek.me/articles/import-attrs/) of *attrs*). In practice it does a lot more and is more flexible. -For instance it allows you to define [special handling of NumPy arrays for equality checks](https://www.attrs.org/en/stable/comparison.html#customization), allows more ways to [plug into the initialization process](https://www.attrs.org/en/stable/init.html#hooking-yourself-into-initialization), and allows for stepping through the generated methods using a debugger. 
+For instance, it allows you to define [special handling of NumPy arrays for equality checks](https://www.attrs.org/en/stable/comparison.html#customization), allows more ways to [plug into the initialization process](https://www.attrs.org/en/stable/init.html#hooking-yourself-into-initialization), has a replacement for `__init_subclass__`, and allows for stepping through the generated methods using a debugger. -For more details, please refer to our [comparison page](https://www.attrs.org/en/stable/why.html#data-classes). +For more details, please refer to our [comparison page](https://www.attrs.org/en/stable/why.html#data-classes), but generally speaking, we are more likely to commit crimes against nature to make things work that one would expect to work, but that are quite complicated in practice. ## Project Information @@ -121,13 +148,12 @@ For more details, please refer to our [comparison page](https://www.attrs.org/en - [**Source Code**](https://github.com/python-attrs/attrs) - [**Contributing**](https://github.com/python-attrs/attrs/blob/main/.github/CONTRIBUTING.md) - [**Third-party Extensions**](https://github.com/python-attrs/attrs/wiki/Extensions-to-attrs) -- **Get Help**: please use the `python-attrs` tag on [StackOverflow](https://stackoverflow.com/questions/tagged/python-attrs) +- **Get Help**: use the `python-attrs` tag on [Stack Overflow](https://stackoverflow.com/questions/tagged/python-attrs) ### *attrs* for Enterprise -Available as part of the Tidelift Subscription. +Available as part of the [Tidelift Subscription](https://tidelift.com/?utm_source=lifter&utm_medium=referral&utm_campaign=hynek). The maintainers of *attrs* and thousands of other packages are working with Tidelift to deliver commercial support and maintenance for the open source packages you use to build your applications. Save time, reduce risk, and improve code health, while paying the maintainers of the exact packages you use. 
-[Learn more.](https://tidelift.com/subscription/pkg/pypi-attrs?utm_source=pypi-attrs&utm_medium=referral&utm_campaign=enterprise&utm_term=repo) diff --git a/tools/third_party/attrs/bench/test_benchmarks.py b/tools/third_party/attrs/bench/test_benchmarks.py new file mode 100644 index 00000000000000..17916caf33d75e --- /dev/null +++ b/tools/third_party/attrs/bench/test_benchmarks.py @@ -0,0 +1,121 @@ +""" +Benchmark attrs using CodSpeed. +""" + +from __future__ import annotations + +import pytest + +import attrs + + +pytestmark = pytest.mark.benchmark() + +ROUNDS = 1_000 + + +def test_create_simple_class(): + """ + Benchmark creating a simple class without any extras. + """ + for _ in range(ROUNDS): + + @attrs.define + class LocalC: + x: int + y: str + z: dict[str, int] + + +def test_create_frozen_class(): + """ + Benchmark creating a frozen class without any extras. + """ + for _ in range(ROUNDS): + + @attrs.frozen + class LocalC: + x: int + y: str + z: dict[str, int] + + LocalC(1, "2", {}) + + +def test_create_simple_class_make_class(): + """ + Benchmark creating a simple class using attrs.make_class(). + """ + for i in range(ROUNDS): + LocalC = attrs.make_class( + f"LocalC{i}", + { + "x": attrs.field(type=int), + "y": attrs.field(type=str), + "z": attrs.field(type=dict[str, int]), + }, + ) + + LocalC(1, "2", {}) + + +@attrs.define +class C: + x: int = 0 + y: str = "foo" + z: dict[str, int] = attrs.Factory(dict) + + +def test_instantiate_no_defaults(): + """ + Benchmark instantiating a class without using any defaults. + """ + for _ in range(ROUNDS): + C(1, "2", {}) + + +def test_instantiate_with_defaults(): + """ + Benchmark instantiating a class relying on defaults. + """ + for _ in range(ROUNDS): + C() + + +def test_eq_equal(): + """ + Benchmark comparing two equal instances for equality. + """ + c1 = C() + c2 = C() + + for _ in range(ROUNDS): + c1 == c2 + + +def test_eq_unequal(): + """ + Benchmark comparing two unequal instances for equality. 
+ """ + c1 = C() + c2 = C(1, "bar", {"baz": 42}) + + for _ in range(ROUNDS): + c1 == c2 + + +@attrs.frozen +class HashableC: + x: int = 0 + y: str = "foo" + z: tuple[str] = ("bar",) + + +def test_hash(): + """ + Benchmark hashing an instance. + """ + c = HashableC() + + for _ in range(ROUNDS): + hash(c) diff --git a/tools/third_party/attrs/changelog.d/towncrier_template.md.jinja b/tools/third_party/attrs/changelog.d/towncrier_template.md.jinja index d9ae7c10efc162..d07a85984c95b9 100644 --- a/tools/third_party/attrs/changelog.d/towncrier_template.md.jinja +++ b/tools/third_party/attrs/changelog.d/towncrier_template.md.jinja @@ -25,4 +25,5 @@ No significant changes. {% endif %} + {% endfor %} diff --git a/tools/third_party/attrs/conftest.py b/tools/third_party/attrs/conftest.py index 144e5f3e19f8fd..e22f79d53c76c9 100644 --- a/tools/third_party/attrs/conftest.py +++ b/tools/third_party/attrs/conftest.py @@ -1,10 +1,12 @@ # SPDX-License-Identifier: MIT +from datetime import timedelta + import pytest from hypothesis import HealthCheck, settings -from attr._compat import PY310 +from attr._compat import PY_3_10_PLUS @pytest.fixture(name="slots", params=(True, False)) @@ -20,11 +22,15 @@ def _frozen(request): def pytest_configure(config): # HealthCheck.too_slow causes more trouble than good -- especially in CIs. 
settings.register_profile( - "patience", settings(suppress_health_check=[HealthCheck.too_slow]) + "patience", + settings( + suppress_health_check=[HealthCheck.too_slow], + deadline=timedelta(milliseconds=400), + ), ) settings.load_profile("patience") collect_ignore = [] -if not PY310: +if not PY_3_10_PLUS: collect_ignore.extend(["tests/test_pattern_matching.py"]) diff --git a/tools/third_party/attrs/docs/_static/sponsors/Klaviyo.svg b/tools/third_party/attrs/docs/_static/sponsors/Klaviyo.svg new file mode 100644 index 00000000000000..6c7449bc7655f3 --- /dev/null +++ b/tools/third_party/attrs/docs/_static/sponsors/Klaviyo.svg @@ -0,0 +1 @@ + diff --git a/tools/third_party/attrs/docs/_static/sponsors/Polar.svg b/tools/third_party/attrs/docs/_static/sponsors/Polar.svg new file mode 100644 index 00000000000000..b278cb756c4288 --- /dev/null +++ b/tools/third_party/attrs/docs/_static/sponsors/Polar.svg @@ -0,0 +1,10 @@ + + + + + + + + + + diff --git a/tools/third_party/attrs/docs/_static/sponsors/Privacy-Solutions.svg b/tools/third_party/attrs/docs/_static/sponsors/Privacy-Solutions.svg new file mode 100644 index 00000000000000..cc3e553a3ba678 --- /dev/null +++ b/tools/third_party/attrs/docs/_static/sponsors/Privacy-Solutions.svg @@ -0,0 +1,13 @@ + + + + + + + + + + + + + diff --git a/tools/third_party/attrs/docs/_static/sponsors/emsys-renewables.svg b/tools/third_party/attrs/docs/_static/sponsors/emsys-renewables.svg new file mode 100644 index 00000000000000..d5738cecab408a --- /dev/null +++ b/tools/third_party/attrs/docs/_static/sponsors/emsys-renewables.svg @@ -0,0 +1 @@ + diff --git a/tools/third_party/attrs/docs/api-attr.rst b/tools/third_party/attrs/docs/api-attr.rst index 1c1c3edb3fcd6a..e0463c2157ae98 100644 --- a/tools/third_party/attrs/docs/api-attr.rst +++ b/tools/third_party/attrs/docs/api-attr.rst @@ -1,6 +1,13 @@ API Reference for the ``attr`` Namespace ======================================== +.. 
note:: + + These are the traditional APIs whose creation predates type annotations. + They are **not** deprecated, but we suggest using the :mod:`attrs` namespace for new code, because they look nicer and have better defaults. + + See also :doc:`names`. + .. module:: attr @@ -9,10 +16,6 @@ Core .. autofunction:: attr.s(these=None, repr_ns=None, repr=None, cmp=None, hash=None, init=None, slots=False, frozen=False, weakref_slot=True, str=False, auto_attribs=False, kw_only=False, cache_hash=False, auto_exc=False, eq=None, order=None, auto_detect=False, collect_by_mro=False, getstate_setstate=None, on_setattr=None, field_transformer=None, match_args=True, unsafe_hash=None) - .. note:: - - *attrs* also comes with a serious-business alias ``attr.attrs``. - For example: .. doctest:: @@ -75,6 +78,9 @@ Core ... ValueError: x must be positive +.. function:: attrs + + Serious business alias for `attr.s`. .. function:: define @@ -136,7 +142,7 @@ Helpers .. function:: fields_dict - Same as `attr.fields_dict`. + Same as `attrs.fields_dict`. .. function:: has diff --git a/tools/third_party/attrs/docs/api.rst b/tools/third_party/attrs/docs/api.rst index d55f2539ea5da9..cd5df2d941589e 100644 --- a/tools/third_party/attrs/docs/api.rst +++ b/tools/third_party/attrs/docs/api.rst @@ -5,24 +5,22 @@ API Reference *attrs* works by decorating a class using `attrs.define` or `attr.s` and then defining attributes on the class using `attrs.field`, `attr.ib`, or type annotations. -What follows is the API explanation, if you'd like a more hands-on tutorial, have a look at `examples`. +What follows is the dry API explanation for people who understand how *attrs* works. +If you'd like a hands-on tutorial, have a look at `examples`. If you're confused by the many names, please check out `names` for clarification, but the `TL;DR `_ is that as of version 21.3.0, *attrs* consists of **two** top-level package names: - The classic ``attr`` that powers the venerable `attr.s` and `attr.ib`. 
- The newer ``attrs`` that only contains most modern APIs and relies on `attrs.define` and `attrs.field` to define your classes. - Additionally it offers some ``attr`` APIs with nicer defaults (e.g. `attrs.asdict`). + Additionally, some of the APIs that also exist in ``attr`` have nicer defaults (for example, `attrs.asdict`). The ``attrs`` namespace is built *on top of* ``attr`` -- which will *never* go away -- and is just as stable, since it doesn't constitute a rewrite. -To keep repetition low and this document at a reasonable size, the ``attr`` namespace is `documented on a separate page `, though. +To keep repetition low and this document at a reasonable size, the ``attr`` namespace is `documented on a separate page `. Core ---- -.. autodata:: attrs.NOTHING - :no-value: - .. autofunction:: attrs.define .. function:: mutable(same_as_define) @@ -94,6 +92,10 @@ Core C(x=[1, 2, 3], y={1, 2, 3}) +.. autodata:: attrs.NOTHING + :no-value: + + Exceptions ---------- @@ -451,8 +453,6 @@ All objects from ``attrs.validators`` are also available from ``attr.validators` ... ValueError: 'val' must be in [1, 2, 3] (got 4), Attribute(name='val', default=NOTHING, validator=, repr=True, eq=True, eq_key=None, order=True, order_key=None, hash=None, init=True, metadata=mappingproxy({}), type=None, converter=None, kw_only=False, inherited=False, on_setattr=None), [1, 2, 3], 4) -.. autofunction:: attrs.validators.provides - .. autofunction:: attrs.validators.and_ For convenience, it's also possible to pass a list to `attrs.field`'s validator argument. @@ -462,6 +462,29 @@ All objects from ``attrs.validators`` are also available from ``attr.validators` x = field(validator=attrs.validators.and_(v1, v2, v3)) x = field(validator=[v1, v2, v3]) +.. autofunction:: attrs.validators.or_ + + For example: + + .. doctest:: + + >>> @define + ... class C: + ... val: int | list[int] = field( + ... validator=attrs.validators.or_( + ... attrs.validators.instance_of(int), + ... 
attrs.validators.deep_iterable(attrs.validators.instance_of(int)), + ... ) + ... ) + >>> C(42) + C(val=42) + >>> C([1, 2, 3]) + C(val=[1, 2, 3]) + >>> C(val='42') + Traceback (most recent call last): + ... + ValueError: None of (>, >>) satisfied for value '42' + .. autofunction:: attrs.validators.not_ For example: @@ -609,6 +632,27 @@ Validators can be both globally and locally disabled: Converters ---------- +.. autoclass:: attrs.Converter + + For example: + + .. doctest:: + + >>> def complicated(value, self_, field): + ... return int(value) * self_.factor + field.metadata["offset"] + >>> @define + ... class C: + ... factor = 5 # not an *attrs* field + ... x = field( + ... metadata={"offset": 200}, + ... converter=attrs.Converter( + ... complicated, + ... takes_self=True, takes_field=True + ... )) + >>> C("42") + C(x=410) + + .. module:: attrs.converters All objects from ``attrs.converters`` are also available from ``attr.converters`` (it's the same module in a different namespace). @@ -652,7 +696,7 @@ All objects from ``attrs.converters`` are also available from ``attr.converters` C(x='') -.. autofunction:: attrs.converters.to_bool +.. autofunction:: attrs.converters.to_bool(val) For example: @@ -667,10 +711,11 @@ All objects from ``attrs.converters`` are also available from ``attr.converters` C(x=True) >>> C(0) C(x=False) - >>> C("foo") + >>> C("norway") Traceback (most recent call last): File "", line 1, in - ValueError: Cannot convert value to bool: foo + ValueError: Cannot convert value to bool: norway + @@ -681,7 +726,7 @@ Setters .. module:: attrs.setters -These are helpers that you can use together with `attrs.define`'s and `attrs.fields`'s ``on_setattr`` arguments. +These are helpers that you can use together with `attrs.define`'s and `attrs.field`'s ``on_setattr`` arguments. All setters in ``attrs.setters`` are also available from ``attr.setters`` (it's the same module in a different namespace). .. 
autofunction:: frozen @@ -714,4 +759,5 @@ All setters in ``attrs.setters`` are also available from ``attr.setters`` (it's ... attrs.exceptions.FrozenAttributeError: () - N.B. Please use `attrs.define`'s *frozen* argument (or `attrs.frozen`) to freeze whole classes; it is more efficient. + .. tip:: + Use `attrs.define`'s *frozen* argument (or `attrs.frozen`) to freeze whole classes; it is more efficient. diff --git a/tools/third_party/attrs/docs/comparison.md b/tools/third_party/attrs/docs/comparison.md index 79786e9e19ff20..b5aa136fae9a38 100644 --- a/tools/third_party/attrs/docs/comparison.md +++ b/tools/third_party/attrs/docs/comparison.md @@ -5,14 +5,15 @@ For that, *attrs* writes `__eq__` and `__ne__` methods for you. Additionally, if you pass `order=True`, *attrs* will also create a complete set of ordering methods: `__le__`, `__lt__`, `__ge__`, and `__gt__`. -Both for equality and order, *attrs* will: +For equality, *attrs* will generate a statement comparing the types of both instances, +and then comparing each attribute in turn using `==`. + +For order, *attrs* will: - Check if the types of the instances you're comparing are equal, - if so, create a tuple of all field values for each instance, - and finally perform the desired comparison operation on those tuples. -[^default]: That's the default if you use the {func}`attr.s` decorator, but not with {func}`~attrs.define`. 
- (custom-comparison)= ## Customization diff --git a/tools/third_party/attrs/docs/conf.py b/tools/third_party/attrs/docs/conf.py index b92354a6fd4704..04a8fe3f7b2746 100644 --- a/tools/third_party/attrs/docs/conf.py +++ b/tools/third_party/attrs/docs/conf.py @@ -1,9 +1,19 @@ # SPDX-License-Identifier: MIT +import os + from importlib import metadata from pathlib import Path +# Set canonical URL from the Read the Docs Domain +html_baseurl = os.environ.get("READTHEDOCS_CANONICAL_URL", "") + +# Tell Jinja2 templates the build is running on Read the Docs +if os.environ.get("READTHEDOCS", "") == "True": + html_context = {"READTHEDOCS": True} + + # -- Path setup ----------------------------------------------------------- PROJECT_ROOT_DIR = Path(__file__).parents[1].resolve() @@ -16,10 +26,14 @@ """ linkcheck_ignore = [ + # Fastly blocks this. + "https://pypi.org/project/attr/#history", # We run into GitHub's rate limits. r"https://github.com/.*/(issues|pull)/\d+", # Rate limits and the latest tag is missing anyways on release. "https://github.com/python-attrs/attrs/tree/.*", + # GitHub just completely broke anchors hashtag modern web dev. + "https://github.com/python-attrs/attrs/commit/88aa1c897dfe2ee4aa987e4a56f2ba1344a17238#diff-4fc63db1f2fcb7c6e464ee9a77c3c74e90dd191d1c9ffc3bdd1234d3a6663dc0R48", ] # In nitpick mode (-n), still ignore any of the following "broken" references @@ -36,6 +50,7 @@ # ones. 
extensions = [ "myst_parser", + "sphinx.ext.napoleon", "sphinx.ext.autodoc", "sphinx.ext.doctest", "sphinx.ext.intersphinx", @@ -97,7 +112,7 @@ "sidebar_hide_name": True, "light_logo": "attrs_logo.svg", "dark_logo": "attrs_logo_white.svg", - "top_of_page_button": None, + "top_of_page_buttons": [], "light_css_variables": { "font-stack": "Inter,sans-serif", "font-stack--monospace": "BerkeleyMono, MonoLisa, ui-monospace, " @@ -162,12 +177,12 @@ "attrs Documentation", "Hynek Schlawack", "attrs", - "Python Clases Without Boilerplate", + "Python Classes Without Boilerplate", "Miscellaneous", ) ] -epub_description = "Python Clases Without Boilerplate" +epub_description = "Python Classes Without Boilerplate" intersphinx_mapping = {"python": ("https://docs.python.org/3", None)} diff --git a/tools/third_party/attrs/docs/examples.md b/tools/third_party/attrs/docs/examples.md index 0f8301aa59700a..2393decf45b257 100644 --- a/tools/third_party/attrs/docs/examples.md +++ b/tools/third_party/attrs/docs/examples.md @@ -17,7 +17,7 @@ True False ``` -So in other words: *attrs* is useful even without actual attributes! +So in other words: *attrs* is useful even without actual {term}`fields `! But you'll usually want some data on your classes, so let's add some: @@ -112,7 +112,7 @@ This is useful in times when you want to enhance classes that are not yours (nic SomethingFromSomeoneElse(x=1) ``` -[Subclassing is bad for you](https://www.youtube.com/watch?v=3MNVP9-hglc), but *attrs* will still do what you'd hope for: +[Subclassing is bad for you](https://www.youtube.com/watch?v=3MNVP9-hglc) (except when doing [strict specialization](https://hynek.me/articles/python-subclassing-redux/)), but *attrs* will still do what you'd hope for: ```{doctest} >>> @define(slots=False) @@ -569,7 +569,7 @@ AutoC(l=[], x=1, foo='every attrib needs a type if auto_attribs=True', bar=None) The generated `__init__` method will have an attribute called `__annotations__` that contains this type information. 
-If your annotations contain strings (e.g. forward references), +If your annotations contain strings (for example, forward references), you can resolve these after all references have been defined by using {func}`attrs.resolve_types`. This will replace the *type* attribute in the respective fields. @@ -674,9 +674,37 @@ C(x=1, y=3) False ``` +On Python 3.13 and later, you can also use {func}`copy.replace` from the standard library: + +```{doctest} +>>> import copy +>>> @frozen ... class C: ... x: int ... y: int +>>> i = C(1, 2) +>>> copy.replace(i, y=3) +C(x=1, y=3) +``` + ## Other Goodies +When building systems that have something resembling a plugin interface, you may want to have a registry of all classes that implement a certain interface: + +```{doctest} +>>> REGISTRY = [] +>>> class Base: # does NOT have to be an attrs class! +... @classmethod +... def __attrs_init_subclass__(cls): +... REGISTRY.append(cls) +>>> @define ... class Impl(Base): ... pass +>>> REGISTRY +[<class 'Impl'>] +``` + Sometimes you may want to create a class programmatically. *attrs* gives you {func}`attrs.make_class` for that: @@ -689,7 +717,7 @@ Sometimes you may want to create a class programmatically. >>> C2 = make_class("C2", {"x": field(type=int), "y": field()}) >>> fields(C1) == fields(C2) True ->>> fields(C1).x.type +>>> fields(C2).x.type <class 'int'> ``` diff --git a/tools/third_party/attrs/docs/extending.md b/tools/third_party/attrs/docs/extending.md index c6cb5f574bc946..c3a475ae42d7f5 100644 --- a/tools/third_party/attrs/docs/extending.md +++ b/tools/third_party/attrs/docs/extending.md @@ -6,7 +6,7 @@ It's a tuple of {class}`attrs.Attribute` carrying metadata about each attribute. So it is fairly simple to build your own decorators on top of *attrs*: ```{doctest} ->>> from attr import define +>>> from attrs import define >>> def print_attrs(cls): ... print(cls.__attrs_attrs__) ... return cls @@ -50,8 +50,8 @@ Another common use case is to overwrite *attrs*'s defaults.
### Mypy -Unfortunately, decorator wrapping currently [confuses](https://github.com/python/mypy/issues/5406) mypy's *attrs* plugin. -At the moment, the best workaround is to hold your nose, write a fake *Mypy* plugin, and mutate a bunch of global variables: +Unfortunately, decorator wrapping currently [confuses](https://github.com/python/mypy/issues/5406) Mypy's *attrs* plugin. +At the moment, the best workaround is to hold your nose, write a fake Mypy plugin, and mutate a bunch of global variables: ```python from mypy.plugin import Plugin @@ -79,7 +79,7 @@ def plugin(version): return MyPlugin ``` -Then tell *Mypy* about your plugin using your project's `mypy.ini`: +Then tell Mypy about your plugin using your project's `mypy.ini`: ```ini [mypy] @@ -87,21 +87,21 @@ plugins= ``` :::{warning} -Please note that it is currently *impossible* to let mypy know that you've changed defaults like *eq* or *order*. -You can only use this trick to tell *Mypy* that a class is actually an *attrs* class. +Please note that it is currently *impossible* to let Mypy know that you've changed defaults like *eq* or *order*. +You can only use this trick to tell Mypy that a class is actually an *attrs* class. ::: ### Pyright -Generic decorator wrapping is supported in [*Pyright*](https://github.com/microsoft/pyright) via `typing.dataclass_transform` / {pep}`689`. +Generic decorator wrapping is supported in [Pyright](https://github.com/microsoft/pyright) via `typing.dataclass_transform` / {pep}`681`. For a custom wrapping of the form: ``` -@typing.dataclass_transform(field_specifiers=(attr.attrib, attr.field)) +@typing.dataclass_transform(field_specifiers=(attr.attrib, attrs.field)) def custom_define(f): - return attr.define(f) + return attrs.define(f) ``` ## Types @@ -109,16 +109,16 @@ def custom_define(f): *attrs* offers two ways of attaching type information to attributes: - {pep}`526` annotations, -- and the *type* argument to {func}`attr.ib`. 
+- and the *type* argument to {func}`attr.ib` / {func}`attrs.field`. This information is available to you: ```{doctest} ->>> from attr import attrib, define, field, fields +>>> from attrs import define, field, fields >>> @define ... class C: ... x: int = field() -... y = attrib(type=str) +... y = field(type=str) >>> fields(C).x.type <class 'int'> >>> fields(C).y.type <class 'str'> @@ -127,6 +127,10 @@ This information is available to you: Currently, *attrs* doesn't do anything with this information but it's very useful if you'd like to write your own validators or serializers! +Originally, we didn't add the *type* argument to the new {func}`attrs.field` API, because type annotations are the preferred way. +But we reintroduced it later, so `field` can be used with the {func}`attrs.make_class` function. +We strongly discourage the use of the *type* parameter outside of {func}`attrs.make_class`. + (extending-metadata)= ## Metadata @@ -214,7 +218,7 @@ For example, let's assume that you really don't like floats: Data(a=42, c='spam') ``` -A more realistic example would be to automatically convert data that you, e.g., load from JSON: +A more realistic example would be to automatically convert data that you, for example, load from JSON: ```{doctest} >>> from datetime import datetime diff --git a/tools/third_party/attrs/docs/glossary.md b/tools/third_party/attrs/docs/glossary.md index 6b09a3ad4d12ab..45fca3687b0a35 100644 --- a/tools/third_party/attrs/docs/glossary.md +++ b/tools/third_party/attrs/docs/glossary.md @@ -10,12 +10,14 @@ dunder methods Its first documented use is a [mailing list posting](https://mail.python.org/pipermail/python-list/2002-September/155836.html) by Mark Jackson from 2002. +
This is the type of class you get by default both with and without *attrs* (except with the next APIs {func}`attrs.define()`, [`attrs.mutable()`](attrs.mutable), and [`attrs.frozen()`](attrs.frozen)). + slotted classes A class whose instances have no {attr}`object.__dict__` attribute and [define](https://docs.python.org/3/reference/datamodel.html#slots) their attributes in a `object.__slots__` attribute instead. In *attrs*, they are created by passing `slots=True` to `@attr.s` (and are on by default in {func}`attrs.define()`, [`attrs.mutable()`](attrs.mutable), and [`attrs.frozen()`](attrs.frozen)). @@ -95,12 +97,28 @@ slotted classes - Since it's currently impossible to make a class slotted after it's been created, *attrs* has to replace your class with a new one. While it tries to do that as graciously as possible, certain metaclass features like {meth}`object.__init_subclass__` do not work with slotted classes. - - The {attr}`class.__subclasses__` attribute needs a garbage collection run (which can be manually triggered using {func}`gc.collect`), for the original class to be removed. + - The {attr}`type.__subclasses__` attribute needs a garbage collection run (which can be manually triggered using {func}`gc.collect`), for the original class to be removed. See issue [#407](https://github.com/python-attrs/attrs/issues/407) for more details. - Pickling of slotted classes will fail if you define a class with missing attributes. This situation can occur if you define an `attrs.field(init=False)` and don't set the attribute by hand before pickling. + + +field + As the project name suggests, *attrs* is all about attributes. + We especially tried to emphasize that we only care about attributes and not about the classes themselves -- because we believe the class belongs to the user. + + This explains why the traditional API uses an {func}`attr.ib` (or ``attrib``) function to define attributes and we still use the term throughout the documentation. 
+ + However, with the emergence of {mod}`dataclasses`, [Pydantic](https://docs.pydantic.dev/latest/concepts/fields/), and other libraries, the term "field" has become a common term for a predefined attribute on a class in the Python ecosystem. + + So with our new APIs, we've embraced it too by calling the function to create them {func}`attrs.field`, and use the term "field" throughout the documentation interchangeably. + + See also {doc}`names`. + +attribute + See {term}`field`. ::: [^pypy]: On PyPy, there is no memory advantage in using slotted classes. diff --git a/tools/third_party/attrs/docs/how-does-it-work.md b/tools/third_party/attrs/docs/how-does-it-work.md index 7acc8121322b2b..70ecd45510031f 100644 --- a/tools/third_party/attrs/docs/how-does-it-work.md +++ b/tools/third_party/attrs/docs/how-does-it-work.md @@ -61,12 +61,14 @@ Depending on whether a class is a dict class or a slotted class, *attrs* uses a Once constructed, frozen instances don't differ in any way from regular ones except that you cannot change its attributes. + ### Dict Classes -Dict classes -- i.e. regular classes -- simply assign the value directly into the class' eponymous `__dict__` (and there's nothing we can do to stop the user to do the same). +Dict classes -- that is: regular classes -- simply assign the value directly into the class' eponymous `__dict__` (and there's nothing we can do to stop the user to do the same). The performance impact is negligible. + ### Slotted Classes Slotted classes are more complicated. @@ -93,26 +95,27 @@ Pick what's more important to you. ### Summary -You should avoid instantiating lots of frozen slotted classes (i.e. `@frozen`) in performance-critical code. +You should avoid instantiating lots of frozen slotted classes (meaning: `@frozen`) in performance-critical code. -Frozen dict classes have barely a performance impact, unfrozen slotted classes are even *faster* than unfrozen dict classes (i.e. regular classes). 
+Frozen dict classes have barely a performance impact, unfrozen slotted classes are even *faster* than unfrozen dict classes (meaning: regular classes). (how-slotted-cached_property)= -## Cached Properties on Slotted Classes. +## Cached Properties on Slotted Classes -By default, the standard library `functools.cached_property` decorator does not work on slotted classes, -because it requires a `__dict__` to store the cached value. -This could be surprising when uses *attrs*, as makes using slotted classes so easy, -so attrs will convert `functools.cached_property` decorated methods, when constructing slotted classes. +By default, the standard library {func}`functools.cached_property` decorator does not work on slotted classes, because it requires a `__dict__` to store the cached value. +This could be surprising when using *attrs*, as slotted classes are the default. +Therefore, *attrs* converts `cached_property`-decorated methods when constructing slotted classes. Getting this working is achieved by: + * Adding names to `__slots__` for the wrapped methods. * Adding a `__getattr__` method to set values on the wrapped methods. -For most users this should mean that it works transparently. +For most users, this should mean that it works transparently. -Note that the implementation does not guarantee that the wrapped method is called -only once in multi-threaded usage. This matches the implementation of `cached_property` -in python v3.12. +:::{note} +The implementation does not guarantee that the wrapped method is called only once in multi-threaded usage. +This matches the implementation of `cached_property` in Python 3.12. 
+::: diff --git a/tools/third_party/attrs/docs/index.md b/tools/third_party/attrs/docs/index.md index ad92b5a398257f..942a2485c43663 100644 --- a/tools/third_party/attrs/docs/index.md +++ b/tools/third_party/attrs/docs/index.md @@ -4,6 +4,28 @@ Release **{sub-ref}`release`** ([What's new?](changelog.md)) ```{include} ../README.md :start-after: 'teaser-begin -->' +:end-before: ' + + + + + + + + + +```{include} ../README.md +:start-after: 'sponsor-break-end -->' :end-before: '' -:end-before: '## Project Information' +:end-before: '### Hate Type Annotations!?' ``` @@ -60,3 +60,12 @@ All *attrs* does is: It does *nothing* dynamic at runtime, hence zero runtime overhead. It's still *your* class. Do with it as you please. + +--- + +*attrs* also is *not* a fully-fledged serialization library. +While it comes with features like converters and validators, it is meant to be a kit for building classes that you would write yourself – but with less boilerplate. +If you look for powerful-yet-unintrusive serialization and validation for your *attrs* classes, have a look at our sibling project [*cattrs*](https://catt.rs/) or our [third-party extensions](https://github.com/python-attrs/attrs/wiki/Extensions-to-attrs). + +This separation of creating classes and serializing them is a conscious design decision. +We don't think that your business model and your serialization format should be coupled. diff --git a/tools/third_party/attrs/docs/types.md b/tools/third_party/attrs/docs/types.md index 5ab7146f6e9bb2..4a64ee7f099e46 100644 --- a/tools/third_party/attrs/docs/types.md +++ b/tools/third_party/attrs/docs/types.md @@ -2,7 +2,7 @@ *attrs* comes with first-class support for type annotations for both {pep}`526` and legacy syntax. 
-However they will forever remain *optional*, therefore the example from the README could also be written as: +However, they will remain *optional* forever, therefore the example from the README could also be written as: ```{doctest} >>> from attrs import define, field @@ -32,7 +32,7 @@ SomeClass(a_number=42) ``` ::: -Even when going all-in on type annotations, you will need {func}`attrs.field` for some advanced features though. +Even when going all-in on type annotations, you will need {func}`attrs.field` for some advanced features, though. One of those features is the decorator-based definition of defaults. It's important to remember that *attrs* doesn't do any magic behind your back. @@ -46,20 +46,18 @@ Please note that types -- regardless how added -- are *only metadata* that can b Because Python does not allow references to a class object before the class is defined, types may be defined as string literals, so-called *forward references* ({pep}`526`). -You can enable this automatically for a whole module by using `from __future__ import annotations` ({pep}`563`) as of Python 3.7. +You can enable this automatically for a whole module by using `from __future__ import annotations` ({pep}`563`). In this case *attrs* simply puts these string literals into the `type` attributes. If you need to resolve these to real types, you can call {func}`attrs.resolve_types` which will update the attribute in place. -In practice though, types show their biggest usefulness in combination with tools like [*Mypy*], [*pytype*], or [*Pyright*] that have dedicated support for *attrs* classes. +In practice though, types show their biggest usefulness in combination with tools like [Mypy], [*pytype*], or [Pyright] that have dedicated support for *attrs* classes. The addition of static types is certainly one of the most exciting features in the Python ecosystem and helps you write *correct* and *verified self-documenting* code.
-If you don't know where to start, Carl Meyer gave a great talk on [*Type-checked Python in the Real World*](https://www.youtube.com/watch?v=pMgmKJyWKn8) at PyCon US 2018 that will help you to get started in no time. - ## Mypy -While having a nice syntax for type metadata is great, it's even greater that [*Mypy*] as of 0.570 ships with a dedicated *attrs* plugin which allows you to statically check your code. +While having a nice syntax for type metadata is great, it's even greater that [Mypy] ships with a dedicated *attrs* plugin which allows you to statically check your code. Imagine you add another line that tries to instantiate the defined class using `SomeClass("23")`. Mypy will catch that error for you: @@ -71,8 +69,8 @@ t.py:12: error: Argument 1 to "SomeClass" has incompatible type "str"; expected This happens *without* running your code! -And it also works with *both* Python 2-style annotation styles. -To *Mypy*, this code is equivalent to the one above: +And it also works with *both* legacy annotation styles. +To Mypy, this code is equivalent to the one above: ```python @attr.s @@ -81,31 +79,51 @@ class SomeClass: list_of_numbers = attr.ib(factory=list, type=list[int]) ``` +The approach used for `list_of_numbers` is only available in our [old-style API](names.md), which is why the example still uses it. + ## Pyright -*attrs* provides support for [*Pyright*] through the `dataclass_transform` / {pep}`681` specification. +*attrs* provides support for [Pyright] through the `dataclass_transform` / {pep}`681` specification. This provides static type inference for a subset of *attrs* equivalent to standard-library {mod}`dataclasses`, and requires explicit type annotations using the {func}`attrs.define` or `@attr.s(auto_attribs=True)` API.
-Given the following definition, *Pyright* will generate static type signatures for `SomeClass` attribute access, `__init__`, `__eq__`, and comparison methods: +Given the following definition, Pyright will generate static type signatures for `SomeClass` attribute access, `__init__`, `__eq__`, and comparison methods: ``` -@attr.define +@attrs.define class SomeClass: a_number: int = 42 list_of_numbers: list[int] = attr.field(factory=list) ``` :::{warning} -The *Pyright* inferred types are a tiny subset of those supported by *Mypy*, including: +The Pyright inferred types are a tiny subset of those supported by Mypy, including: - The `attrs.frozen` decorator is not typed with frozen attributes, which are properly typed via `attrs.define(frozen=True)`. Your constructive feedback is welcome in both [attrs#795](https://github.com/python-attrs/attrs/issues/795) and [pyright#1782](https://github.com/microsoft/pyright/discussions/1782). -Generally speaking, the decision on improving *attrs* support in *Pyright* is entirely Microsoft's prerogative, though. +Generally speaking, the decision on improving *attrs* support in Pyright is entirely Microsoft's prerogative and they unequivocally indicated that they'll only add support for features that go through the PEP process, though. ::: -[*Mypy*]: http://mypy-lang.org -[*Pyright*]: https://github.com/microsoft/pyright + +## Class variables and constants + +If you are adding type annotations to all of your code, you might wonder how to define a class variable (as opposed to an instance variable), because a value assigned at class scope becomes a default for that attribute. +The proper way to type such a class variable, though, is with {data}`typing.ClassVar`, which indicates that the variable should only be assigned in the class (or its subclasses) and not in instances of the class. 
+*attrs* will skip over members annotated with {data}`typing.ClassVar`, allowing you to write a type annotation without turning the member into an attribute. +Class variables are often used for constants, though they can also be used for mutable singleton data shared across all instances of the class. + +``` +@attrs.define +class PngHeader: + SIGNATURE: typing.ClassVar[bytes] = b'\x89PNG\r\n\x1a\n' + height: int + width: int + interlaced: int = 0 + ... +``` + +[Mypy]: http://mypy-lang.org +[Pyright]: https://github.com/microsoft/pyright [*pytype*]: https://google.github.io/pytype/ diff --git a/tools/third_party/attrs/docs/why.md b/tools/third_party/attrs/docs/why.md index eeba9db5857f0c..fa63e80f51a619 100644 --- a/tools/third_party/attrs/docs/why.md +++ b/tools/third_party/attrs/docs/why.md @@ -15,10 +15,14 @@ Nevertheless, there are still reasons to prefer *attrs* over Data Classes. Whether they're relevant to *you* depends on your circumstances: - Data Classes are *intentionally* less powerful than *attrs*. - There is a long list of features that were sacrificed for the sake of simplicity and while the most obvious ones are validators, converters, {ref}`equality customization `, or {doc}`extensibility ` in general, it permeates throughout all APIs. + There is a long list of features that were sacrificed for the sake of simplicity and while the most obvious ones are validators, converters, [equality customization](custom-comparison), a solution to the [`__init_subclass__` problem](init-subclass), or {doc}`extensibility ` in general -- it permeates throughout all APIs. On the other hand, Data Classes currently do not offer any significant feature that *attrs* doesn't already have. +- We are more likely to commit crimes against nature to make things work that one would expect to work, but that are quite complicated. 
+ + This includes stepping through generated methods using a debugger, cell rewriting to make bare `super()` calls work, or making {func}`functools.cached_property` work on slotted classes. + - *attrs* supports all mainstream Python versions including PyPy. - *attrs* doesn't force type annotations on you if you don't like them. @@ -27,6 +31,7 @@ Whether they're relevant to *you* depends on your circumstances: - While Data Classes are implementing features from *attrs* every now and then, their presence is dependent on the Python version, not the package version. For example, support for `__slots__` has only been added in Python 3.10, but it doesn’t do cell rewriting and therefore doesn’t support bare calls to `super()`. + This may or may not be fixed in later Python releases, but handling all these differences is especially painful for PyPI packages that support multiple Python versions. And of course, this includes possible implementation bugs. @@ -44,14 +49,17 @@ Basically what *attrs* was in 2015. Pydantic is first and foremost a *data validation & type coercion library*. As such, it is a capable complement to class building libraries like *attrs* (or Data Classes!) for parsing and validating untrusted data. -However, as convenient as it might be, using it for your business or data layer [is problematic in several ways](https://threeofwands.com/why-i-use-attrs-instead-of-pydantic/): +However, as convenient as it might be, using it for your business or domain layer [is problematic in several ways](https://threeofwands.com/why-i-use-attrs-instead-of-pydantic/): Is it really necessary to re-validate all your objects while reading them from a trusted database? +Should the shape of your web API really apply design pressure on your business objects and therefore business code? + In the parlance of [*Form, Command, and Model Validation*](https://verraes.net/2015/02/form-command-model-validation/), Pydantic is the right tool for *Commands*. 
-[*Separation of concerns*](https://en.wikipedia.org/wiki/Separation_of_concerns) feels tedious at times, but it's one of those things that you get to appreciate once you've shot your own foot often enough. +[*Separation of concerns*](https://en.wikipedia.org/wiki/Separation_of_concerns) feels tedious at times, but it's one of those things that you get to appreciate once you've shot your own foot often enough and seen the results of allowing design pressure from the edges of your system, like ORMs or web APIs. *attrs* emphatically does **not** try to be a validation library, but a toolkit to write well-behaved classes like you would write yourself. If you'd like a powerful library for structuring, unstructuring, and validating data, have a look at [*cattrs*](https://catt.rs/) which is an official member of the *attrs* family. +One of its core tenets is that it doesn't couple your classes to external factors. ## … namedtuples? @@ -106,7 +114,7 @@ Other often surprising behaviors include: # ... ``` - you end up with a class that has *two* `Point`s in its {attr}`__mro__ `: `[, , , ]`. + you end up with a class that has *two* `Point`s in its {attr}`__mro__ `: `[, , , ]`. That's not only confusing, it also has very practical consequences: for example if you create documentation that includes class hierarchies like [*Sphinx*'s autodoc](https://www.sphinx-doc.org/en/stable/usage/extensions/autodoc.html) with `show-inheritance`. @@ -272,7 +280,7 @@ is roughly ArtisanalClass(a=1, b=2) ``` -which is quite a mouthful and it doesn't even use any of *attrs*'s more advanced features like validators or default values. +That's quite a mouthful and it doesn't even use any of *attrs*'s more advanced features like validators or default values. Also: no tests whatsoever. And who will guarantee you, that you don't accidentally flip the `<` in your tenth implementation of `__gt__`? @@ -286,7 +294,7 @@ You can freely choose which features you want and disable those that you want mo ... 
b: int ... ... def __repr__(self): -... return "<SmartClass(a=%d)>" % (self.a,) +... return f"<SmartClass(a={self.a})>" >>> SmartClass(1, 2) <SmartClass(a=1)> ``` diff --git a/tools/third_party/attrs/pyproject.toml b/tools/third_party/attrs/pyproject.toml index 1c72fc26d6ab76..024a1602389f01 100644 --- a/tools/third_party/attrs/pyproject.toml +++ b/tools/third_party/attrs/pyproject.toml @@ -9,35 +9,35 @@ build-backend = "hatchling.build" name = "attrs" authors = [{ name = "Hynek Schlawack", email = "hs@ox.cx" }] license = "MIT" -requires-python = ">=3.7" +license-files = ["LICENSE"] +requires-python = ">=3.8" description = "Classes Without Boilerplate" keywords = ["class", "attribute", "boilerplate"] classifiers = [ "Development Status :: 5 - Production/Stable", - "License :: OSI Approved :: MIT License", - "Programming Language :: Python :: 3.7", "Programming Language :: Python :: 3.8", "Programming Language :: Python :: 3.9", "Programming Language :: Python :: 3.10", "Programming Language :: Python :: 3.11", "Programming Language :: Python :: 3.12", + "Programming Language :: Python :: 3.13", "Programming Language :: Python :: Implementation :: CPython", "Programming Language :: Python :: Implementation :: PyPy", "Typing :: Typed", ] -dependencies = ["importlib_metadata;python_version<'3.8'"] +dependencies = [] dynamic = ["version", "readme"] [project.optional-dependencies] tests-mypy = [ - 'pytest-mypy-plugins; python_implementation == "CPython" and python_version >= "3.8"', + 'pytest-mypy-plugins; platform_python_implementation == "CPython" and python_version >= "3.10"', # Since the mypy error messages keep changing, we have to keep updating this # pin. - 'mypy>=1.6; python_implementation == "CPython" and python_version >= "3.8"', + 'mypy>=1.11.1; platform_python_implementation == "CPython" and python_version >= "3.10"', ] -tests-no-zope = [ +tests = [ # For regression test to ensure cloudpickle compat doesn't break.
- 'cloudpickle; python_implementation == "CPython"', + 'cloudpickle; platform_python_implementation == "CPython"', "hypothesis", "pympler", # 4.3.0 dropped last use of `convert` @@ -45,22 +45,22 @@ tests-no-zope = [ "pytest-xdist[psutil]", "attrs[tests-mypy]", ] -tests = ["attrs[tests-no-zope]", "zope.interface"] cov = [ "attrs[tests]", # Ensure coverage is new enough for `source_pkgs`. "coverage[toml]>=5.3", ] +benchmark = ["pytest-codspeed", "pytest-xdist[psutil]", "attrs[tests]"] docs = [ + "cogapp", "furo", "myst-parser", "sphinx", - "zope.interface", "sphinx-notfound-page", "sphinxcontrib-towncrier", "towncrier", ] -dev = ["attrs[tests]", "pre-commit"] +dev = ["attrs[tests]", "pre-commit-uv"] [project.urls] Documentation = "https://www.attrs.org/" @@ -109,13 +109,48 @@ text = """ --- -[Full changelog](https://www.attrs.org/en/stable/changelog.html) +[Full changelog →](https://www.attrs.org/en/stable/changelog.html) """ # Point sponsor image URLs to versions. [[tool.hatch.metadata.hooks.fancy-pypi-readme.substitutions]] -pattern = '\/latest\/_static/sponsors' -replacement = '/$HFPR_VERSION/_static/sponsors' +pattern = 'docs\/_static\/sponsors' +replacement = 'https://www.attrs.org/en/$HFPR_VERSION/_static/sponsors' + +[[tool.sponcon.sponsors]] +title = "Variomedia AG" +url = "https://www.variomedia.de/" +img = "Variomedia.svg" + +[[tool.sponcon.sponsors]] +title = "Tidelift" +url = "https://tidelift.com/?utm_source=lifter&utm_medium=referral&utm_campaign=hynek" +img = "Tidelift.svg" + +[[tool.sponcon.sponsors]] +title = "Klaviyo" +url = "https://klaviyo.com/" +img = "Klaviyo.svg" + +[[tool.sponcon.sponsors]] +title = "Privacy Solutions" +url = "https://privacy-solutions.org/" +img = "Privacy-Solutions.svg" + +[[tool.sponcon.sponsors]] +title = "emsys renewables" +url = "https://www.emsys-renewables.com/" +img = "emsys-renewables.svg" + +[[tool.sponcon.sponsors]] +title = "FilePreviews" +url = "https://filepreviews.io/" +img = "FilePreviews.svg" + 
+[[tool.sponcon.sponsors]] +title = "Polar" +url = "https://polar.sh/" +img = "Polar.svg" [tool.pytest.ini_options] @@ -146,10 +181,6 @@ exclude_lines = [ ] -[tool.black] -line-length = 79 - - [tool.interrogate] omit-covered-files = true verbose = 2 @@ -163,7 +194,9 @@ toplevel = ["attr", "attrs"] [tool.ruff] src = ["src", "tests", "conftest.py", "docs"] +line-length = 79 +[tool.ruff.lint] select = ["ALL"] ignore = [ "A001", # shadowing is fine @@ -171,56 +204,49 @@ ignore = [ "A003", # shadowing is fine "ANN", # Mypy is better at this "ARG", # unused arguments are normal when implementing interfaces - "COM", # Black takes care of our commas + "C901", # we're complex software + "COM", # ruff format takes care of our commas "D", # We prefer our own docstring style. - "E501", # leave line-length enforcement to Black + "E501", # leave line-length enforcement to ruff format + "ERA001", # we need to keep around some notes "FBT", # we don't hate bool args around here "FIX", # Yes, we want XXX as a marker. 
+ "ISC001", # conflicts with ruff format + "N", # we need more naming freedom + "PD", # we're not pandas + "PLR0912", # we're complex software "PLR0913", # yes, many arguments, but most have defaults + "PLR0915", # we're complex software "PLR2004", # numbers are sometimes fine + "PLW0603", # sometimes we need globals + "S307", # eval FTW "SLF001", # private members are accessed by friendly functions - "TCH", # TYPE_CHECKING blocks break autodocs + "TC", # TYPE_CHECKING blocks break autodocs "TD", # we don't follow other people's todo style - "C901", # we're complex software - "PLR0911", # we're complex software - "PLR0912", # we're complex software - "PLR0915", # we're complex software - "PGH001", # eval FTW - "S307", # eval FTW - "N807", # we need to create functions that start with __ - "ERA001", # we need to keep around some notes - "RSE102", # I like empty parens on raised exceptions - "N", # we need more naming freedom - "UP031", # format() is slow as molasses; % and f'' FTW. - "PD", # we're not pandas - "PLW0603", # sometimes we need globals "TRY301", # I'm sorry, but this makes not sense for us. + "UP031", # format() is slow as molasses; % and f'' FTW. ] -[tool.ruff.per-file-ignores] +[tool.ruff.lint.per-file-ignores] +"bench/**" = [ + "INP001", # Benchmarks don't have to be importable. 
+] "**/test_*" = [ - "ARG005", # we need stub lambdas - "S", - "SIM300", # Yoda rocks in asserts - "SIM201", # sometimes we need to check `not ==` - "SIM202", # sometimes we need to check `not ==` - "PT005", # we always add underscores and explicit names - "PT011", # broad is fine - "TRY", # exception best practices don't matter in tests - "EM101", # no need for exception msg hygiene in tests - "B904", # exception best practices don't matter in tests "B015", # pointless comparison in tests aren't pointless + "B017", # pytest.raises(Exception) is fine "B018", # pointless expressions in tests aren't pointless - "PLR0124", # pointless comparison in tests aren't pointless "DTZ", # datetime best practices don't matter in tests - "UP037", # we test some older syntaxes on purpose - "B017", # pytest.raises(Exception) is fine + "EM", # no need for exception msg hygiene in tests + "PLE0309", # hash doesn't have to return anything in tests + "PLR0124", # pointless comparison in tests aren't pointless + "PT011", # broad is fine "PT012", # sometimes we need more than a single stmt "RUF012", # we don't do ClassVar annotations in tests -] - -"conftest.py" = [ - "PT005", # we always add underscores and explicit names + "S", # security concerns don't matter in tests + "SIM201", # sometimes we need to check `not ==` + "SIM202", # sometimes we need to check `not ==` + "SIM300", # Yoda rocks in asserts + "TRY", # exception best practices don't matter in tests ] "src/*/*.pyi" = ["ALL"] # TODO @@ -229,14 +255,12 @@ ignore = [ "E741", # ambiguous variable names don't matter in type checks "B018", # useless expressions aren't useless in type checks "B015", # pointless comparison in type checks aren't pointless - "TRY301", # exception hygiene isn't important in type checks "UP037", # we test some older syntaxes on purpose ] -[tool.ruff.isort] +[tool.ruff.lint.isort] lines-between-types = 1 lines-after-imports = 2 -known-first-party = ["attr", "attrs"] [tool.towncrier] @@ -269,5 +293,6 @@ 
showcontent = true [tool.mypy] +pretty = true disallow_untyped_defs = true check_untyped_defs = true diff --git a/tools/third_party/attrs/src/attr/__init__.py b/tools/third_party/attrs/src/attr/__init__.py index 9226258a2d5877..5c6e0650bc4bf5 100644 --- a/tools/third_party/attrs/src/attr/__init__.py +++ b/tools/third_party/attrs/src/attr/__init__.py @@ -5,19 +5,21 @@ """ from functools import partial -from typing import Callable +from typing import Callable, Literal, Protocol from . import converters, exceptions, filters, setters, validators from ._cmp import cmp_using -from ._compat import Protocol from ._config import get_run_validators, set_run_validators -from ._funcs import asdict, assoc, astuple, evolve, has, resolve_types +from ._funcs import asdict, assoc, astuple, has, resolve_types from ._make import ( NOTHING, Attribute, + Converter, Factory, + _Nothing, attrib, attrs, + evolve, fields, fields_dict, make_class, @@ -36,11 +38,15 @@ class AttrsInstance(Protocol): pass +NothingType = Literal[_Nothing.NOTHING] + __all__ = [ + "NOTHING", "Attribute", "AttrsInstance", + "Converter", "Factory", - "NOTHING", + "NothingType", "asdict", "assoc", "astuple", @@ -79,54 +85,18 @@ def _make_getattr(mod_name: str) -> Callable: """ def __getattr__(name: str) -> str: - dunder_to_metadata = { - "__title__": "Name", - "__copyright__": "", - "__version__": "version", - "__version_info__": "version", - "__description__": "summary", - "__uri__": "", - "__url__": "", - "__author__": "", - "__email__": "", - "__license__": "license", - } - if name not in dunder_to_metadata: + if name not in ("__version__", "__version_info__"): msg = f"module {mod_name} has no attribute {name}" raise AttributeError(msg) - import sys - import warnings - - if sys.version_info < (3, 8): - from importlib_metadata import metadata - else: - from importlib.metadata import metadata - - if name not in ("__version__", "__version_info__"): - warnings.warn( - f"Accessing {mod_name}.{name} is deprecated and 
will be " - "removed in a future release. Use importlib.metadata directly " - "to query for attrs's packaging metadata.", - DeprecationWarning, - stacklevel=2, - ) + from importlib.metadata import metadata meta = metadata("attrs") - if name == "__license__": - return "MIT" - if name == "__copyright__": - return "Copyright (c) 2015 Hynek Schlawack" - if name in ("__uri__", "__url__"): - return meta["Project-URL"].split(" ", 1)[-1] + if name == "__version_info__": return VersionInfo._from_version_string(meta["version"]) - if name == "__author__": - return meta["Author-email"].rsplit(" ", 1)[0] - if name == "__email__": - return meta["Author-email"].rsplit("<", 1)[1][:-1] - return meta[dunder_to_metadata[name]] + return meta["version"] return __getattr__ diff --git a/tools/third_party/attrs/src/attr/__init__.pyi b/tools/third_party/attrs/src/attr/__init__.pyi index 37a208732acf77..133e50105de3ce 100644 --- a/tools/third_party/attrs/src/attr/__init__.pyi +++ b/tools/third_party/attrs/src/attr/__init__.pyi @@ -4,17 +4,12 @@ import sys from typing import ( Any, Callable, - Dict, Generic, - List, + Literal, Mapping, - Optional, Protocol, Sequence, - Tuple, - Type, TypeVar, - Union, overload, ) @@ -27,11 +22,25 @@ from . 
import validators as validators from ._cmp import cmp_using as cmp_using from ._typing_compat import AttrsInstance_ from ._version_info import VersionInfo +from attrs import ( + define as define, + field as field, + mutable as mutable, + frozen as frozen, + _EqOrderType, + _ValidatorType, + _ConverterType, + _ReprArgType, + _OnSetAttrType, + _OnSetAttrArgType, + _FieldTransformer, + _ValidatorArgType, +) if sys.version_info >= (3, 10): - from typing import TypeGuard + from typing import TypeGuard, TypeAlias else: - from typing_extensions import TypeGuard + from typing_extensions import TypeGuard, TypeAlias if sys.version_info >= (3, 11): from typing import dataclass_transform @@ -52,23 +61,7 @@ __copyright__: str _T = TypeVar("_T") _C = TypeVar("_C", bound=type) -_EqOrderType = Union[bool, Callable[[Any], Any]] -_ValidatorType = Callable[[Any, "Attribute[_T]", _T], Any] -_ConverterType = Callable[[Any], Any] _FilterType = Callable[["Attribute[_T]", _T], bool] -_ReprType = Callable[[Any], str] -_ReprArgType = Union[bool, _ReprType] -_OnSetAttrType = Callable[[Any, "Attribute[Any]", Any], Any] -_OnSetAttrArgType = Union[ - _OnSetAttrType, List[_OnSetAttrType], setters._NoOpType -] -_FieldTransformer = Callable[ - [type, List["Attribute[Any]"]], List["Attribute[Any]"] -] -# FIXME: in reality, if multiple validators are passed they must be in a list -# or tuple, but those are invariant and so would prevent subtypes of -# _ValidatorType from working when passed in a list or tuple. -_ValidatorArgType = Union[_ValidatorType[_T], Sequence[_ValidatorType[_T]]] # We subclass this here to keep the protocol's qualified name clean. 
class AttrsInstance(AttrsInstance_, Protocol): @@ -80,50 +73,70 @@ class _Nothing(enum.Enum): NOTHING = enum.auto() NOTHING = _Nothing.NOTHING +NothingType: TypeAlias = Literal[_Nothing.NOTHING] # NOTE: Factory lies about its return type to make this possible: # `x: List[int] # = Factory(list)` # Work around mypy issue #4554 in the common case by using an overload. -if sys.version_info >= (3, 8): - from typing import Literal + +@overload +def Factory(factory: Callable[[], _T]) -> _T: ... +@overload +def Factory( + factory: Callable[[Any], _T], + takes_self: Literal[True], +) -> _T: ... +@overload +def Factory( + factory: Callable[[], _T], + takes_self: Literal[False], +) -> _T: ... + +In = TypeVar("In") +Out = TypeVar("Out") + +class Converter(Generic[In, Out]): @overload - def Factory(factory: Callable[[], _T]) -> _T: ... + def __init__(self, converter: Callable[[In], Out]) -> None: ... @overload - def Factory( - factory: Callable[[Any], _T], + def __init__( + self, + converter: Callable[[In, AttrsInstance, Attribute], Out], + *, takes_self: Literal[True], - ) -> _T: ... + takes_field: Literal[True], + ) -> None: ... @overload - def Factory( - factory: Callable[[], _T], - takes_self: Literal[False], - ) -> _T: ... - -else: + def __init__( + self, + converter: Callable[[In, Attribute], Out], + *, + takes_field: Literal[True], + ) -> None: ... @overload - def Factory(factory: Callable[[], _T]) -> _T: ... - @overload - def Factory( - factory: Union[Callable[[Any], _T], Callable[[], _T]], - takes_self: bool = ..., - ) -> _T: ... + def __init__( + self, + converter: Callable[[In, AttrsInstance], Out], + *, + takes_self: Literal[True], + ) -> None: ... 
class Attribute(Generic[_T]): name: str - default: Optional[_T] - validator: Optional[_ValidatorType[_T]] + default: _T | None + validator: _ValidatorType[_T] | None repr: _ReprArgType cmp: _EqOrderType eq: _EqOrderType order: _EqOrderType - hash: Optional[bool] + hash: bool | None init: bool - converter: Optional[_ConverterType] - metadata: Dict[Any, Any] - type: Optional[Type[_T]] + converter: Converter | None + metadata: dict[Any, Any] + type: type[_T] | None kw_only: bool on_setattr: _OnSetAttrType - alias: Optional[str] + alias: str | None def evolve(self, **changes: Any) -> "Attribute[Any]": ... @@ -156,18 +169,18 @@ def attrib( default: None = ..., validator: None = ..., repr: _ReprArgType = ..., - cmp: Optional[_EqOrderType] = ..., - hash: Optional[bool] = ..., + cmp: _EqOrderType | None = ..., + hash: bool | None = ..., init: bool = ..., - metadata: Optional[Mapping[Any, Any]] = ..., + metadata: Mapping[Any, Any] | None = ..., type: None = ..., converter: None = ..., factory: None = ..., kw_only: bool = ..., - eq: Optional[_EqOrderType] = ..., - order: Optional[_EqOrderType] = ..., - on_setattr: Optional[_OnSetAttrArgType] = ..., - alias: Optional[str] = ..., + eq: _EqOrderType | None = ..., + order: _EqOrderType | None = ..., + on_setattr: _OnSetAttrArgType | None = ..., + alias: str | None = ..., ) -> Any: ... 
# This form catches an explicit None or no default and infers the type from the @@ -175,149 +188,79 @@ def attrib( @overload def attrib( default: None = ..., - validator: Optional[_ValidatorArgType[_T]] = ..., + validator: _ValidatorArgType[_T] | None = ..., repr: _ReprArgType = ..., - cmp: Optional[_EqOrderType] = ..., - hash: Optional[bool] = ..., + cmp: _EqOrderType | None = ..., + hash: bool | None = ..., init: bool = ..., - metadata: Optional[Mapping[Any, Any]] = ..., - type: Optional[Type[_T]] = ..., - converter: Optional[_ConverterType] = ..., - factory: Optional[Callable[[], _T]] = ..., + metadata: Mapping[Any, Any] | None = ..., + type: type[_T] | None = ..., + converter: _ConverterType + | list[_ConverterType] + | tuple[_ConverterType] + | None = ..., + factory: Callable[[], _T] | None = ..., kw_only: bool = ..., - eq: Optional[_EqOrderType] = ..., - order: Optional[_EqOrderType] = ..., - on_setattr: Optional[_OnSetAttrArgType] = ..., - alias: Optional[str] = ..., + eq: _EqOrderType | None = ..., + order: _EqOrderType | None = ..., + on_setattr: _OnSetAttrArgType | None = ..., + alias: str | None = ..., ) -> _T: ... # This form catches an explicit default argument. 
@overload def attrib( default: _T, - validator: Optional[_ValidatorArgType[_T]] = ..., + validator: _ValidatorArgType[_T] | None = ..., repr: _ReprArgType = ..., - cmp: Optional[_EqOrderType] = ..., - hash: Optional[bool] = ..., + cmp: _EqOrderType | None = ..., + hash: bool | None = ..., init: bool = ..., - metadata: Optional[Mapping[Any, Any]] = ..., - type: Optional[Type[_T]] = ..., - converter: Optional[_ConverterType] = ..., - factory: Optional[Callable[[], _T]] = ..., + metadata: Mapping[Any, Any] | None = ..., + type: type[_T] | None = ..., + converter: _ConverterType + | list[_ConverterType] + | tuple[_ConverterType] + | None = ..., + factory: Callable[[], _T] | None = ..., kw_only: bool = ..., - eq: Optional[_EqOrderType] = ..., - order: Optional[_EqOrderType] = ..., - on_setattr: Optional[_OnSetAttrArgType] = ..., - alias: Optional[str] = ..., + eq: _EqOrderType | None = ..., + order: _EqOrderType | None = ..., + on_setattr: _OnSetAttrArgType | None = ..., + alias: str | None = ..., ) -> _T: ... # This form covers type=non-Type: e.g. forward references (str), Any @overload def attrib( - default: Optional[_T] = ..., - validator: Optional[_ValidatorArgType[_T]] = ..., + default: _T | None = ..., + validator: _ValidatorArgType[_T] | None = ..., repr: _ReprArgType = ..., - cmp: Optional[_EqOrderType] = ..., - hash: Optional[bool] = ..., + cmp: _EqOrderType | None = ..., + hash: bool | None = ..., init: bool = ..., - metadata: Optional[Mapping[Any, Any]] = ..., + metadata: Mapping[Any, Any] | None = ..., type: object = ..., - converter: Optional[_ConverterType] = ..., - factory: Optional[Callable[[], _T]] = ..., - kw_only: bool = ..., - eq: Optional[_EqOrderType] = ..., - order: Optional[_EqOrderType] = ..., - on_setattr: Optional[_OnSetAttrArgType] = ..., - alias: Optional[str] = ..., -) -> Any: ... 
-@overload -def field( - *, - default: None = ..., - validator: None = ..., - repr: _ReprArgType = ..., - hash: Optional[bool] = ..., - init: bool = ..., - metadata: Optional[Mapping[Any, Any]] = ..., - converter: None = ..., - factory: None = ..., - kw_only: bool = ..., - eq: Optional[bool] = ..., - order: Optional[bool] = ..., - on_setattr: Optional[_OnSetAttrArgType] = ..., - alias: Optional[str] = ..., - type: Optional[type] = ..., -) -> Any: ... - -# This form catches an explicit None or no default and infers the type from the -# other arguments. -@overload -def field( - *, - default: None = ..., - validator: Optional[_ValidatorArgType[_T]] = ..., - repr: _ReprArgType = ..., - hash: Optional[bool] = ..., - init: bool = ..., - metadata: Optional[Mapping[Any, Any]] = ..., - converter: Optional[_ConverterType] = ..., - factory: Optional[Callable[[], _T]] = ..., - kw_only: bool = ..., - eq: Optional[_EqOrderType] = ..., - order: Optional[_EqOrderType] = ..., - on_setattr: Optional[_OnSetAttrArgType] = ..., - alias: Optional[str] = ..., - type: Optional[type] = ..., -) -> _T: ... - -# This form catches an explicit default argument. -@overload -def field( - *, - default: _T, - validator: Optional[_ValidatorArgType[_T]] = ..., - repr: _ReprArgType = ..., - hash: Optional[bool] = ..., - init: bool = ..., - metadata: Optional[Mapping[Any, Any]] = ..., - converter: Optional[_ConverterType] = ..., - factory: Optional[Callable[[], _T]] = ..., + converter: _ConverterType + | list[_ConverterType] + | tuple[_ConverterType] + | None = ..., + factory: Callable[[], _T] | None = ..., kw_only: bool = ..., - eq: Optional[_EqOrderType] = ..., - order: Optional[_EqOrderType] = ..., - on_setattr: Optional[_OnSetAttrArgType] = ..., - alias: Optional[str] = ..., - type: Optional[type] = ..., -) -> _T: ... - -# This form covers type=non-Type: e.g. 
forward references (str), Any -@overload -def field( - *, - default: Optional[_T] = ..., - validator: Optional[_ValidatorArgType[_T]] = ..., - repr: _ReprArgType = ..., - hash: Optional[bool] = ..., - init: bool = ..., - metadata: Optional[Mapping[Any, Any]] = ..., - converter: Optional[_ConverterType] = ..., - factory: Optional[Callable[[], _T]] = ..., - kw_only: bool = ..., - eq: Optional[_EqOrderType] = ..., - order: Optional[_EqOrderType] = ..., - on_setattr: Optional[_OnSetAttrArgType] = ..., - alias: Optional[str] = ..., - type: Optional[type] = ..., + eq: _EqOrderType | None = ..., + order: _EqOrderType | None = ..., + on_setattr: _OnSetAttrArgType | None = ..., + alias: str | None = ..., ) -> Any: ... @overload @dataclass_transform(order_default=True, field_specifiers=(attrib, field)) def attrs( maybe_cls: _C, - these: Optional[Dict[str, Any]] = ..., - repr_ns: Optional[str] = ..., + these: dict[str, Any] | None = ..., + repr_ns: str | None = ..., repr: bool = ..., - cmp: Optional[_EqOrderType] = ..., - hash: Optional[bool] = ..., + cmp: _EqOrderType | None = ..., + hash: bool | None = ..., init: bool = ..., slots: bool = ..., frozen: bool = ..., @@ -327,25 +270,25 @@ def attrs( kw_only: bool = ..., cache_hash: bool = ..., auto_exc: bool = ..., - eq: Optional[_EqOrderType] = ..., - order: Optional[_EqOrderType] = ..., + eq: _EqOrderType | None = ..., + order: _EqOrderType | None = ..., auto_detect: bool = ..., collect_by_mro: bool = ..., - getstate_setstate: Optional[bool] = ..., - on_setattr: Optional[_OnSetAttrArgType] = ..., - field_transformer: Optional[_FieldTransformer] = ..., + getstate_setstate: bool | None = ..., + on_setattr: _OnSetAttrArgType | None = ..., + field_transformer: _FieldTransformer | None = ..., match_args: bool = ..., - unsafe_hash: Optional[bool] = ..., + unsafe_hash: bool | None = ..., ) -> _C: ... 
@overload @dataclass_transform(order_default=True, field_specifiers=(attrib, field)) def attrs( maybe_cls: None = ..., - these: Optional[Dict[str, Any]] = ..., - repr_ns: Optional[str] = ..., + these: dict[str, Any] | None = ..., + repr_ns: str | None = ..., repr: bool = ..., - cmp: Optional[_EqOrderType] = ..., - hash: Optional[bool] = ..., + cmp: _EqOrderType | None = ..., + hash: bool | None = ..., init: bool = ..., slots: bool = ..., frozen: bool = ..., @@ -355,131 +298,24 @@ def attrs( kw_only: bool = ..., cache_hash: bool = ..., auto_exc: bool = ..., - eq: Optional[_EqOrderType] = ..., - order: Optional[_EqOrderType] = ..., + eq: _EqOrderType | None = ..., + order: _EqOrderType | None = ..., auto_detect: bool = ..., collect_by_mro: bool = ..., - getstate_setstate: Optional[bool] = ..., - on_setattr: Optional[_OnSetAttrArgType] = ..., - field_transformer: Optional[_FieldTransformer] = ..., - match_args: bool = ..., - unsafe_hash: Optional[bool] = ..., -) -> Callable[[_C], _C]: ... -@overload -@dataclass_transform(field_specifiers=(attrib, field)) -def define( - maybe_cls: _C, - *, - these: Optional[Dict[str, Any]] = ..., - repr: bool = ..., - unsafe_hash: Optional[bool] = ..., - hash: Optional[bool] = ..., - init: bool = ..., - slots: bool = ..., - frozen: bool = ..., - weakref_slot: bool = ..., - str: bool = ..., - auto_attribs: bool = ..., - kw_only: bool = ..., - cache_hash: bool = ..., - auto_exc: bool = ..., - eq: Optional[bool] = ..., - order: Optional[bool] = ..., - auto_detect: bool = ..., - getstate_setstate: Optional[bool] = ..., - on_setattr: Optional[_OnSetAttrArgType] = ..., - field_transformer: Optional[_FieldTransformer] = ..., - match_args: bool = ..., -) -> _C: ... 
-@overload -@dataclass_transform(field_specifiers=(attrib, field)) -def define( - maybe_cls: None = ..., - *, - these: Optional[Dict[str, Any]] = ..., - repr: bool = ..., - unsafe_hash: Optional[bool] = ..., - hash: Optional[bool] = ..., - init: bool = ..., - slots: bool = ..., - frozen: bool = ..., - weakref_slot: bool = ..., - str: bool = ..., - auto_attribs: bool = ..., - kw_only: bool = ..., - cache_hash: bool = ..., - auto_exc: bool = ..., - eq: Optional[bool] = ..., - order: Optional[bool] = ..., - auto_detect: bool = ..., - getstate_setstate: Optional[bool] = ..., - on_setattr: Optional[_OnSetAttrArgType] = ..., - field_transformer: Optional[_FieldTransformer] = ..., - match_args: bool = ..., -) -> Callable[[_C], _C]: ... - -mutable = define - -@overload -@dataclass_transform(frozen_default=True, field_specifiers=(attrib, field)) -def frozen( - maybe_cls: _C, - *, - these: Optional[Dict[str, Any]] = ..., - repr: bool = ..., - unsafe_hash: Optional[bool] = ..., - hash: Optional[bool] = ..., - init: bool = ..., - slots: bool = ..., - frozen: bool = ..., - weakref_slot: bool = ..., - str: bool = ..., - auto_attribs: bool = ..., - kw_only: bool = ..., - cache_hash: bool = ..., - auto_exc: bool = ..., - eq: Optional[bool] = ..., - order: Optional[bool] = ..., - auto_detect: bool = ..., - getstate_setstate: Optional[bool] = ..., - on_setattr: Optional[_OnSetAttrArgType] = ..., - field_transformer: Optional[_FieldTransformer] = ..., - match_args: bool = ..., -) -> _C: ... 
-@overload -@dataclass_transform(frozen_default=True, field_specifiers=(attrib, field)) -def frozen( - maybe_cls: None = ..., - *, - these: Optional[Dict[str, Any]] = ..., - repr: bool = ..., - unsafe_hash: Optional[bool] = ..., - hash: Optional[bool] = ..., - init: bool = ..., - slots: bool = ..., - frozen: bool = ..., - weakref_slot: bool = ..., - str: bool = ..., - auto_attribs: bool = ..., - kw_only: bool = ..., - cache_hash: bool = ..., - auto_exc: bool = ..., - eq: Optional[bool] = ..., - order: Optional[bool] = ..., - auto_detect: bool = ..., - getstate_setstate: Optional[bool] = ..., - on_setattr: Optional[_OnSetAttrArgType] = ..., - field_transformer: Optional[_FieldTransformer] = ..., + getstate_setstate: bool | None = ..., + on_setattr: _OnSetAttrArgType | None = ..., + field_transformer: _FieldTransformer | None = ..., match_args: bool = ..., + unsafe_hash: bool | None = ..., ) -> Callable[[_C], _C]: ... -def fields(cls: Type[AttrsInstance]) -> Any: ... -def fields_dict(cls: Type[AttrsInstance]) -> Dict[str, Attribute[Any]]: ... +def fields(cls: type[AttrsInstance]) -> Any: ... +def fields_dict(cls: type[AttrsInstance]) -> dict[str, Attribute[Any]]: ... def validate(inst: AttrsInstance) -> None: ... def resolve_types( cls: _A, - globalns: Optional[Dict[str, Any]] = ..., - localns: Optional[Dict[str, Any]] = ..., - attribs: Optional[List[Attribute[Any]]] = ..., + globalns: dict[str, Any] | None = ..., + localns: dict[str, Any] | None = ..., + attribs: list[Attribute[Any]] | None = ..., include_extras: bool = ..., ) -> _A: ... @@ -488,13 +324,13 @@ def resolve_types( # [attr.ib()])` is valid def make_class( name: str, - attrs: Union[List[str], Tuple[str, ...], Dict[str, Any]], - bases: Tuple[type, ...] = ..., - class_body: Optional[Dict[str, Any]] = ..., - repr_ns: Optional[str] = ..., + attrs: list[str] | tuple[str, ...] | dict[str, Any], + bases: tuple[type, ...] 
= ..., + class_body: dict[str, Any] | None = ..., + repr_ns: str | None = ..., repr: bool = ..., - cmp: Optional[_EqOrderType] = ..., - hash: Optional[bool] = ..., + cmp: _EqOrderType | None = ..., + hash: bool | None = ..., init: bool = ..., slots: bool = ..., frozen: bool = ..., @@ -504,11 +340,11 @@ def make_class( kw_only: bool = ..., cache_hash: bool = ..., auto_exc: bool = ..., - eq: Optional[_EqOrderType] = ..., - order: Optional[_EqOrderType] = ..., + eq: _EqOrderType | None = ..., + order: _EqOrderType | None = ..., collect_by_mro: bool = ..., - on_setattr: Optional[_OnSetAttrArgType] = ..., - field_transformer: Optional[_FieldTransformer] = ..., + on_setattr: _OnSetAttrArgType | None = ..., + field_transformer: _FieldTransformer | None = ..., ) -> type: ... # _funcs -- @@ -522,24 +358,22 @@ def make_class( def asdict( inst: AttrsInstance, recurse: bool = ..., - filter: Optional[_FilterType[Any]] = ..., - dict_factory: Type[Mapping[Any, Any]] = ..., + filter: _FilterType[Any] | None = ..., + dict_factory: type[Mapping[Any, Any]] = ..., retain_collection_types: bool = ..., - value_serializer: Optional[ - Callable[[type, Attribute[Any], Any], Any] - ] = ..., - tuple_keys: Optional[bool] = ..., -) -> Dict[str, Any]: ... + value_serializer: Callable[[type, Attribute[Any], Any], Any] | None = ..., + tuple_keys: bool | None = ..., +) -> dict[str, Any]: ... # TODO: add support for returning NamedTuple from the mypy plugin def astuple( inst: AttrsInstance, recurse: bool = ..., - filter: Optional[_FilterType[Any]] = ..., - tuple_factory: Type[Sequence[Any]] = ..., + filter: _FilterType[Any] | None = ..., + tuple_factory: type[Sequence[Any]] = ..., retain_collection_types: bool = ..., -) -> Tuple[Any, ...]: ... -def has(cls: type) -> TypeGuard[Type[AttrsInstance]]: ... +) -> tuple[Any, ...]: ... +def has(cls: type) -> TypeGuard[type[AttrsInstance]]: ... def assoc(inst: _T, **changes: Any) -> _T: ... def evolve(inst: _T, **changes: Any) -> _T: ... 
diff --git a/tools/third_party/attrs/src/attr/_cmp.py b/tools/third_party/attrs/src/attr/_cmp.py index a4a35e08fc9d9b..09bab491f83ef4 100644 --- a/tools/third_party/attrs/src/attr/_cmp.py +++ b/tools/third_party/attrs/src/attr/_cmp.py @@ -4,7 +4,7 @@ import functools import types -from ._make import _make_ne +from ._make import __ne__ _operation_names = {"eq": "==", "lt": "<", "le": "<=", "gt": ">", "ge": ">="} @@ -26,21 +26,31 @@ def cmp_using( The resulting class will have a full set of ordering methods if at least one of ``{lt, le, gt, ge}`` and ``eq`` are provided. - :param Optional[callable] eq: `callable` used to evaluate equality of two - objects. - :param Optional[callable] lt: `callable` used to evaluate whether one - object is less than another object. - :param Optional[callable] le: `callable` used to evaluate whether one - object is less than or equal to another object. - :param Optional[callable] gt: `callable` used to evaluate whether one - object is greater than another object. - :param Optional[callable] ge: `callable` used to evaluate whether one - object is greater than or equal to another object. + Args: + eq (typing.Callable | None): + Callable used to evaluate equality of two objects. - :param bool require_same_type: When `True`, equality and ordering methods - will return `NotImplemented` if objects are not of the same type. + lt (typing.Callable | None): + Callable used to evaluate whether one object is less than another + object. - :param Optional[str] class_name: Name of class. Defaults to 'Comparable'. + le (typing.Callable | None): + Callable used to evaluate whether one object is less than or equal + to another object. + + gt (typing.Callable | None): + Callable used to evaluate whether one object is greater than + another object. + + ge (typing.Callable | None): + Callable used to evaluate whether one object is greater than or + equal to another object. 
+ + require_same_type (bool): + When `True`, equality and ordering methods will return + `NotImplemented` if objects are not of the same type. + + class_name (str | None): Name of class. Defaults to "Comparable". See `comparison` for more details. @@ -61,7 +71,7 @@ def cmp_using( if eq is not None: has_eq_function = True body["__eq__"] = _make_operator("eq", eq) - body["__ne__"] = _make_ne() + body["__ne__"] = __ne__ if lt is not None: num_order_functions += 1 diff --git a/tools/third_party/attrs/src/attr/_cmp.pyi b/tools/third_party/attrs/src/attr/_cmp.pyi index f3dcdc1a754146..cc7893b04520af 100644 --- a/tools/third_party/attrs/src/attr/_cmp.pyi +++ b/tools/third_party/attrs/src/attr/_cmp.pyi @@ -1,13 +1,13 @@ -from typing import Any, Callable, Optional, Type +from typing import Any, Callable _CompareWithType = Callable[[Any, Any], bool] def cmp_using( - eq: Optional[_CompareWithType] = ..., - lt: Optional[_CompareWithType] = ..., - le: Optional[_CompareWithType] = ..., - gt: Optional[_CompareWithType] = ..., - ge: Optional[_CompareWithType] = ..., + eq: _CompareWithType | None = ..., + lt: _CompareWithType | None = ..., + le: _CompareWithType | None = ..., + gt: _CompareWithType | None = ..., + ge: _CompareWithType | None = ..., require_same_type: bool = ..., class_name: str = ..., -) -> Type: ... +) -> type: ... 
diff --git a/tools/third_party/attrs/src/attr/_compat.py b/tools/third_party/attrs/src/attr/_compat.py index 46b05ca453773d..22fcd78387b7b3 100644 --- a/tools/third_party/attrs/src/attr/_compat.py +++ b/tools/third_party/attrs/src/attr/_compat.py @@ -10,19 +10,26 @@ PYPY = platform.python_implementation() == "PyPy" -PY_3_8_PLUS = sys.version_info[:2] >= (3, 8) PY_3_9_PLUS = sys.version_info[:2] >= (3, 9) -PY310 = sys.version_info[:2] >= (3, 10) +PY_3_10_PLUS = sys.version_info[:2] >= (3, 10) +PY_3_11_PLUS = sys.version_info[:2] >= (3, 11) PY_3_12_PLUS = sys.version_info[:2] >= (3, 12) +PY_3_13_PLUS = sys.version_info[:2] >= (3, 13) +PY_3_14_PLUS = sys.version_info[:2] >= (3, 14) -if sys.version_info < (3, 8): - try: - from typing_extensions import Protocol - except ImportError: # pragma: no cover - Protocol = object +if PY_3_14_PLUS: # pragma: no cover + import annotationlib + + _get_annotations = annotationlib.get_annotations + else: - from typing import Protocol # noqa: F401 + + def _get_annotations(cls): + """ + Get annotations for *cls*. 
+ """ + return cls.__dict__.get("__annotations__", {}) class _AnnotationExtractor: diff --git a/tools/third_party/attrs/src/attr/_config.py b/tools/third_party/attrs/src/attr/_config.py index 9c245b1461abd5..4b257726fb1e8b 100644 --- a/tools/third_party/attrs/src/attr/_config.py +++ b/tools/third_party/attrs/src/attr/_config.py @@ -1,6 +1,6 @@ # SPDX-License-Identifier: MIT -__all__ = ["set_run_validators", "get_run_validators"] +__all__ = ["get_run_validators", "set_run_validators"] _run_validators = True diff --git a/tools/third_party/attrs/src/attr/_funcs.py b/tools/third_party/attrs/src/attr/_funcs.py index a888991d98fdac..c39fb8aa5a9426 100644 --- a/tools/third_party/attrs/src/attr/_funcs.py +++ b/tools/third_party/attrs/src/attr/_funcs.py @@ -4,7 +4,7 @@ import copy from ._compat import PY_3_9_PLUS, get_generic_base -from ._make import NOTHING, _obj_setattr, fields +from ._make import _OBJ_SETATTR, NOTHING, fields from .exceptions import AttrsAttributeNotFoundError @@ -21,34 +21,44 @@ def asdict( Optionally recurse into other *attrs*-decorated classes. - :param inst: Instance of an *attrs*-decorated class. - :param bool recurse: Recurse into classes that are also - *attrs*-decorated. - :param callable filter: A callable whose return code determines whether an - attribute or element is included (``True``) or dropped (``False``). Is - called with the `attrs.Attribute` as the first argument and the - value as the second argument. - :param callable dict_factory: A callable to produce dictionaries from. For - example, to produce ordered dictionaries instead of normal Python - dictionaries, pass in ``collections.OrderedDict``. - :param bool retain_collection_types: Do not convert to ``list`` when - encountering an attribute whose type is ``tuple`` or ``set``. Only - meaningful if ``recurse`` is ``True``. - :param Optional[callable] value_serializer: A hook that is called for every - attribute or dict key/value. 
It receives the current instance, field - and value and must return the (updated) value. The hook is run *after* - the optional *filter* has been applied. - - :rtype: return type of *dict_factory* - - :raise attrs.exceptions.NotAnAttrsClassError: If *cls* is not an *attrs* - class. + Args: + inst: Instance of an *attrs*-decorated class. + + recurse (bool): Recurse into classes that are also *attrs*-decorated. + + filter (~typing.Callable): + A callable whose return code determines whether an attribute or + element is included (`True`) or dropped (`False`). Is called with + the `attrs.Attribute` as the first argument and the value as the + second argument. + + dict_factory (~typing.Callable): + A callable to produce dictionaries from. For example, to produce + ordered dictionaries instead of normal Python dictionaries, pass in + ``collections.OrderedDict``. + + retain_collection_types (bool): + Do not convert to `list` when encountering an attribute whose type + is `tuple` or `set`. Only meaningful if *recurse* is `True`. + + value_serializer (typing.Callable | None): + A hook that is called for every attribute or dict key/value. It + receives the current instance, field and value and must return the + (updated) value. The hook is run *after* the optional *filter* has + been applied. + + Returns: + Return type of *dict_factory*. + + Raises: + attrs.exceptions.NotAnAttrsClassError: + If *cls* is not an *attrs* class. .. versionadded:: 16.0.0 *dict_factory* .. versionadded:: 16.1.0 *retain_collection_types* .. versionadded:: 20.3.0 *value_serializer* - .. versionadded:: 21.3.0 If a dict has a collection for a key, it is - serialized as a tuple. + .. versionadded:: 21.3.0 + If a dict has a collection for a key, it is serialized as a tuple. """ attrs = fields(inst.__class__) rv = dict_factory() @@ -206,24 +216,33 @@ def astuple( Optionally recurse into other *attrs*-decorated classes. - :param inst: Instance of an *attrs*-decorated class. 
- :param bool recurse: Recurse into classes that are also - *attrs*-decorated. - :param callable filter: A callable whose return code determines whether an - attribute or element is included (``True``) or dropped (``False``). Is - called with the `attrs.Attribute` as the first argument and the - value as the second argument. - :param callable tuple_factory: A callable to produce tuples from. For - example, to produce lists instead of tuples. - :param bool retain_collection_types: Do not convert to ``list`` - or ``dict`` when encountering an attribute which type is - ``tuple``, ``dict`` or ``set``. Only meaningful if ``recurse`` is - ``True``. - - :rtype: return type of *tuple_factory* - - :raise attrs.exceptions.NotAnAttrsClassError: If *cls* is not an *attrs* - class. + Args: + inst: Instance of an *attrs*-decorated class. + + recurse (bool): + Recurse into classes that are also *attrs*-decorated. + + filter (~typing.Callable): + A callable whose return code determines whether an attribute or + element is included (`True`) or dropped (`False`). Is called with + the `attrs.Attribute` as the first argument and the value as the + second argument. + + tuple_factory (~typing.Callable): + A callable to produce tuples from. For example, to produce lists + instead of tuples. + + retain_collection_types (bool): + Do not convert to `list` or `dict` when encountering an attribute + which type is `tuple`, `dict` or `set`. Only meaningful if + *recurse* is `True`. + + Returns: + Return type of *tuple_factory* + + Raises: + attrs.exceptions.NotAnAttrsClassError: + If *cls* is not an *attrs* class. .. 
versionadded:: 16.2.0 """ @@ -248,15 +267,17 @@ def astuple( elif isinstance(v, (tuple, list, set, frozenset)): cf = v.__class__ if retain is True else list items = [ - astuple( - j, - recurse=True, - filter=filter, - tuple_factory=tuple_factory, - retain_collection_types=retain, + ( + astuple( + j, + recurse=True, + filter=filter, + tuple_factory=tuple_factory, + retain_collection_types=retain, + ) + if has(j.__class__) + else j ) - if has(j.__class__) - else j for j in v ] try: @@ -272,20 +293,24 @@ def astuple( rv.append( df( ( - astuple( - kk, - tuple_factory=tuple_factory, - retain_collection_types=retain, - ) - if has(kk.__class__) - else kk, - astuple( - vv, - tuple_factory=tuple_factory, - retain_collection_types=retain, - ) - if has(vv.__class__) - else vv, + ( + astuple( + kk, + tuple_factory=tuple_factory, + retain_collection_types=retain, + ) + if has(kk.__class__) + else kk + ), + ( + astuple( + vv, + tuple_factory=tuple_factory, + retain_collection_types=retain, + ) + if has(vv.__class__) + else vv + ), ) for kk, vv in v.items() ) @@ -302,10 +327,14 @@ def has(cls): """ Check whether *cls* is a class with *attrs* attributes. - :param type cls: Class to introspect. - :raise TypeError: If *cls* is not a class. + Args: + cls (type): Class to introspect. + + Raises: + TypeError: If *cls* is not a class. - :rtype: bool + Returns: + bool: """ attrs = getattr(cls, "__attrs_attrs__", None) if attrs is not None: @@ -334,20 +363,25 @@ def assoc(inst, **changes): .. _`edge cases`: https://github.com/python-attrs/attrs/issues/251 - :param inst: Instance of a class with *attrs* attributes. - :param changes: Keyword changes in the new copy. + Args: + inst: Instance of a class with *attrs* attributes. - :return: A copy of inst with *changes* incorporated. + changes: Keyword changes in the new copy. - :raise attrs.exceptions.AttrsAttributeNotFoundError: If *attr_name* - couldn't be found on *cls*. 
- :raise attrs.exceptions.NotAnAttrsClassError: If *cls* is not an *attrs* - class. + Returns: + A copy of inst with *changes* incorporated. + + Raises: + attrs.exceptions.AttrsAttributeNotFoundError: + If *attr_name* couldn't be found on *cls*. + + attrs.exceptions.NotAnAttrsClassError: + If *cls* is not an *attrs* class. .. deprecated:: 17.1.0 - Use `attrs.evolve` instead if you can. - This function will not be removed du to the slightly different approach - compared to `attrs.evolve`. + Use `attrs.evolve` instead if you can. This function will not be + removed du to the slightly different approach compared to + `attrs.evolve`, though. """ new = copy.copy(inst) attrs = fields(inst.__class__) @@ -356,109 +390,60 @@ def assoc(inst, **changes): if a is NOTHING: msg = f"{k} is not an attrs attribute on {new.__class__}." raise AttrsAttributeNotFoundError(msg) - _obj_setattr(new, k, v) + _OBJ_SETATTR(new, k, v) return new -def evolve(*args, **changes): +def resolve_types( + cls, globalns=None, localns=None, attribs=None, include_extras=True +): """ - Create a new instance, based on the first positional argument with - *changes* applied. + Resolve any strings and forward annotations in type annotations. - :param inst: Instance of a class with *attrs* attributes. - :param changes: Keyword changes in the new copy. + This is only required if you need concrete types in :class:`Attribute`'s + *type* field. In other words, you don't need to resolve your types if you + only use them for static type checking. - :return: A copy of inst with *changes* incorporated. + With no arguments, names will be looked up in the module in which the class + was created. If this is not what you want, for example, if the name only + exists inside a method, you may pass *globalns* or *localns* to specify + other dictionaries in which to look up these names. See the docs of + `typing.get_type_hints` for more details. - :raise TypeError: If *attr_name* couldn't be found in the class - ``__init__``. 
- :raise attrs.exceptions.NotAnAttrsClassError: If *cls* is not an *attrs* - class. + Args: + cls (type): Class to resolve. - .. versionadded:: 17.1.0 - .. deprecated:: 23.1.0 - It is now deprecated to pass the instance using the keyword argument - *inst*. It will raise a warning until at least April 2024, after which - it will become an error. Always pass the instance as a positional - argument. - """ - # Try to get instance by positional argument first. - # Use changes otherwise and warn it'll break. - if args: - try: - (inst,) = args - except ValueError: - msg = f"evolve() takes 1 positional argument, but {len(args)} were given" - raise TypeError(msg) from None - else: - try: - inst = changes.pop("inst") - except KeyError: - msg = "evolve() missing 1 required positional argument: 'inst'" - raise TypeError(msg) from None - - import warnings - - warnings.warn( - "Passing the instance per keyword argument is deprecated and " - "will stop working in, or after, April 2024.", - DeprecationWarning, - stacklevel=2, - ) + globalns (dict | None): Dictionary containing global variables. - cls = inst.__class__ - attrs = fields(cls) - for a in attrs: - if not a.init: - continue - attr_name = a.name # To deal with private attributes. - init_name = a.alias - if init_name not in changes: - changes[init_name] = getattr(inst, attr_name) + localns (dict | None): Dictionary containing local variables. - return cls(**changes) + attribs (list | None): + List of attribs for the given class. This is necessary when calling + from inside a ``field_transformer`` since *cls* is not an *attrs* + class yet. + include_extras (bool): + Resolve more accurately, if possible. Pass ``include_extras`` to + ``typing.get_hints``, if supported by the typing module. On + supported Python versions (3.9+), this resolves the types more + accurately. 
-def resolve_types( - cls, globalns=None, localns=None, attribs=None, include_extras=True -): - """ - Resolve any strings and forward annotations in type annotations. + Raises: + TypeError: If *cls* is not a class. - This is only required if you need concrete types in `Attribute`'s *type* - field. In other words, you don't need to resolve your types if you only - use them for static type checking. + attrs.exceptions.NotAnAttrsClassError: + If *cls* is not an *attrs* class and you didn't pass any attribs. - With no arguments, names will be looked up in the module in which the class - was created. If this is not what you want, e.g. if the name only exists - inside a method, you may pass *globalns* or *localns* to specify other - dictionaries in which to look up these names. See the docs of - `typing.get_type_hints` for more details. + NameError: If types cannot be resolved because of missing variables. - :param type cls: Class to resolve. - :param Optional[dict] globalns: Dictionary containing global variables. - :param Optional[dict] localns: Dictionary containing local variables. - :param Optional[list] attribs: List of attribs for the given class. - This is necessary when calling from inside a ``field_transformer`` - since *cls* is not an *attrs* class yet. - :param bool include_extras: Resolve more accurately, if possible. - Pass ``include_extras`` to ``typing.get_hints``, if supported by the - typing module. On supported Python versions (3.9+), this resolves the - types more accurately. - - :raise TypeError: If *cls* is not a class. - :raise attrs.exceptions.NotAnAttrsClassError: If *cls* is not an *attrs* - class and you didn't pass any attribs. - :raise NameError: If types cannot be resolved because of missing variables. - - :returns: *cls* so you can use this function also as a class decorator. - Please note that you have to apply it **after** `attrs.define`. That - means the decorator has to come in the line **before** `attrs.define`. 
+ Returns: + *cls* so you can use this function also as a class decorator. Please + note that you have to apply it **after** `attrs.define`. That means the + decorator has to come in the line **before** `attrs.define`. .. versionadded:: 20.1.0 .. versionadded:: 21.1.0 *attribs* .. versionadded:: 23.1.0 *include_extras* - """ # Since calling get_type_hints is expensive we cache whether we've # done it already. @@ -474,7 +459,7 @@ class and you didn't pass any attribs. for field in fields(cls) if attribs is None else attribs: if field.name in hints: # Since fields have been frozen we must work around it. - _obj_setattr(field, "type", hints[field.name]) + _OBJ_SETATTR(field, "type", hints[field.name]) # We store the class we resolved so that subclasses know they haven't # been resolved. cls.__attrs_types_resolved__ = cls diff --git a/tools/third_party/attrs/src/attr/_make.py b/tools/third_party/attrs/src/attr/_make.py index 10b4eca779621c..e84d9792a744f3 100644 --- a/tools/third_party/attrs/src/attr/_make.py +++ b/tools/third_party/attrs/src/attr/_make.py @@ -1,25 +1,31 @@ # SPDX-License-Identifier: MIT +from __future__ import annotations + +import abc import contextlib import copy import enum -import functools import inspect import itertools import linecache import sys import types -import typing +import unicodedata -from operator import itemgetter +from collections.abc import Callable, Mapping +from functools import cached_property +from typing import Any, NamedTuple, TypeVar # We need to import _compat itself in addition to the _compat members to avoid # having the thread-local in the globals here. from . import _compat, _config, setters from ._compat import ( - PY310, - PY_3_8_PLUS, + PY_3_10_PLUS, + PY_3_11_PLUS, + PY_3_13_PLUS, _AnnotationExtractor, + _get_annotations, get_generic_base, ) from .exceptions import ( @@ -31,10 +37,9 @@ # This is used at least twice, so cache it here. 
-_obj_setattr = object.__setattr__ -_init_converter_pat = "__attr_converter_%s" -_init_factory_pat = "__attr_factory_%s" -_classvar_prefixes = ( +_OBJ_SETATTR = object.__setattr__ +_INIT_FACTORY_PAT = "__attr_factory_%s" +_CLASSVAR_PREFIXES = ( "typing.ClassVar", "t.ClassVar", "ClassVar", @@ -43,19 +48,19 @@ # we don't use a double-underscore prefix because that triggers # name mangling when trying to create a slot for the field # (when slots=True) -_hash_cache_field = "_attrs_cached_hash" +_HASH_CACHE_FIELD = "_attrs_cached_hash" -_empty_metadata_singleton = types.MappingProxyType({}) +_EMPTY_METADATA_SINGLETON = types.MappingProxyType({}) # Unique object for unequivocal getattr() defaults. -_sentinel = object() +_SENTINEL = object() -_ng_default_on_setattr = setters.pipe(setters.convert, setters.validate) +_DEFAULT_ON_SETATTR = setters.pipe(setters.convert, setters.validate) class _Nothing(enum.Enum): """ - Sentinel to indicate the lack of a value when ``None`` is ambiguous. + Sentinel to indicate the lack of a value when `None` is ambiguous. If extending attrs, you can use ``typing.Literal[NOTHING]`` to show that a value may be ``NOTHING``. @@ -75,7 +80,9 @@ def __bool__(self): NOTHING = _Nothing.NOTHING """ -Sentinel to indicate the lack of a value when ``None`` is ambiguous. +Sentinel to indicate the lack of a value when `None` is ambiguous. + +When using in 3rd party code, use `attrs.NothingType` for type annotations. """ @@ -84,7 +91,7 @@ class _CacheHashWrapper(int): An integer subclass that pickles / copies as None This is used for non-slots classes with ``cache_hash=True``, to avoid - serializing a potentially (even likely) invalid hash value. Since ``None`` + serializing a potentially (even likely) invalid hash value. Since `None` is the default value for uncalculated hashes, whenever this is copied, the copy's value for the hash should automatically reset. @@ -113,137 +120,29 @@ def attrib( alias=None, ): """ - Create a new attribute on a class. - - .. 
warning:: - - Does *not* do anything unless the class is also decorated with `attr.s` - / `attrs.define` / and so on! - - Please consider using `attrs.field` in new code (``attr.ib`` will *never* - go away, though). - - :param default: A value that is used if an *attrs*-generated ``__init__`` - is used and no value is passed while instantiating or the attribute is - excluded using ``init=False``. - - If the value is an instance of `attrs.Factory`, its callable will be - used to construct a new value (useful for mutable data types like lists - or dicts). - - If a default is not set (or set manually to `attrs.NOTHING`), a value - *must* be supplied when instantiating; otherwise a `TypeError` will be - raised. - - The default can also be set using decorator notation as shown below. - - .. seealso:: `defaults` - - :param callable factory: Syntactic sugar for - ``default=attr.Factory(factory)``. - - :param validator: `callable` that is called by *attrs*-generated - ``__init__`` methods after the instance has been initialized. They - receive the initialized instance, the :func:`~attrs.Attribute`, and the - passed value. - - The return value is *not* inspected so the validator has to throw an - exception itself. - - If a `list` is passed, its items are treated as validators and must all - pass. - - Validators can be globally disabled and re-enabled using - `attrs.validators.get_disabled` / `attrs.validators.set_disabled`. - - The validator can also be set using decorator notation as shown below. - - .. seealso:: :ref:`validators` - - :type validator: `callable` or a `list` of `callable`\\ s. - - :param repr: Include this attribute in the generated ``__repr__`` method. - If ``True``, include the attribute; if ``False``, omit it. By default, - the built-in ``repr()`` function is used. To override how the attribute - value is formatted, pass a ``callable`` that takes a single value and - returns a string. Note that the resulting string is used as-is, i.e. 
it - will be used directly *instead* of calling ``repr()`` (the default). - :type repr: a `bool` or a `callable` to use a custom function. - - :param eq: If ``True`` (default), include this attribute in the generated - ``__eq__`` and ``__ne__`` methods that check two instances for - equality. To override how the attribute value is compared, pass a - ``callable`` that takes a single value and returns the value to be - compared. + Create a new field / attribute on a class. - .. seealso:: `comparison` - :type eq: a `bool` or a `callable`. + Identical to `attrs.field`, except it's not keyword-only. - :param order: If ``True`` (default), include this attributes in the - generated ``__lt__``, ``__le__``, ``__gt__`` and ``__ge__`` methods. To - override how the attribute value is ordered, pass a ``callable`` that - takes a single value and returns the value to be ordered. + Consider using `attrs.field` in new code (``attr.ib`` will *never* go away, + though). - .. seealso:: `comparison` - :type order: a `bool` or a `callable`. - - :param cmp: Setting *cmp* is equivalent to setting *eq* and *order* to the - same value. Must not be mixed with *eq* or *order*. - - .. seealso:: `comparison` - :type cmp: a `bool` or a `callable`. - - :param bool | None hash: Include this attribute in the generated - ``__hash__`` method. If ``None`` (default), mirror *eq*'s value. This - is the correct behavior according the Python spec. Setting this value - to anything else than ``None`` is *discouraged*. - - .. seealso:: `hashing` - :param bool init: Include this attribute in the generated ``__init__`` - method. It is possible to set this to ``False`` and set a default - value. In that case this attributed is unconditionally initialized - with the specified default value or factory. - - .. seealso:: `init` - :param callable converter: `callable` that is called by *attrs*-generated - ``__init__`` methods to convert attribute's value to the desired - format. 
It is given the passed-in value, and the returned value will - be used as the new value of the attribute. The value is converted - before being passed to the validator, if any. + .. warning:: - .. seealso:: :ref:`converters` - :param dict | None metadata: An arbitrary mapping, to be used by - third-party components. See `extending-metadata`. + Does **nothing** unless the class is also decorated with + `attr.s` (or similar)! - :param type: The type of the attribute. Nowadays, the preferred method to - specify the type is using a variable annotation (see :pep:`526`). This - argument is provided for backward compatibility. Regardless of the - approach used, the type will be stored on ``Attribute.type``. - - Please note that *attrs* doesn't do anything with this metadata by - itself. You can use it as part of your own code or for `static type - checking `. - :param bool kw_only: Make this attribute keyword-only in the generated - ``__init__`` (if ``init`` is ``False``, this parameter is ignored). - :param on_setattr: Allows to overwrite the *on_setattr* setting from - `attr.s`. If left `None`, the *on_setattr* value from `attr.s` is used. - Set to `attrs.setters.NO_OP` to run **no** `setattr` hooks for this - attribute -- regardless of the setting in `attr.s`. - :type on_setattr: `callable`, or a list of callables, or `None`, or - `attrs.setters.NO_OP` - :param str | None alias: Override this attribute's parameter name in the - generated ``__init__`` method. If left `None`, default to ``name`` - stripped of leading underscores. See `private-attributes`. .. versionadded:: 15.2.0 *convert* .. versionadded:: 16.3.0 *metadata* .. versionchanged:: 17.1.0 *validator* can be a ``list`` now. .. versionchanged:: 17.1.0 - *hash* is ``None`` and therefore mirrors *eq* by default. + *hash* is `None` and therefore mirrors *eq* by default. .. versionadded:: 17.3.0 *type* .. deprecated:: 17.4.0 *convert* - .. 
versionadded:: 17.4.0 *converter* as a replacement for the deprecated - *convert* to achieve consistency with other noun-based arguments. + .. versionadded:: 17.4.0 + *converter* as a replacement for the deprecated *convert* to achieve + consistency with other noun-based arguments. .. versionadded:: 18.1.0 ``factory=f`` is syntactic sugar for ``default=attr.Factory(f)``. .. versionadded:: 18.2.0 *kw_only* @@ -310,19 +209,31 @@ def attrib( ) -def _compile_and_eval(script, globs, locs=None, filename=""): +def _compile_and_eval( + script: str, + globs: dict[str, Any] | None, + locs: Mapping[str, object] | None = None, + filename: str = "", +) -> None: """ - "Exec" the script with the given global (globs) and local (locs) variables. + Evaluate the script with the given global (globs) and local (locs) + variables. """ bytecode = compile(script, filename, "exec") eval(bytecode, globs, locs) -def _make_method(name, script, filename, globs): +def _linecache_and_compile( + script: str, + filename: str, + globs: dict[str, Any] | None, + locals: Mapping[str, object] | None = None, +) -> dict[str, Any]: """ - Create the method with the script given and return the method object. + Cache the script with _linecache_, compile it and return the _locals_. """ - locs = {} + + locs = {} if locals is None else locals # In order of debuggers like PDB being able to step through the code, # we add a fake linecache entry. @@ -344,10 +255,10 @@ def _make_method(name, script, filename, globs): _compile_and_eval(script, globs, locs, filename) - return locs[name] + return locs -def _make_attr_tuple_class(cls_name, attr_names): +def _make_attr_tuple_class(cls_name: str, attr_names: list[str]) -> type: """ Create a tuple subclass to hold `Attribute`s for an `attrs` class. 
@@ -358,35 +269,22 @@ class MyClassAttributes(tuple): x = property(itemgetter(0)) """ attr_class_name = f"{cls_name}Attributes" - attr_class_template = [ - f"class {attr_class_name}(tuple):", - " __slots__ = ()", - ] - if attr_names: - for i, attr_name in enumerate(attr_names): - attr_class_template.append( - f" {attr_name} = _attrs_property(_attrs_itemgetter({i}))" - ) - else: - attr_class_template.append(" pass") - globs = {"_attrs_itemgetter": itemgetter, "_attrs_property": property} - _compile_and_eval("\n".join(attr_class_template), globs) - return globs[attr_class_name] + body = {} + for i, attr_name in enumerate(attr_names): + + def getter(self, i=i): + return self[i] + + body[attr_name] = property(getter) + return type(attr_class_name, (tuple,), body) # Tuple class for extracted attributes from a class definition. # `base_attrs` is a subset of `attrs`. -_Attributes = _make_attr_tuple_class( - "_Attributes", - [ - # all attributes to build dunder methods for - "attrs", - # attributes that have been inherited - "base_attrs", - # map inherited attributes to their originating classes - "base_attrs_map", - ], -) +class _Attributes(NamedTuple): + attrs: type + base_attrs: list[Attribute] + base_attrs_map: dict[str, type] def _is_class_var(annot): @@ -403,36 +301,19 @@ def _is_class_var(annot): if annot.startswith(("'", '"')) and annot.endswith(("'", '"')): annot = annot[1:-1] - return annot.startswith(_classvar_prefixes) + return annot.startswith(_CLASSVAR_PREFIXES) def _has_own_attribute(cls, attrib_name): """ Check whether *cls* defines *attrib_name* (and doesn't just inherit it). """ - attr = getattr(cls, attrib_name, _sentinel) - if attr is _sentinel: - return False - - for base_cls in cls.__mro__[1:]: - a = getattr(base_cls, attrib_name, None) - if attr is a: - return False - - return True - - -def _get_annotations(cls): - """ - Get annotations for *cls*. 
- """ - if _has_own_attribute(cls, "__annotations__"): - return cls.__annotations__ - - return {} + return attrib_name in cls.__dict__ -def _collect_base_attrs(cls, taken_attr_names): +def _collect_base_attrs( + cls, taken_attr_names +) -> tuple[list[Attribute], dict[str, type]]: """ Collect attr.ibs from base classes of *cls*, except *taken_attr_names*. """ @@ -493,14 +374,14 @@ def _collect_base_attrs_broken(cls, taken_attr_names): def _transform_attrs( cls, these, auto_attribs, kw_only, collect_by_mro, field_transformer -): +) -> _Attributes: """ Transform all `_CountingAttr`s on a class into `Attribute`s. If *these* is passed, use that and don't look for them on the class. - *collect_by_mro* is True, collect them in the correct MRO order, otherwise - use the old -- incorrect -- order. See #428. + If *collect_by_mro* is True, collect them in the correct MRO order, + otherwise use the old -- incorrect -- order. See #428. Return an `_Attributes`. """ @@ -513,7 +394,7 @@ def _transform_attrs( ca_names = { name for name, attr in cd.items() - if isinstance(attr, _CountingAttr) + if attr.__class__ is _CountingAttr } ca_list = [] annot_names = set() @@ -523,12 +404,12 @@ def _transform_attrs( annot_names.add(attr_name) a = cd.get(attr_name, NOTHING) - if not isinstance(a, _CountingAttr): - a = attrib() if a is NOTHING else attrib(default=a) + if a.__class__ is not _CountingAttr: + a = attrib(a) ca_list.append((attr_name, a)) unannotated = ca_names - annot_names - if len(unannotated) > 0: + if unannotated: raise UnannotatedAttributeError( "The following `attr.ib`s lack a type annotation: " + ", ".join( @@ -541,16 +422,14 @@ def _transform_attrs( ( (name, attr) for name, attr in cd.items() - if isinstance(attr, _CountingAttr) + if attr.__class__ is _CountingAttr ), key=lambda e: e[1].counter, ) + fca = Attribute.from_counting_attr own_attrs = [ - Attribute.from_counting_attr( - name=attr_name, ca=ca, type=anns.get(attr_name) - ) - for attr_name, ca in ca_list + 
fca(attr_name, ca, anns.get(attr_name)) for attr_name, ca in ca_list ] if collect_by_mro: @@ -568,6 +447,10 @@ def _transform_attrs( attrs = base_attrs + own_attrs + if field_transformer is not None: + attrs = tuple(field_transformer(cls, attrs)) + + # Check attr order after executing the field_transformer. # Mandatory vs non-mandatory attr order only matters when they are part of # the __init__ signature and when they aren't kw_only (which are moved to # the end and can be mandatory or non-mandatory in any order, as they will @@ -581,34 +464,27 @@ def _transform_attrs( if had_default is False and a.default is not NOTHING: had_default = True - if field_transformer is not None: - attrs = field_transformer(cls, attrs) - # Resolve default field alias after executing field_transformer. # This allows field_transformer to differentiate between explicit vs # default aliases and supply their own defaults. - attrs = [ - a.evolve(alias=_default_init_alias_for(a.name)) if not a.alias else a - for a in attrs - ] + for a in attrs: + if not a.alias: + # Evolve is very slow, so we hold our nose and do it dirty. + _OBJ_SETATTR.__get__(a)("alias", _default_init_alias_for(a.name)) # Create AttrsClass *after* applying the field_transformer since it may # add or remove attributes! attr_names = [a.name for a in attrs] AttrsClass = _make_attr_tuple_class(cls.__name__, attr_names) - return _Attributes((AttrsClass(attrs), base_attrs, base_attr_map)) + return _Attributes(AttrsClass(attrs), base_attrs, base_attr_map) -def _make_cached_property_getattr( - cached_properties, - original_getattr, - cls, -): +def _make_cached_property_getattr(cached_properties, original_getattr, cls): lines = [ # Wrapped to get `__class__` into closure cell for super() # (It will be replaced with the newly constructed class after construction). 
- "def wrapper():", + "def wrapper(_cls):", " __class__ = _cls", " def __getattr__(self, item, cached_properties=cached_properties, original_getattr=original_getattr, _cached_setattr_get=_cached_setattr_get):", " func = cached_properties.get(item)", @@ -625,8 +501,12 @@ def _make_cached_property_getattr( else: lines.extend( [ - " if hasattr(super(), '__getattr__'):", - " return super().__getattr__(item)", + " try:", + " return super().__getattribute__(item)", + " except AttributeError:", + " if not hasattr(super(), '__getattr__'):", + " raise", + " return super().__getattr__(item)", " original_error = f\"'{self.__class__.__name__}' object has no attribute '{item}'\"", " raise AttributeError(original_error)", ] @@ -635,7 +515,7 @@ def _make_cached_property_getattr( lines.extend( [ " return __getattr__", - "__getattr__ = wrapper()", + "__getattr__ = wrapper(_cls)", ] ) @@ -643,17 +523,13 @@ def _make_cached_property_getattr( glob = { "cached_properties": cached_properties, - "_cached_setattr_get": _obj_setattr.__get__, - "_cls": cls, + "_cached_setattr_get": _OBJ_SETATTR.__get__, "original_getattr": original_getattr, } - return _make_method( - "__getattr__", - "\n".join(lines), - unique_filename, - glob, - ) + return _linecache_and_compile( + "\n".join(lines), unique_filename, glob, locals={"_cls": cls} + )["__getattr__"] def _frozen_setattrs(self, name, value): @@ -664,18 +540,82 @@ def _frozen_setattrs(self, name, value): "__cause__", "__context__", "__traceback__", + "__suppress_context__", + "__notes__", ): BaseException.__setattr__(self, name, value) return - raise FrozenInstanceError() + raise FrozenInstanceError def _frozen_delattrs(self, name): """ Attached to frozen classes as __delattr__. 
""" - raise FrozenInstanceError() + if isinstance(self, BaseException) and name in ("__notes__",): + BaseException.__delattr__(self, name) + return + + raise FrozenInstanceError + + +def evolve(*args, **changes): + """ + Create a new instance, based on the first positional argument with + *changes* applied. + + .. tip:: + + On Python 3.13 and later, you can also use `copy.replace` instead. + + Args: + + inst: + Instance of a class with *attrs* attributes. *inst* must be passed + as a positional argument. + + changes: + Keyword changes in the new copy. + + Returns: + A copy of inst with *changes* incorporated. + + Raises: + TypeError: + If *attr_name* couldn't be found in the class ``__init__``. + + attrs.exceptions.NotAnAttrsClassError: + If *cls* is not an *attrs* class. + + .. versionadded:: 17.1.0 + .. deprecated:: 23.1.0 + It is now deprecated to pass the instance using the keyword argument + *inst*. It will raise a warning until at least April 2024, after which + it will become an error. Always pass the instance as a positional + argument. + .. versionchanged:: 24.1.0 + *inst* can't be passed as a keyword argument anymore. + """ + try: + (inst,) = args + except ValueError: + msg = ( + f"evolve() takes 1 positional argument, but {len(args)} were given" + ) + raise TypeError(msg) from None + + cls = inst.__class__ + attrs = fields(cls) + for a in attrs: + if not a.init: + continue + attr_name = a.name # To deal with private attributes. 
+ init_name = a.alias + if init_name not in changes: + changes[init_name] = getattr(inst, attr_name) + + return cls(**changes) class _ClassBuilder: @@ -684,6 +624,7 @@ class _ClassBuilder: """ __slots__ = ( + "_add_method_dunders", "_attr_names", "_attrs", "_base_attr_map", @@ -693,20 +634,22 @@ class _ClassBuilder: "_cls_dict", "_delete_attribs", "_frozen", - "_has_pre_init", - "_pre_init_has_args", + "_has_custom_setattr", "_has_post_init", + "_has_pre_init", "_is_exc", "_on_setattr", + "_pre_init_has_args", + "_repr_added", + "_script_snippets", "_slots", "_weakref_slot", "_wrote_own_setattr", - "_has_custom_setattr", ) def __init__( self, - cls, + cls: type, these, slots, frozen, @@ -764,7 +707,7 @@ def __init__( self._wrote_own_setattr = True elif on_setattr in ( - _ng_default_on_setattr, + _DEFAULT_ON_SETATTR, setters.validate, setters.convert, ): @@ -779,7 +722,7 @@ def __init__( break if ( ( - on_setattr == _ng_default_on_setattr + on_setattr == _DEFAULT_ON_SETATTR and not (has_validator or has_converter) ) or (on_setattr == setters.validate and not has_validator) @@ -796,37 +739,65 @@ def __init__( self._cls_dict["__setstate__"], ) = self._make_getstate_setstate() + # tuples of script, globs, hook + self._script_snippets: list[ + tuple[str, dict, Callable[[dict, dict], Any]] + ] = [] + self._repr_added = False + + # We want to only do this check once; in 99.9% of cases these + # exist. + if not hasattr(self._cls, "__module__") or not hasattr( + self._cls, "__qualname__" + ): + self._add_method_dunders = self._add_method_dunders_safe + else: + self._add_method_dunders = self._add_method_dunders_unsafe + def __repr__(self): return f"<_ClassBuilder(cls={self._cls.__name__})>" - if PY310: - import abc - - def build_class(self): - """ - Finalize class based on the accumulated configuration. + def _eval_snippets(self) -> None: + """ + Evaluate any registered snippets in one go. 
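The `evolve()` added above copies an instance by collecting the current field values under their `__init__` aliases and calling the class again with the merged keyword arguments. A hedged plain-Python approximation of those semantics (toy class with a hand-written field list, not the real attrs code path):

```python
# Minimal illustration of evolve(): keyword arguments not supplied by
# the caller are filled in from the existing instance, then the class
# is re-invoked to build the copy.
class Point:
    # Stand-in for an attrs class; field names match the init names.
    _fields = ("x", "y")

    def __init__(self, x, y):
        self.x = x
        self.y = y


def evolve_sketch(inst, **changes):
    for name in type(inst)._fields:
        changes.setdefault(name, getattr(inst, name))
    return type(inst)(**changes)


p = Point(1, 2)
q = evolve_sketch(p, y=10)
assert (q.x, q.y) == (1, 10)
assert (p.x, p.y) == (1, 2)  # the original is untouched
```

The real implementation additionally skips `init=False` fields and uses each field's alias, which is why `field_transformer`-supplied aliases round-trip correctly.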
+ """ + script = "\n".join([snippet[0] for snippet in self._script_snippets]) + globs = {} + for _, snippet_globs, _ in self._script_snippets: + globs.update(snippet_globs) + + locs = _linecache_and_compile( + script, + _generate_unique_filename(self._cls, "methods"), + globs, + ) - Builder cannot be used after calling this method. - """ - if self._slots is True: - return self._create_slots_class() + for _, _, hook in self._script_snippets: + hook(self._cls_dict, locs) - return self.abc.update_abstractmethods( - self._patch_original_class() - ) - - else: + def build_class(self): + """ + Finalize class based on the accumulated configuration. - def build_class(self): - """ - Finalize class based on the accumulated configuration. + Builder cannot be used after calling this method. + """ + self._eval_snippets() + if self._slots is True: + cls = self._create_slots_class() + else: + cls = self._patch_original_class() + if PY_3_10_PLUS: + cls = abc.update_abstractmethods(cls) - Builder cannot be used after calling this method. - """ - if self._slots is True: - return self._create_slots_class() + # The method gets only called if it's not inherited from a base class. + # _has_own_attribute does NOT work properly for classmethods. 
+ if ( + getattr(cls, "__attrs_init_subclass__", None) + and "__attrs_init_subclass__" not in cls.__dict__ + ): + cls.__attrs_init_subclass__() - return self._patch_original_class() + return cls def _patch_original_class(self): """ @@ -840,7 +811,7 @@ def _patch_original_class(self): for name in self._attr_names: if ( name not in base_names - and getattr(cls, name, _sentinel) is not _sentinel + and getattr(cls, name, _SENTINEL) is not _SENTINEL ): # An AttributeError can happen if a base class defines a # class variable and we want to set an attribute with the @@ -860,7 +831,7 @@ def _patch_original_class(self): cls.__attrs_own_setattr__ = False if not self._has_custom_setattr: - cls.__setattr__ = _obj_setattr + cls.__setattr__ = _OBJ_SETATTR return cls @@ -888,7 +859,7 @@ def _create_slots_class(self): if not self._has_custom_setattr: for base_cls in self._cls.__bases__: if base_cls.__dict__.get("__attrs_own_setattr__", False): - cd["__setattr__"] = _obj_setattr + cd["__setattr__"] = _OBJ_SETATTR break # Traverse the MRO to collect existing slots @@ -916,30 +887,23 @@ def _create_slots_class(self): ): names += ("__weakref__",) - if PY_3_8_PLUS: - cached_properties = { - name: cached_property.func - for name, cached_property in cd.items() - if isinstance(cached_property, functools.cached_property) - } - else: - # `functools.cached_property` was introduced in 3.8. - # So can't be used before this. - cached_properties = {} + cached_properties = { + name: cached_prop.func + for name, cached_prop in cd.items() + if isinstance(cached_prop, cached_property) + } # Collect methods with a `__class__` reference that are shadowed in the new class. # To know to update them. additional_closure_functions_to_update = [] if cached_properties: - # Add cached properties to names for slotting. - names += tuple(cached_properties.keys()) - - for name in cached_properties: - # Clear out function from class to avoid clashing. 
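The `__attrs_init_subclass__` call above fires only when the classmethod is *inherited*: reachable via `getattr()` but absent from the class's own `__dict__`. A sketch of that inheritance check with illustrative names (`init_hook` stands in for `__attrs_init_subclass__`):

```python
# Run a hook only on classes that inherit it, not on the class that
# defines it -- detected by checking the class's own __dict__.
class Base:
    called_for = []

    @classmethod
    def init_hook(cls):
        Base.called_for.append(cls.__name__)


def finalize(cls):
    if (
        getattr(cls, "init_hook", None)
        and "init_hook" not in cls.__dict__
    ):
        cls.init_hook()
    return cls


class Child(Base):  # inherits the hook -> it fires with cls=Child
    pass


finalize(Base)   # defines the hook itself -> skipped
finalize(Child)
assert Base.called_for == ["Child"]
```

The comment in the diff notes why this dance is needed: `_has_own_attribute` misbehaves for classmethods, so the `__dict__` membership test is used directly.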
- del cd[name] - class_annotations = _get_annotations(self._cls) for name, func in cached_properties.items(): + # Add cached properties to names for slotting. + names += (name,) + # Clear out function from class to avoid clashing. + del cd[name] + additional_closure_functions_to_update.append(func) annotation = inspect.signature(func).return_annotation if annotation is not inspect.Parameter.empty: class_annotations[name] = annotation @@ -968,7 +932,7 @@ def _create_slots_class(self): slot_names = [name for name in slot_names if name not in reused_slots] cd.update(reused_slots) if self._cache_hash: - slot_names.append(_hash_cache_field) + slot_names.append(_HASH_CACHE_FIELD) cd["__slots__"] = tuple(slot_names) @@ -1011,14 +975,17 @@ def _create_slots_class(self): return cls def add_repr(self, ns): - self._cls_dict["__repr__"] = self._add_method_dunders( - _make_repr(self._attrs, ns, self._cls) - ) + script, globs = _make_repr_script(self._attrs, ns) + + def _attach_repr(cls_dict, globs): + cls_dict["__repr__"] = self._add_method_dunders(globs["__repr__"]) + + self._script_snippets.append((script, globs, _attach_repr)) + self._repr_added = True return self def add_str(self): - repr = self._cls_dict.get("__repr__") - if repr is None: + if not self._repr_added: msg = "__str__ can only be generated if a __repr__ exists." raise ValueError(msg) @@ -1049,7 +1016,7 @@ def slots_setstate(self, state): """ Automatically created by attrs. """ - __bound_setattr = _obj_setattr.__get__(self) + __bound_setattr = _OBJ_SETATTR.__get__(self) if isinstance(state, tuple): # Backward compatibility with attrs instances pickled with # attrs versions before v22.2.0 which stored tuples. @@ -1065,7 +1032,7 @@ def slots_setstate(self, state): # indicate that the first call to __hash__ should be a cache # miss. 
if hash_caching_enabled: - __bound_setattr(_hash_cache_field, None) + __bound_setattr(_HASH_CACHE_FIELD, None) return slots_getstate, slots_setstate @@ -1074,35 +1041,49 @@ def make_unhashable(self): return self def add_hash(self): - self._cls_dict["__hash__"] = self._add_method_dunders( - _make_hash( - self._cls, - self._attrs, - frozen=self._frozen, - cache_hash=self._cache_hash, - ) + script, globs = _make_hash_script( + self._cls, + self._attrs, + frozen=self._frozen, + cache_hash=self._cache_hash, ) + def attach_hash(cls_dict: dict, locs: dict) -> None: + cls_dict["__hash__"] = self._add_method_dunders(locs["__hash__"]) + + self._script_snippets.append((script, globs, attach_hash)) + return self def add_init(self): - self._cls_dict["__init__"] = self._add_method_dunders( - _make_init( - self._cls, - self._attrs, - self._has_pre_init, - self._pre_init_has_args, - self._has_post_init, - self._frozen, - self._slots, - self._cache_hash, - self._base_attr_map, - self._is_exc, - self._on_setattr, - attrs_init=False, - ) + script, globs, annotations = _make_init_script( + self._cls, + self._attrs, + self._has_pre_init, + self._pre_init_has_args, + self._has_post_init, + self._frozen, + self._slots, + self._cache_hash, + self._base_attr_map, + self._is_exc, + self._on_setattr, + attrs_init=False, ) + def _attach_init(cls_dict, globs): + init = globs["__init__"] + init.__annotations__ = annotations + cls_dict["__init__"] = self._add_method_dunders(init) + + self._script_snippets.append((script, globs, _attach_init)) + + return self + + def add_replace(self): + self._cls_dict["__replace__"] = self._add_method_dunders( + lambda self, **changes: evolve(self, **changes) + ) return self def add_match_args(self): @@ -1113,32 +1094,41 @@ def add_match_args(self): ) def add_attrs_init(self): - self._cls_dict["__attrs_init__"] = self._add_method_dunders( - _make_init( - self._cls, - self._attrs, - self._has_pre_init, - self._pre_init_has_args, - self._has_post_init, - 
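The `cache_hash` machinery above stores the computed hash in a dedicated slot, using `None` as the "not yet computed" sentinel (which is why `__setstate__` must reset it after unpickling). A minimal sketch of the pattern, with an illustrative slot name in place of `_HASH_CACHE_FIELD`:

```python
# Sketch of hash caching: compute once, store on the instance, and
# serve the cached value on every later call.
class C:
    __slots__ = ("x", "_hash_cache")

    def __init__(self, x):
        object.__setattr__(self, "x", x)
        object.__setattr__(self, "_hash_cache", None)  # cache miss marker

    def __hash__(self):
        if self._hash_cache is None:
            object.__setattr__(self, "_hash_cache", hash((C, self.x)))
        return self._hash_cache


c = C(42)
first = hash(c)
assert hash(c) == first
assert c._hash_cache is not None
```

`object.__setattr__` is used so the same code works on frozen classes, mirroring the generated `__hash__` in the hunk above.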
self._frozen, - self._slots, - self._cache_hash, - self._base_attr_map, - self._is_exc, - self._on_setattr, - attrs_init=True, - ) + script, globs, annotations = _make_init_script( + self._cls, + self._attrs, + self._has_pre_init, + self._pre_init_has_args, + self._has_post_init, + self._frozen, + self._slots, + self._cache_hash, + self._base_attr_map, + self._is_exc, + self._on_setattr, + attrs_init=True, ) + def _attach_attrs_init(cls_dict, globs): + init = globs["__attrs_init__"] + init.__annotations__ = annotations + cls_dict["__attrs_init__"] = self._add_method_dunders(init) + + self._script_snippets.append((script, globs, _attach_attrs_init)) + return self def add_eq(self): cd = self._cls_dict - cd["__eq__"] = self._add_method_dunders( - _make_eq(self._cls, self._attrs) - ) - cd["__ne__"] = self._add_method_dunders(_make_ne()) + script, globs = _make_eq_script(self._attrs) + + def _attach_eq(cls_dict, globs): + cls_dict["__eq__"] = self._add_method_dunders(globs["__eq__"]) + + self._script_snippets.append((script, globs, _attach_eq)) + + cd["__ne__"] = __ne__ return self @@ -1153,9 +1143,6 @@ def add_order(self): return self def add_setattr(self): - if self._frozen: - return self - sa_attrs = {} for a in self._attrs: on_setattr = a.on_setattr or self._on_setattr @@ -1179,7 +1166,7 @@ def __setattr__(self, name, val): else: nval = hook(self, a, val) - _obj_setattr(self, name, nval) + _OBJ_SETATTR(self, name, nval) self._cls_dict["__attrs_own_setattr__"] = True self._cls_dict["__setattr__"] = self._add_method_dunders(__setattr__) @@ -1187,7 +1174,21 @@ def __setattr__(self, name, val): return self - def _add_method_dunders(self, method): + def _add_method_dunders_unsafe(self, method: Callable) -> Callable: + """ + Add __module__ and __qualname__ to a *method*. 
+ """ + method.__module__ = self._cls.__module__ + + method.__qualname__ = f"{self._cls.__qualname__}.{method.__name__}" + + method.__doc__ = ( + f"Method generated by attrs for class {self._cls.__qualname__}." + ) + + return method + + def _add_method_dunders_safe(self, method: Callable) -> Callable: """ Add __module__ and __qualname__ to a *method* if possible. """ @@ -1198,10 +1199,7 @@ def _add_method_dunders(self, method): method.__qualname__ = f"{self._cls.__qualname__}.{method.__name__}" with contextlib.suppress(AttributeError): - method.__doc__ = ( - "Method generated by attrs for class " - f"{self._cls.__qualname__}." - ) + method.__doc__ = f"Method generated by attrs for class {self._cls.__qualname__}." return method @@ -1333,238 +1331,28 @@ def attrs( A class decorator that adds :term:`dunder methods` according to the specified attributes using `attr.ib` or the *these* argument. - Please consider using `attrs.define` / `attrs.frozen` in new code - (``attr.s`` will *never* go away, though). - - :param these: A dictionary of name to `attr.ib` mappings. This is useful - to avoid the definition of your attributes within the class body - because you can't (e.g. if you want to add ``__repr__`` methods to - Django models) or don't want to. - - If *these* is not ``None``, *attrs* will *not* search the class body - for attributes and will *not* remove any attributes from it. - - The order is deduced from the order of the attributes inside *these*. - - :type these: `dict` of `str` to `attr.ib` - - :param str repr_ns: When using nested classes, there's no way in Python 2 - to automatically detect that. Therefore it's possible to set the - namespace explicitly for a more meaningful ``repr`` output. - :param bool auto_detect: Instead of setting the *init*, *repr*, *eq*, - *order*, and *hash* arguments explicitly, assume they are set to - ``True`` **unless any** of the involved methods for one of the - arguments is implemented in the *current* class (i.e. 
it is *not* - inherited from some base class). - - So for example by implementing ``__eq__`` on a class yourself, *attrs* - will deduce ``eq=False`` and will create *neither* ``__eq__`` *nor* - ``__ne__`` (but Python classes come with a sensible ``__ne__`` by - default, so it *should* be enough to only implement ``__eq__`` in most - cases). - - .. warning:: - - If you prevent *attrs* from creating the ordering methods for you - (``order=False``, e.g. by implementing ``__le__``), it becomes - *your* responsibility to make sure its ordering is sound. The best - way is to use the `functools.total_ordering` decorator. - - - Passing ``True`` or ``False`` to *init*, *repr*, *eq*, *order*, *cmp*, - or *hash* overrides whatever *auto_detect* would determine. - - :param bool repr: Create a ``__repr__`` method with a human readable - representation of *attrs* attributes.. - :param bool str: Create a ``__str__`` method that is identical to - ``__repr__``. This is usually not necessary except for `Exception`\ s. - :param bool | None eq: If ``True`` or ``None`` (default), add ``__eq__`` - and ``__ne__`` methods that check two instances for equality. - - They compare the instances as if they were tuples of their *attrs* - attributes if and only if the types of both classes are *identical*! - - .. seealso:: `comparison` - :param bool | None order: If ``True``, add ``__lt__``, ``__le__``, - ``__gt__``, and ``__ge__`` methods that behave like *eq* above and - allow instances to be ordered. If ``None`` (default) mirror value of - *eq*. - - .. seealso:: `comparison` - :param bool | None cmp: Setting *cmp* is equivalent to setting *eq* and - *order* to the same value. Must not be mixed with *eq* or *order*. - - .. seealso:: `comparison` - :param bool | None unsafe_hash: If ``None`` (default), the ``__hash__`` - method is generated according how *eq* and *frozen* are set. - - 1. If *both* are True, *attrs* will generate a ``__hash__`` for you. - 2. 
If *eq* is True and *frozen* is False, ``__hash__`` will be set to - None, marking it unhashable (which it is). - 3. If *eq* is False, ``__hash__`` will be left untouched meaning the - ``__hash__`` method of the base class will be used (if base class is - ``object``, this means it will fall back to id-based hashing.). - - Although not recommended, you can decide for yourself and force *attrs* - to create one (e.g. if the class is immutable even though you didn't - freeze it programmatically) by passing ``True`` or not. Both of these - cases are rather special and should be used carefully. - - .. seealso:: - - - Our documentation on `hashing`, - - Python's documentation on `object.__hash__`, - - and the `GitHub issue that led to the default \ - behavior `_ for - more details. - - :param bool | None hash: Alias for *unsafe_hash*. *unsafe_hash* takes - precedence. - :param bool init: Create a ``__init__`` method that initializes the *attrs* - attributes. Leading underscores are stripped for the argument name. If - a ``__attrs_pre_init__`` method exists on the class, it will be called - before the class is initialized. If a ``__attrs_post_init__`` method - exists on the class, it will be called after the class is fully - initialized. - - If ``init`` is ``False``, an ``__attrs_init__`` method will be injected - instead. This allows you to define a custom ``__init__`` method that - can do pre-init work such as ``super().__init__()``, and then call - ``__attrs_init__()`` and ``__attrs_post_init__()``. - - .. seealso:: `init` - :param bool slots: Create a :term:`slotted class ` that's - more memory-efficient. Slotted classes are generally superior to the - default dict classes, but have some gotchas you should know about, so - we encourage you to read the :term:`glossary entry `. - :param bool frozen: Make instances immutable after initialization. If - someone attempts to modify a frozen instance, - `attrs.exceptions.FrozenInstanceError` is raised. - - .. note:: - - 1. 
This is achieved by installing a custom ``__setattr__`` method - on your class, so you can't implement your own. - - 2. True immutability is impossible in Python. - - 3. This *does* have a minor a runtime performance `impact - ` when initializing new instances. In other words: - ``__init__`` is slightly slower with ``frozen=True``. - - 4. If a class is frozen, you cannot modify ``self`` in - ``__attrs_post_init__`` or a self-written ``__init__``. You can - circumvent that limitation by using ``object.__setattr__(self, - "attribute_name", value)``. - - 5. Subclasses of a frozen class are frozen too. - - :param bool weakref_slot: Make instances weak-referenceable. This has no - effect unless ``slots`` is also enabled. - :param bool auto_attribs: If ``True``, collect :pep:`526`-annotated - attributes from the class body. - - In this case, you **must** annotate every field. If *attrs* encounters - a field that is set to an `attr.ib` but lacks a type annotation, an - `attr.exceptions.UnannotatedAttributeError` is raised. Use - ``field_name: typing.Any = attr.ib(...)`` if you don't want to set a - type. - - If you assign a value to those attributes (e.g. ``x: int = 42``), that - value becomes the default value like if it were passed using - ``attr.ib(default=42)``. Passing an instance of `attrs.Factory` also - works as expected in most cases (see warning below). - - Attributes annotated as `typing.ClassVar`, and attributes that are - neither annotated nor set to an `attr.ib` are **ignored**. - - .. warning:: - For features that use the attribute name to create decorators (e.g. - :ref:`validators `), you still *must* assign `attr.ib` - to them. Otherwise Python will either not find the name or try to - use the default value to call e.g. ``validator`` on it. - - These errors can be quite confusing and probably the most common bug - report on our bug tracker. 
- - :param bool kw_only: Make all attributes keyword-only in the generated - ``__init__`` (if ``init`` is ``False``, this parameter is ignored). - :param bool cache_hash: Ensure that the object's hash code is computed only - once and stored on the object. If this is set to ``True``, hashing - must be either explicitly or implicitly enabled for this class. If the - hash code is cached, avoid any reassignments of fields involved in hash - code computation or mutations of the objects those fields point to - after object creation. If such changes occur, the behavior of the - object's hash code is undefined. - :param bool auto_exc: If the class subclasses `BaseException` (which - implicitly includes any subclass of any exception), the following - happens to behave like a well-behaved Python exceptions class: - - - the values for *eq*, *order*, and *hash* are ignored and the - instances compare and hash by the instance's ids (N.B. *attrs* will - *not* remove existing implementations of ``__hash__`` or the equality - methods. It just won't add own ones.), - - all attributes that are either passed into ``__init__`` or have a - default value are additionally available as a tuple in the ``args`` - attribute, - - the value of *str* is ignored leaving ``__str__`` to base classes. - :param bool collect_by_mro: Setting this to `True` fixes the way *attrs* - collects attributes from base classes. The default behavior is - incorrect in certain cases of multiple inheritance. It should be on by - default but is kept off for backward-compatibility. - - .. seealso:: - Issue `#428 `_ - - :param bool | None getstate_setstate: - .. note:: - This is usually only interesting for slotted classes and you should - probably just set *auto_detect* to `True`. - - If `True`, ``__getstate__`` and ``__setstate__`` are generated and - attached to the class. This is necessary for slotted classes to be - pickleable. 
If left `None`, it's `True` by default for slotted classes - and ``False`` for dict classes. - - If *auto_detect* is `True`, and *getstate_setstate* is left `None`, and - **either** ``__getstate__`` or ``__setstate__`` is detected directly on - the class (i.e. not inherited), it is set to `False` (this is usually - what you want). - - :param on_setattr: A callable that is run whenever the user attempts to set - an attribute (either by assignment like ``i.x = 42`` or by using - `setattr` like ``setattr(i, "x", 42)``). It receives the same arguments - as validators: the instance, the attribute that is being modified, and - the new value. - - If no exception is raised, the attribute is set to the return value of - the callable. - - If a list of callables is passed, they're automatically wrapped in an - `attrs.setters.pipe`. - :type on_setattr: `callable`, or a list of callables, or `None`, or - `attrs.setters.NO_OP` - - :param callable | None field_transformer: - A function that is called with the original class object and all fields - right before *attrs* finalizes the class. You can use this, e.g., to - automatically add converters or validators to fields based on their - types. - - .. seealso:: `transform-fields` - - :param bool match_args: - If `True` (default), set ``__match_args__`` on the class to support - :pep:`634` (Structural Pattern Matching). It is a tuple of all - non-keyword-only ``__init__`` parameter names on Python 3.10 and later. - Ignored on older Python versions. + Consider using `attrs.define` / `attrs.frozen` in new code (``attr.s`` will + *never* go away, though). + + Args: + repr_ns (str): + When using nested classes, there was no way in Python 2 to + automatically detect that. This argument allows to set a custom + name for a more meaningful ``repr`` output. This argument is + pointless in Python 3 and is therefore deprecated. + + .. 
caution:: + Refer to `attrs.define` for the rest of the parameters, but note that they + can have different defaults. + + Notably, leaving *on_setattr* as `None` will **not** add any hooks. .. versionadded:: 16.0.0 *slots* .. versionadded:: 16.1.0 *frozen* .. versionadded:: 16.3.0 *str* .. versionadded:: 16.3.0 Support for ``__attrs_post_init__``. .. versionchanged:: 17.1.0 - *hash* supports ``None`` as value which is also the default now. + *hash* supports `None` as value which is also the default now. .. versionadded:: 17.3.0 *auto_attribs* .. versionchanged:: 18.1.0 If *these* is passed, no attributes are deleted from the class body. @@ -1595,10 +1383,29 @@ def attrs( .. versionadded:: 21.3.0 *match_args* .. versionadded:: 22.2.0 *unsafe_hash* as an alias for *hash* (for :pep:`681` compliance). - """ + .. deprecated:: 24.1.0 *repr_ns* + .. versionchanged:: 24.1.0 + Instances are not compared as tuples of attributes anymore, but using a + big ``and`` condition. This is faster and has more correct behavior for + uncomparable values like `math.nan`. + .. versionadded:: 24.1.0 + If a class has an *inherited* classmethod called + ``__attrs_init_subclass__``, it is executed after the class is created. + .. deprecated:: 24.1.0 *hash* is deprecated in favor of *unsafe_hash*. + """ + if repr_ns is not None: + import warnings + + warnings.warn( + DeprecationWarning( + "The `repr_ns` argument is deprecated and will be removed in or after August 2025." + ), + stacklevel=2, + ) + eq_, order_ = _determine_attrs_eq_order(cmp, eq, order, None) - # unsafe_hash takes precedence due to PEP 681. + # unsafe_hash takes precedence due to PEP 681. 
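The `repr_ns` deprecation above uses the standard `warnings.warn(..., stacklevel=2)` idiom so the warning points at the *caller's* line rather than the library internals. A self-contained sketch of the same pattern (the `api` function is hypothetical):

```python
import warnings


def api(value=None, repr_ns=None):
    # stacklevel=2 attributes the warning to the caller, as in the
    # hunk above; passing a DeprecationWarning instance sets the
    # category implicitly.
    if repr_ns is not None:
        warnings.warn(
            DeprecationWarning("The `repr_ns` argument is deprecated."),
            stacklevel=2,
        )
    return value


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    api(1, repr_ns="pkg")

assert len(caught) == 1
assert issubclass(caught[0].category, DeprecationWarning)
```

Note that `DeprecationWarning` is hidden by default outside `__main__`, so the `simplefilter("always")` is needed to observe it in a test.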
if unsafe_hash is not None: hash = unsafe_hash @@ -1638,10 +1445,12 @@ def wrap(cls): has_own_setattr, field_transformer, ) + if _determine_whether_to_implement( cls, repr, auto_detect, ("__repr__",) ): builder.add_repr(repr_ns) + if str is True: builder.add_str() @@ -1655,7 +1464,8 @@ def wrap(cls): ): builder.add_order() - builder.add_setattr() + if not frozen: + builder.add_setattr() nonlocal hash if ( @@ -1698,8 +1508,11 @@ def wrap(cls): msg = "Invalid value for cache_hash. To use hash caching, init must be True." raise TypeError(msg) + if PY_3_13_PLUS and not _has_own_attribute(cls, "__replace__"): + builder.add_replace() + if ( - PY310 + PY_3_10_PLUS and match_args and not _has_own_attribute(cls, "__match_args__") ): @@ -1708,7 +1521,7 @@ def wrap(cls): return builder.build_class() # maybe_cls's type depends on the usage of the decorator. It's a class - # if it's used as `@attrs` but ``None`` if used as `@attrs()`. + # if it's used as `@attrs` but `None` if used as `@attrs()`. if maybe_cls is None: return wrap @@ -1730,7 +1543,7 @@ def _has_frozen_base_class(cls): return cls.__setattr__ is _frozen_setattrs -def _generate_unique_filename(cls, func_name): +def _generate_unique_filename(cls: type, func_name: str) -> str: """ Create a "filename" suitable for a function being generated. 
""" @@ -1740,15 +1553,16 @@ def _generate_unique_filename(cls, func_name): ) -def _make_hash(cls, attrs, frozen, cache_hash): +def _make_hash_script( + cls: type, attrs: list[Attribute], frozen: bool, cache_hash: bool +) -> tuple[str, dict]: attrs = tuple( a for a in attrs if a.hash is True or (a.hash is None and a.eq is True) ) tab = " " - unique_filename = _generate_unique_filename(cls, "hash") - type_hash = hash(unique_filename) + type_hash = hash(_generate_unique_filename(cls, "hash")) # If eq is custom generated, we need to include the functions in globs globs = {} @@ -1793,89 +1607,85 @@ def append_hash_computation_lines(prefix, indent): method_lines.append(indent + " " + closing_braces) if cache_hash: - method_lines.append(tab + f"if self.{_hash_cache_field} is None:") + method_lines.append(tab + f"if self.{_HASH_CACHE_FIELD} is None:") if frozen: append_hash_computation_lines( - f"object.__setattr__(self, '{_hash_cache_field}', ", tab * 2 + f"object.__setattr__(self, '{_HASH_CACHE_FIELD}', ", tab * 2 ) method_lines.append(tab * 2 + ")") # close __setattr__ else: append_hash_computation_lines( - f"self.{_hash_cache_field} = ", tab * 2 + f"self.{_HASH_CACHE_FIELD} = ", tab * 2 ) - method_lines.append(tab + f"return self.{_hash_cache_field}") + method_lines.append(tab + f"return self.{_HASH_CACHE_FIELD}") else: append_hash_computation_lines("return ", tab) script = "\n".join(method_lines) - return _make_method("__hash__", script, unique_filename, globs) + return script, globs -def _add_hash(cls, attrs): +def _add_hash(cls: type, attrs: list[Attribute]): """ Add a hash method to *cls*. """ - cls.__hash__ = _make_hash(cls, attrs, frozen=False, cache_hash=False) + script, globs = _make_hash_script( + cls, attrs, frozen=False, cache_hash=False + ) + _compile_and_eval( + script, globs, filename=_generate_unique_filename(cls, "__hash__") + ) + cls.__hash__ = globs["__hash__"] return cls -def _make_ne(): +def __ne__(self, other): """ - Create __ne__ method. 
+ Check equality and either forward a NotImplemented or + return the result negated. """ + result = self.__eq__(other) + if result is NotImplemented: + return NotImplemented - def __ne__(self, other): - """ - Check equality and either forward a NotImplemented or - return the result negated. - """ - result = self.__eq__(other) - if result is NotImplemented: - return NotImplemented - - return not result - - return __ne__ + return not result -def _make_eq(cls, attrs): +def _make_eq_script(attrs: list) -> tuple[str, dict]: """ Create __eq__ method for *cls* with *attrs*. """ attrs = [a for a in attrs if a.eq] - unique_filename = _generate_unique_filename(cls, "eq") lines = [ "def __eq__(self, other):", " if other.__class__ is not self.__class__:", " return NotImplemented", ] - # We can't just do a big self.x = other.x and... clause due to - # irregularities like nan == nan is false but (nan,) == (nan,) is true. globs = {} if attrs: lines.append(" return (") - others = [" ) == ("] for a in attrs: if a.eq_key: cmp_name = f"_{a.name}_key" # Add the key function to the global namespace # of the evaluated function. 
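The hunk above promotes `__ne__` from a per-class factory to a single module-level function: it delegates to `__eq__` and forwards `NotImplemented` untouched so Python can still try the reflected operation on the other operand. A small sketch of why the forwarding matters:

```python
# __ne__ must not negate NotImplemented -- doing so would turn "I don't
# know" into a hard True/False and break mixed-type comparisons.
class A:
    def __eq__(self, other):
        if not isinstance(other, A):
            return NotImplemented
        return True


def __ne__(self, other):
    result = self.__eq__(other)
    if result is NotImplemented:
        return NotImplemented
    return not result


A.__ne__ = __ne__

assert (A() != A()) is False              # __eq__ said True -> negate
assert A().__ne__(object()) is NotImplemented  # forwarded, not negated
```

Because the function closes over nothing, one shared object can serve every attrs class, which is exactly what the diff switches to.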
globs[cmp_name] = a.eq_key - lines.append(f" {cmp_name}(self.{a.name}),") - others.append(f" {cmp_name}(other.{a.name}),") + lines.append( + f" {cmp_name}(self.{a.name}) == {cmp_name}(other.{a.name})" + ) else: - lines.append(f" self.{a.name},") - others.append(f" other.{a.name},") - - lines += [*others, " )"] + lines.append(f" self.{a.name} == other.{a.name}") + if a is not attrs[-1]: + lines[-1] = f"{lines[-1]} and" + lines.append(" )") else: lines.append(" return True") script = "\n".join(lines) - return _make_method("__eq__", script, unique_filename, globs) + return script, globs def _make_order(cls, attrs): @@ -1941,14 +1751,20 @@ def _add_eq(cls, attrs=None): if attrs is None: attrs = cls.__attrs_attrs__ - cls.__eq__ = _make_eq(cls, attrs) - cls.__ne__ = _make_ne() + script, globs = _make_eq_script(attrs) + _compile_and_eval( + script, globs, filename=_generate_unique_filename(cls, "__eq__") + ) + cls.__eq__ = globs["__eq__"] + cls.__ne__ = __ne__ return cls -def _make_repr(attrs, ns, cls): - unique_filename = _generate_unique_filename(cls, "repr") +def _make_repr_script(attrs, ns) -> tuple[str, dict]: + """ + Create the source and globs for a __repr__ and return it. + """ # Figure out which attributes to include, and which function to use to # format them. The a.repr value can be either bool or a custom # callable. 
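The generated `__eq__` above now emits one big `and`-chain of per-field comparisons instead of building two tuples. The difference is observable with NaN, because tuple comparison short-circuits on object identity while direct comparison follows IEEE 754:

```python
import math

nan = math.nan

# Direct comparison: NaN is never equal to itself.
assert (nan == nan) is False

# Tuple comparison checks identity first, so the same NaN object
# compares equal inside a tuple.
assert ((nan,) == (nan,)) is True
```

With the `and`-chain, an attrs instance holding a NaN field now compares per-element, matching the changelog note that this is "more correct behavior for uncomparable values like `math.nan`" (and faster, since no throwaway tuples are allocated).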
@@ -1999,9 +1815,7 @@ def _make_repr(attrs, ns, cls):
             "        already_repring.remove(id(self))",
         ]
 
-    return _make_method(
-        "__repr__", "\n".join(lines), unique_filename, globs=globs
-    )
+    return "\n".join(lines), globs
 
 
 def _add_repr(cls, ns=None, attrs=None):
@@ -2011,7 +1825,11 @@ def _add_repr(cls, ns=None, attrs=None):
     if attrs is None:
         attrs = cls.__attrs_attrs__
 
-    cls.__repr__ = _make_repr(attrs, ns, cls)
+    script, globs = _make_repr_script(attrs, ns)
+    _compile_and_eval(
+        script, globs, filename=_generate_unique_filename(cls, "__repr__")
+    )
+    cls.__repr__ = globs["__repr__"]
 
     return cls
 
 
@@ -2022,13 +1840,17 @@ def fields(cls):
     The tuple also allows accessing the fields by their names (see below for
     examples).
 
-    :param type cls: Class to introspect.
+    Args:
+        cls (type): Class to introspect.
 
-    :raise TypeError: If *cls* is not a class.
-    :raise attrs.exceptions.NotAnAttrsClassError: If *cls* is not an *attrs*
-        class.
+    Raises:
+        TypeError: If *cls* is not a class.
 
-    :rtype: tuple (with name accessors) of `attrs.Attribute`
+        attrs.exceptions.NotAnAttrsClassError:
+            If *cls* is not an *attrs* class.
+
+    Returns:
+        tuple (with name accessors) of `attrs.Attribute`
 
     .. versionchanged:: 16.2.0 Returned tuple allows accessing the fields by
        name.
@@ -2059,16 +1881,20 @@ def fields(cls):
 
 def fields_dict(cls):
     """
-    Return an ordered dictionary of *attrs* attributes for a class, whose
-    keys are the attribute names.
+    Return an ordered dictionary of *attrs* attributes for a class, whose keys
+    are the attribute names.
+
+    Args:
+        cls (type): Class to introspect.
 
-    :param type cls: Class to introspect.
+    Raises:
+        TypeError: If *cls* is not a class.
 
-    :raise TypeError: If *cls* is not a class.
-    :raise attrs.exceptions.NotAnAttrsClassError: If *cls* is not an *attrs*
-        class.
+        attrs.exceptions.NotAnAttrsClassError:
+            If *cls* is not an *attrs* class.
 
-    :rtype: dict
+    Returns:
+        dict[str, attrs.Attribute]: Dict of attribute name to definition
 
    .. versionadded:: 18.1.0
    """
@@ -2088,7 +1914,8 @@ def validate(inst):
 
     Leaves all exceptions through.
 
-    :param inst: Instance of a class with *attrs* attributes.
+    Args:
+        inst: Instance of a class with *attrs* attributes.
     """
     if _config._run_validators is False:
         return
@@ -2099,18 +1926,15 @@ def validate(inst):
             v(inst, a, getattr(inst, a.name))
 
 
-def _is_slot_cls(cls):
-    return "__slots__" in cls.__dict__
-
-
 def _is_slot_attr(a_name, base_attr_map):
     """
     Check if the attribute name comes from a slot class.
     """
-    return a_name in base_attr_map and _is_slot_cls(base_attr_map[a_name])
+    cls = base_attr_map.get(a_name)
+    return cls and "__slots__" in cls.__dict__
 
 
-def _make_init(
+def _make_init_script(
     cls,
     attrs,
     pre_init,
@@ -2123,7 +1947,7 @@ def _make_init(
     is_exc,
     cls_on_setattr,
     attrs_init,
-):
+) -> tuple[str, dict, dict]:
     has_cls_on_setattr = (
         cls_on_setattr is not None and cls_on_setattr is not setters.NO_OP
     )
@@ -2151,8 +1975,6 @@ def _make_init(
         elif has_cls_on_setattr and a.on_setattr is not setters.NO_OP:
             needs_cached_setattr = True
 
-    unique_filename = _generate_unique_filename(cls, "init")
-
     script, globs, annotations = _attrs_to_init_script(
         filtered_attrs,
         frozen,
@@ -2165,7 +1987,7 @@ def _make_init(
         is_exc,
         needs_cached_setattr,
         has_cls_on_setattr,
-        attrs_init,
+        "__attrs_init__" if attrs_init else "__init__",
     )
     if cls.__module__ in sys.modules:
         # This makes typing.get_type_hints(CLS.__init__) resolve string types.
@@ -2176,39 +1998,29 @@ def _make_init(
     if needs_cached_setattr:
         # Save the lookup overhead in __init__ if we need to circumvent
         # setattr hooks.
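The inlined `_is_slot_attr` check above relies on a subtle detail: `__slots__` must be looked up in the class's *own* `__dict__`, because attribute lookup would otherwise find an ancestor's `__slots__`. A small sketch of that distinction (hypothetical class names):

```python
# A class counts as "slotted" only if __slots__ appears in its own
# __dict__; subclasses that omit __slots__ regain a per-instance __dict__.

class SlottedBase:
    __slots__ = ("x",)

class PlainChild(SlottedBase):
    # No own __slots__, so instances get a __dict__ again.
    pass

def is_slot_cls(cls):
    return "__slots__" in cls.__dict__
```

Note that `getattr(PlainChild, "__slots__")` would still succeed via inheritance, which is exactly why the `__dict__` lookup is used instead.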
-        globs["_cached_setattr_get"] = _obj_setattr.__get__
+        globs["_cached_setattr_get"] = _OBJ_SETATTR.__get__
 
-    init = _make_method(
-        "__attrs_init__" if attrs_init else "__init__",
-        script,
-        unique_filename,
-        globs,
-    )
-    init.__annotations__ = annotations
-
-    return init
+    return script, globs, annotations
 
 
-def _setattr(attr_name, value_var, has_on_setattr):
+def _setattr(attr_name: str, value_var: str, has_on_setattr: bool) -> str:
     """
     Use the cached object.setattr to set *attr_name* to *value_var*.
     """
     return f"_setattr('{attr_name}', {value_var})"
 
 
-def _setattr_with_converter(attr_name, value_var, has_on_setattr):
+def _setattr_with_converter(
+    attr_name: str, value_var: str, has_on_setattr: bool, converter: Converter
+) -> str:
     """
     Use the cached object.setattr to set *attr_name* to *value_var*, but run
     its converter first.
     """
-    return "_setattr('%s', %s(%s))" % (
-        attr_name,
-        _init_converter_pat % (attr_name,),
-        value_var,
-    )
+    return f"_setattr('{attr_name}', {converter._fmt_converter_call(attr_name, value_var)})"
 
 
-def _assign(attr_name, value, has_on_setattr):
+def _assign(attr_name: str, value: str, has_on_setattr: bool) -> str:
     """
     Unless *attr_name* has an on_setattr hook, use normal assignment. Otherwise
     relegate to _setattr.
@@ -2219,90 +2031,100 @@ def _assign(attr_name, value, has_on_setattr):
     return f"self.{attr_name} = {value}"
 
 
-def _assign_with_converter(attr_name, value_var, has_on_setattr):
+def _assign_with_converter(
+    attr_name: str, value_var: str, has_on_setattr: bool, converter: Converter
+) -> str:
     """
     Unless *attr_name* has an on_setattr hook, use normal assignment after
     conversion. Otherwise relegate to _setattr_with_converter.
     """
     if has_on_setattr:
-        return _setattr_with_converter(attr_name, value_var, True)
+        return _setattr_with_converter(attr_name, value_var, True, converter)
 
-    return "self.%s = %s(%s)" % (
-        attr_name,
-        _init_converter_pat % (attr_name,),
-        value_var,
-    )
+    return f"self.{attr_name} = {converter._fmt_converter_call(attr_name, value_var)}"
 
 
-def _attrs_to_init_script(
-    attrs,
-    frozen,
-    slots,
-    pre_init,
-    pre_init_has_args,
-    post_init,
-    cache_hash,
-    base_attr_map,
-    is_exc,
-    needs_cached_setattr,
-    has_cls_on_setattr,
-    attrs_init,
+def _determine_setters(
+    frozen: bool, slots: bool, base_attr_map: dict[str, type]
 ):
     """
-    Return a script of an initializer for *attrs* and a dict of globals.
+    Determine the correct setter functions based on whether a class is frozen
+    and/or slotted.
+    """
+    if frozen is True:
+        if slots is True:
+            return (), _setattr, _setattr_with_converter
+
+        # Dict frozen classes assign directly to __dict__.
+        # But only if the attribute doesn't come from an ancestor slot
+        # class.
+        # Note _inst_dict will be used again below if cache_hash is True
+
+        def fmt_setter(
+            attr_name: str, value_var: str, has_on_setattr: bool
+        ) -> str:
+            if _is_slot_attr(attr_name, base_attr_map):
+                return _setattr(attr_name, value_var, has_on_setattr)
+
+            return f"_inst_dict['{attr_name}'] = {value_var}"
+
+        def fmt_setter_with_converter(
+            attr_name: str,
+            value_var: str,
+            has_on_setattr: bool,
+            converter: Converter,
+        ) -> str:
+            if has_on_setattr or _is_slot_attr(attr_name, base_attr_map):
+                return _setattr_with_converter(
+                    attr_name, value_var, has_on_setattr, converter
                )
 
-    The globals are expected by the generated script.
+            return f"_inst_dict['{attr_name}'] = {converter._fmt_converter_call(attr_name, value_var)}"
 
-    If *frozen* is True, we cannot set the attributes directly so we use
-    a cached ``object.__setattr__``.
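The removed docstring line above explains the core trick: a frozen class cannot assign attributes directly, so the generated initializer goes through a cached `object.__setattr__`. A self-contained sketch of that technique with a hypothetical class (this is the pattern, not attrs' generated code):

```python
# Frozen + slotted: the class forbids normal assignment, so __init__
# writes through object.__setattr__, bound once to save a lookup per
# assignment.

class FrozenPoint:
    __slots__ = ("x", "y")

    def __setattr__(self, name, value):
        raise AttributeError("frozen instance")

    def __init__(self, x, y):
        # Cache the bound descriptor once, like the generated
        # "_setattr = _cached_setattr_get(self)" line.
        _setattr = object.__setattr__.__get__(self)
        _setattr("x", x)
        _setattr("y", y)
```

`object.__setattr__` bypasses the class's own `__setattr__`, so initialization succeeds while every later assignment still raises.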
-    """
-    lines = []
-    if pre_init:
-        lines.append("self.__attrs_pre_init__()")
+        return (
+            ("_inst_dict = self.__dict__",),
+            fmt_setter,
+            fmt_setter_with_converter,
+        )
+
+    # Not frozen -- we can just assign directly.
+    return (), _assign, _assign_with_converter
+
+
+def _attrs_to_init_script(
+    attrs: list[Attribute],
+    is_frozen: bool,
+    is_slotted: bool,
+    call_pre_init: bool,
+    pre_init_has_args: bool,
+    call_post_init: bool,
+    does_cache_hash: bool,
+    base_attr_map: dict[str, type],
+    is_exc: bool,
+    needs_cached_setattr: bool,
+    has_cls_on_setattr: bool,
+    method_name: str,
+) -> tuple[str, dict, dict]:
+    """
+    Return a script of an initializer for *attrs*, a dict of globals, and
+    annotations for the initializer.
+
+    The globals are required by the generated script.
+    """
+    lines = ["self.__attrs_pre_init__()"] if call_pre_init else []
 
     if needs_cached_setattr:
         lines.append(
             # Circumvent the __setattr__ descriptor to save one lookup per
-            # assignment.
-            # Note _setattr will be used again below if cache_hash is True
+            # assignment. Note _setattr will be used again below if
+            # does_cache_hash is True.
             "_setattr = _cached_setattr_get(self)"
         )
 
-    if frozen is True:
-        if slots is True:
-            fmt_setter = _setattr
-            fmt_setter_with_converter = _setattr_with_converter
-        else:
-            # Dict frozen classes assign directly to __dict__.
-            # But only if the attribute doesn't come from an ancestor slot
-            # class.
-            # Note _inst_dict will be used again below if cache_hash is True
-            lines.append("_inst_dict = self.__dict__")
-
-            def fmt_setter(attr_name, value_var, has_on_setattr):
-                if _is_slot_attr(attr_name, base_attr_map):
-                    return _setattr(attr_name, value_var, has_on_setattr)
-
-                return f"_inst_dict['{attr_name}'] = {value_var}"
-
-            def fmt_setter_with_converter(
-                attr_name, value_var, has_on_setattr
-            ):
-                if has_on_setattr or _is_slot_attr(attr_name, base_attr_map):
-                    return _setattr_with_converter(
-                        attr_name, value_var, has_on_setattr
-                    )
-
-                return "_inst_dict['%s'] = %s(%s)" % (
-                    attr_name,
-                    _init_converter_pat % (attr_name,),
-                    value_var,
-                )
-
-    else:
-        # Not frozen.
-        fmt_setter = _assign
-        fmt_setter_with_converter = _assign_with_converter
+    extra_lines, fmt_setter, fmt_setter_with_converter = _determine_setters(
+        is_frozen, is_slotted, base_attr_map
+    )
+    lines.extend(extra_lines)
 
     args = []
     kw_only_args = []
@@ -2328,19 +2150,26 @@ def fmt_setter_with_converter(
         has_factory = isinstance(a.default, Factory)
         maybe_self = "self" if has_factory and a.default.takes_self else ""
 
+        if a.converter is not None and not isinstance(a.converter, Converter):
+            converter = Converter(a.converter)
+        else:
+            converter = a.converter
+
         if a.init is False:
             if has_factory:
-                init_factory_name = _init_factory_pat % (a.name,)
-                if a.converter is not None:
+                init_factory_name = _INIT_FACTORY_PAT % (a.name,)
+                if converter is not None:
                     lines.append(
                         fmt_setter_with_converter(
                             attr_name,
                             init_factory_name + f"({maybe_self})",
                             has_on_setattr,
+                            converter,
                        )
                    )
-                    conv_name = _init_converter_pat % (a.name,)
-                    names_for_globals[conv_name] = a.converter
+                    names_for_globals[converter._get_global_name(a.name)] = (
+                        converter.converter
+                    )
                else:
                    lines.append(
                        fmt_setter(
@@ -2350,16 +2179,18 @@ def fmt_setter_with_converter(
                        )
                    )
                    names_for_globals[init_factory_name] = a.default.factory
-            elif a.converter is not None:
+            elif converter is not None:
                lines.append(
                    fmt_setter_with_converter(
                        attr_name,
                        f"attr_dict['{attr_name}'].default",
                        has_on_setattr,
+                        converter,
                    )
                )
-                conv_name = _init_converter_pat % (a.name,)
-                names_for_globals[conv_name] = a.converter
+                names_for_globals[converter._get_global_name(a.name)] = (
+                    converter.converter
+                )
            else:
                lines.append(
                    fmt_setter(
@@ -2375,15 +2206,15 @@ def fmt_setter_with_converter(
            else:
                args.append(arg)
 
-            if a.converter is not None:
+            if converter is not None:
                lines.append(
                    fmt_setter_with_converter(
-                        attr_name, arg_name, has_on_setattr
+                        attr_name, arg_name, has_on_setattr, converter
                    )
                )
-                names_for_globals[
-                    _init_converter_pat % (a.name,)
-                ] = a.converter
+                names_for_globals[converter._get_global_name(a.name)] = (
+                    converter.converter
+                )
            else:
                lines.append(fmt_setter(attr_name, arg_name, has_on_setattr))
 
@@ -2395,12 +2226,12 @@ def fmt_setter_with_converter(
            args.append(arg)
            lines.append(f"if {arg_name} is not NOTHING:")
 
-            init_factory_name = _init_factory_pat % (a.name,)
-            if a.converter is not None:
+            init_factory_name = _INIT_FACTORY_PAT % (a.name,)
+            if converter is not None:
                lines.append(
                    "    "
                    + fmt_setter_with_converter(
-                        attr_name, arg_name, has_on_setattr
+                        attr_name, arg_name, has_on_setattr, converter
                    )
                )
                lines.append("else:")
@@ -2410,11 +2241,12 @@ def fmt_setter_with_converter(
                        attr_name,
                        init_factory_name + "(" + maybe_self + ")",
                        has_on_setattr,
+                        converter,
                    )
                )
-                names_for_globals[
-                    _init_converter_pat % (a.name,)
-                ] = a.converter
+                names_for_globals[converter._get_global_name(a.name)] = (
+                    converter.converter
+                )
            else:
                lines.append(
                    "    " + fmt_setter(attr_name, arg_name, has_on_setattr)
@@ -2435,26 +2267,24 @@ def fmt_setter_with_converter(
        else:
            args.append(arg_name)
 
-            if a.converter is not None:
+            if converter is not None:
                lines.append(
                    fmt_setter_with_converter(
-                        attr_name, arg_name, has_on_setattr
+                        attr_name, arg_name, has_on_setattr, converter
                    )
                )
-                names_for_globals[
-                    _init_converter_pat % (a.name,)
-                ] = a.converter
+                names_for_globals[converter._get_global_name(a.name)] = (
+                    converter.converter
+                )
            else:
                lines.append(fmt_setter(attr_name, arg_name, has_on_setattr))
 
        if a.init is True:
-            if a.type is not None and a.converter is None:
+            if a.type is not None and converter is None:
                annotations[arg_name] = a.type
-            elif a.converter is not None:
-                # Try to get the type from the converter.
-                t = _AnnotationExtractor(a.converter).get_first_param_type()
-                if t:
-                    annotations[arg_name] = t
+            elif converter is not None and converter._first_param_type:
+                # Use the type from the converter if present.
+                annotations[arg_name] = converter._first_param_type
 
    if attrs_to_validate:  # we can skip this if there are no validators.
        names_for_globals["_config"] = _config
@@ -2466,25 +2296,23 @@ def fmt_setter_with_converter(
            names_for_globals[val_name] = a.validator
        names_for_globals[attr_name] = a
 
-    if post_init:
+    if call_post_init:
        lines.append("self.__attrs_post_init__()")
 
-    # because this is set only after __attrs_post_init__ is called, a crash
+    # Because this is set only after __attrs_post_init__ is called, a crash
    # will result if post-init tries to access the hash code. This seemed
-    # preferable to setting this beforehand, in which case alteration to
-    # field values during post-init combined with post-init accessing the
-    # hash code would result in silent bugs.
+    # preferable to setting this beforehand, in which case alteration to field
+    # values during post-init combined with post-init accessing the hash code
+    # would result in silent bugs.
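The comment above describes the hash-cache field the generated `__init__` seeds with `None`. A compact sketch of the caching behavior it enables (hypothetical class and field name, not attrs' generated code):

```python
# Hash caching: the cache slot is initialized to None at the very end of
# __init__, and the hash is computed at most once, lazily, in __hash__.

class CachedHash:
    def __init__(self, x, y):
        self.x = x
        self.y = y
        # Set last, so a post-init hook that touched the hash would crash
        # loudly instead of caching a stale value.
        self._hash_cache = None

    def __hash__(self):
        if self._hash_cache is None:
            self._hash_cache = hash((self.__class__, self.x, self.y))
        return self._hash_cache
```

This is also why the docs warn against mutating hashed fields after creation: the cached value is never recomputed.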
-    if cache_hash:
-        if frozen:
-            if slots:  # noqa: SIM108
-                # if frozen and slots, then _setattr defined above
-                init_hash_cache = "_setattr('%s', %s)"
+    if does_cache_hash:
+        if is_frozen:
+            if is_slotted:
+                init_hash_cache = f"_setattr('{_HASH_CACHE_FIELD}', None)"
             else:
-                # if frozen and not slots, then _inst_dict defined above
-                init_hash_cache = "_inst_dict['%s'] = %s"
+                init_hash_cache = f"_inst_dict['{_HASH_CACHE_FIELD}'] = None"
         else:
-            init_hash_cache = "self.%s = %s"
-        lines.append(init_hash_cache % (_hash_cache_field, "None"))
+            init_hash_cache = f"self.{_HASH_CACHE_FIELD} = None"
+        lines.append(init_hash_cache)
 
     # For exceptions we rely on BaseException.__init__ for proper
     # initialization.
@@ -2496,29 +2324,28 @@ def fmt_setter_with_converter(
     args = ", ".join(args)
     pre_init_args = args
     if kw_only_args:
-        args += "%s*, %s" % (
-            ", " if args else "",  # leading comma
-            ", ".join(kw_only_args),  # kw_only args
-        )
+        # leading comma & kw_only args
+        args += f"{', ' if args else ''}*, {', '.join(kw_only_args)}"
         pre_init_kw_only_args = ", ".join(
-            ["%s=%s" % (kw_arg, kw_arg) for kw_arg in kw_only_args]
+            [
+                f"{kw_arg_name}={kw_arg_name}"
+                # We need to remove the defaults from the kw_only_args.
+                for kw_arg_name in (kwa.split("=")[0] for kwa in kw_only_args)
+            ]
        )
-        pre_init_args += (
-            ", " if pre_init_args else ""
-        )  # handle only kwargs and no regular args
+        pre_init_args += ", " if pre_init_args else ""
        pre_init_args += pre_init_kw_only_args
 
-    if pre_init and pre_init_has_args:
-        # If pre init method has arguments, pass same arguments as `__init__`
-        lines[0] = "self.__attrs_pre_init__(%s)" % pre_init_args
+    if call_pre_init and pre_init_has_args:
+        # If pre init method has arguments, pass same arguments as `__init__`.
+        lines[0] = f"self.__attrs_pre_init__({pre_init_args})"
 
+    # Python <3.12 doesn't allow backslashes in f-strings.
+    NL = "\n    "
     return (
-        "def %s(self, %s):\n    %s\n"
-        % (
-            ("__attrs_init__" if attrs_init else "__init__"),
-            args,
-            "\n    ".join(lines) if lines else "pass",
-        ),
+        f"""def {method_name}(self, {args}):
    {NL.join(lines) if lines else "pass"}
""",
         names_for_globals,
         annotations,
     )
 
 
@@ -2543,20 +2370,19 @@ class Attribute:
 
     You should never instantiate this class yourself.
 
-    The class has *all* arguments of `attr.ib` (except for ``factory``
-    which is only syntactic sugar for ``default=Factory(...)`` plus the
-    following:
+    The class has *all* arguments of `attr.ib` (except for ``factory`` which is
+    only syntactic sugar for ``default=Factory(...)`` plus the following:
 
     - ``name`` (`str`): The name of the attribute.
     - ``alias`` (`str`): The __init__ parameter name of the attribute, after
       any explicit overrides and default private-attribute-name handling.
     - ``inherited`` (`bool`): Whether or not that attribute has been inherited
       from a base class.
-    - ``eq_key`` and ``order_key`` (`typing.Callable` or `None`): The callables
-      that are used for comparing and ordering objects by this attribute,
-      respectively. These are set by passing a callable to `attr.ib`'s ``eq``,
-      ``order``, or ``cmp`` arguments. See also :ref:`comparison customization
-      <custom-comparison>`.
+    - ``eq_key`` and ``order_key`` (`typing.Callable` or `None`): The
+      callables that are used for comparing and ordering objects by this
+      attribute, respectively. These are set by passing a callable to
+      `attr.ib`'s ``eq``, ``order``, or ``cmp`` arguments. See also
+      :ref:`comparison customization <custom-comparison>`.
 
     Instances of this class are frequently used for introspection purposes
     like:
@@ -2579,7 +2405,9 @@ class Attribute:
     For the full version history of the fields, see `attr.ib`.
     """
 
-    __slots__ = (
+    # These slots must NOT be reordered because we use them later for
+    # instantiation.
+    __slots__ = (  # noqa: RUF023
         "name",
         "default",
         "validator",
@@ -2625,7 +2453,7 @@ def __init__(
         )
 
         # Cache this descriptor here to speed things up later.
-        bound_setattr = _obj_setattr.__get__(self)
+        bound_setattr = _OBJ_SETATTR.__get__(self)
 
         # Despite the big red warning, people *do* instantiate `Attribute`
         # themselves.
@@ -2645,7 +2473,7 @@ def __init__(
             (
                 types.MappingProxyType(dict(metadata))  # Shallow copy
                 if metadata
-                else _empty_metadata_singleton
+                else _EMPTY_METADATA_SINGLETON
             ),
         )
         bound_setattr("type", type)
@@ -2655,36 +2483,35 @@ def __init__(
         bound_setattr("alias", alias)
 
     def __setattr__(self, name, value):
-        raise FrozenInstanceError()
+        raise FrozenInstanceError
 
     @classmethod
-    def from_counting_attr(cls, name, ca, type=None):
+    def from_counting_attr(cls, name: str, ca: _CountingAttr, type=None):
         # type holds the annotated value. deal with conflicts:
         if type is None:
             type = ca.type
         elif ca.type is not None:
-            msg = "Type annotation and type argument cannot both be present"
+            msg = f"Type annotation and type argument cannot both be present for '{name}'."
             raise ValueError(msg)
-        inst_dict = {
-            k: getattr(ca, k)
-            for k in Attribute.__slots__
-            if k
-            not in (
-                "name",
-                "validator",
-                "default",
-                "type",
-                "inherited",
-            )  # exclude methods and deprecated alias
-        }
         return cls(
-            name=name,
-            validator=ca._validator,
-            default=ca._default,
-            type=type,
-            cmp=None,
-            inherited=False,
-            **inst_dict,
+            name,
+            ca._default,
+            ca._validator,
+            ca.repr,
+            None,
+            ca.hash,
+            ca.init,
+            False,
+            ca.metadata,
+            type,
+            ca.converter,
+            ca.kw_only,
+            ca.eq,
+            ca.eq_key,
+            ca.order,
+            ca.order_key,
+            ca.on_setattr,
+            ca.alias,
         )
 
     # Don't use attrs.evolve since fields(Attribute) doesn't work
     def evolve(self, **changes):
         """
         Copy *self* and apply *changes*.
 
         This works similarly to `attrs.evolve` but that function does not work
-        with `Attribute`.
+        with :class:`attrs.Attribute`.
 
         It is mainly meant to be used for `transform-fields`.
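`Attribute.evolve` implements the copy-with-changes idiom for an immutable record. A minimal stdlib sketch of the same semantics (hypothetical `Field` type, not attrs' `Attribute`):

```python
# Copy-with-changes on an immutable record: take every existing value,
# then apply the keyword overrides. NamedTuple._replace already provides
# exactly these semantics.

from typing import NamedTuple

class Field(NamedTuple):
    name: str
    default: object
    init: bool

def evolve(field, **changes):
    # Returns a new Field; the original is left untouched.
    return field._replace(**changes)

f = Field("x", None, True)
g = evolve(f, init=False)
```

The original instance stays frozen throughout; only the new copy carries the changed values, which is what makes the pattern safe for field transformers.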
@@ -2722,16 +2549,18 @@ def __setstate__(self, state):
         self._setattrs(zip(self.__slots__, state))
 
     def _setattrs(self, name_values_pairs):
-        bound_setattr = _obj_setattr.__get__(self)
+        bound_setattr = _OBJ_SETATTR.__get__(self)
         for name, value in name_values_pairs:
             if name != "metadata":
                 bound_setattr(name, value)
             else:
                 bound_setattr(
                     name,
-                    types.MappingProxyType(dict(value))
-                    if value
-                    else _empty_metadata_singleton,
+                    (
+                        types.MappingProxyType(dict(value))
+                        if value
+                        else _EMPTY_METADATA_SINGLETON
+                    ),
                 )
 
 
@@ -2771,22 +2600,22 @@ class _CountingAttr:
     """
 
     __slots__ = (
-        "counter",
         "_default",
-        "repr",
+        "_validator",
+        "alias",
+        "converter",
+        "counter",
         "eq",
         "eq_key",
-        "order",
-        "order_key",
         "hash",
         "init",
-        "metadata",
-        "_validator",
-        "converter",
-        "type",
         "kw_only",
+        "metadata",
         "on_setattr",
-        "alias",
+        "order",
+        "order_key",
+        "repr",
+        "type",
     )
     __attrs_attrs__ = (
         *tuple(
@@ -2896,12 +2725,13 @@ def default(self, meth):
 
         Returns *meth* unchanged.
 
-        :raises DefaultAlreadySetError: If default has been set before.
+        Raises:
+            DefaultAlreadySetError: If default has been set before.
 
         .. versionadded:: 17.1.0
         """
         if self._default is not NOTHING:
-            raise DefaultAlreadySetError()
+            raise DefaultAlreadySetError
 
         self._default = Factory(meth, takes_self=True)
 
@@ -2918,10 +2748,14 @@ class Factory:
     If passed as the default value to `attrs.field`, the factory is used to
     generate a new value.
 
-    :param callable factory: A callable that takes either none or exactly one
-        mandatory positional argument depending on *takes_self*.
-    :param bool takes_self: Pass the partially initialized instance that is
-        being initialized as a positional argument.
+    Args:
+        factory (typing.Callable):
+            A callable that takes either none or exactly one mandatory
+            positional argument depending on *takes_self*.
+
+        takes_self (bool):
+            Pass the partially initialized instance that is being initialized
+            as a positional argument.
 
     .. versionadded:: 17.1.0  *takes_self*
     """
@@ -2965,35 +2799,177 @@ def __setstate__(self, state):
 
 Factory = _add_hash(_add_eq(_add_repr(Factory, attrs=_f), attrs=_f), attrs=_f)
 
 
+class Converter:
+    """
+    Stores a converter callable.
+
+    Allows for the wrapped converter to take additional arguments. The
+    arguments are passed in the order they are documented.
+
+    Args:
+        converter (Callable): A callable that converts the passed value.
+
+        takes_self (bool):
+            Pass the partially initialized instance that is being initialized
+            as a positional argument. (default: `False`)
+
+        takes_field (bool):
+            Pass the field definition (an :class:`Attribute`) into the
+            converter as a positional argument. (default: `False`)
+
+    .. versionadded:: 24.1.0
+    """
+
+    __slots__ = (
+        "__call__",
+        "_first_param_type",
+        "_global_name",
+        "converter",
+        "takes_field",
+        "takes_self",
+    )
+
+    def __init__(self, converter, *, takes_self=False, takes_field=False):
+        self.converter = converter
+        self.takes_self = takes_self
+        self.takes_field = takes_field
+
+        ex = _AnnotationExtractor(converter)
+        self._first_param_type = ex.get_first_param_type()
+
+        if not (self.takes_self or self.takes_field):
+            self.__call__ = lambda value, _, __: self.converter(value)
+        elif self.takes_self and not self.takes_field:
+            self.__call__ = lambda value, instance, __: self.converter(
+                value, instance
+            )
+        elif not self.takes_self and self.takes_field:
+            self.__call__ = lambda value, __, field: self.converter(
+                value, field
+            )
+        else:
+            self.__call__ = lambda value, instance, field: self.converter(
+                value, instance, field
+            )
+
+        rt = ex.get_return_type()
+        if rt is not None:
+            self.__call__.__annotations__["return"] = rt
+
+    @staticmethod
+    def _get_global_name(attr_name: str) -> str:
+        """
+        Return the name that a converter for an attribute name *attr_name*
+        would have.
+        """
+        return f"__attr_converter_{attr_name}"
+
+    def _fmt_converter_call(self, attr_name: str, value_var: str) -> str:
+        """
+        Return a string that calls the converter for an attribute name
+        *attr_name* and the value in variable named *value_var* according to
+        `self.takes_self` and `self.takes_field`.
+        """
+        if not (self.takes_self or self.takes_field):
+            return f"{self._get_global_name(attr_name)}({value_var})"
+
+        if self.takes_self and self.takes_field:
+            return f"{self._get_global_name(attr_name)}({value_var}, self, attr_dict['{attr_name}'])"
+
+        if self.takes_self:
+            return f"{self._get_global_name(attr_name)}({value_var}, self)"
+
+        return f"{self._get_global_name(attr_name)}({value_var}, attr_dict['{attr_name}'])"
+
+    def __getstate__(self):
+        """
+        Return a dict containing only converter and takes_self -- the rest gets
+        computed when loading.
+        """
+        return {
+            "converter": self.converter,
+            "takes_self": self.takes_self,
+            "takes_field": self.takes_field,
+        }
+
+    def __setstate__(self, state):
+        """
+        Load instance from state.
+        """
+        self.__init__(**state)
+
+
+_f = [
+    Attribute(
+        name=name,
+        default=NOTHING,
+        validator=None,
+        repr=True,
+        cmp=None,
+        eq=True,
+        order=False,
+        hash=True,
+        init=True,
+        inherited=False,
+    )
+    for name in ("converter", "takes_self", "takes_field")
+]
+
+Converter = _add_hash(
+    _add_eq(_add_repr(Converter, attrs=_f), attrs=_f), attrs=_f
+)
+
+
 def make_class(
     name, attrs, bases=(object,), class_body=None, **attributes_arguments
 ):
     r"""
     A quick way to create a new class called *name* with *attrs*.
 
-    :param str name: The name for the new class.
+    .. note::
+
+        ``make_class()`` is a thin wrapper around `attr.s`, not `attrs.define`,
+        which means that it doesn't come with some of the improved defaults.
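The `Converter.__init__` above picks a uniform three-argument adapter once, at construction time, instead of branching on every call. A simplified stand-in (hypothetical `MiniConverter`, not attrs' `Converter`) showing the same dispatch:

```python
# Normalize every converter to a (value, instance, field) signature,
# choosing the adapter lambda based on which extra arguments the wrapped
# callable actually wants.

class MiniConverter:
    def __init__(self, converter, *, takes_self=False, takes_field=False):
        self.converter = converter
        if not (takes_self or takes_field):
            self.call = lambda value, _, __: converter(value)
        elif takes_self and not takes_field:
            self.call = lambda value, instance, __: converter(value, instance)
        elif takes_field and not takes_self:
            self.call = lambda value, __, field: converter(value, field)
        else:
            self.call = lambda value, instance, field: converter(
                value, instance, field
            )

plain = MiniConverter(int)
with_field = MiniConverter(lambda v, field: f"{field}={v}", takes_field=True)
```

Selecting the adapter once keeps the per-call overhead constant regardless of how many flags are set, which is the point of building the lambda in `__init__`.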
+
+        For example, if you want the same ``on_setattr`` behavior as in
+        `attrs.define`, you have to pass the hooks yourself: ``make_class(...,
+        on_setattr=setters.pipe(setters.convert, setters.validate))``
 
-    :param attrs: A list of names or a dictionary of mappings of names to
-        `attr.ib`\ s / `attrs.field`\ s.
+    .. warning::
+
+        It is *your* duty to ensure that the class name and the attribute names
+        are valid identifiers. ``make_class()`` will *not* validate them for
+        you.
+
+    Args:
+        name (str): The name for the new class.
 
-        The order is deduced from the order of the names or attributes inside
-        *attrs*. Otherwise the order of the definition of the attributes is
-        used.
-    :type attrs: `list` or `dict`
+        attrs (list | dict):
+            A list of names or a dictionary of mappings of names to `attr.ib`\
+            s / `attrs.field`\ s.
 
-    :param tuple bases: Classes that the new class will subclass.
+            The order is deduced from the order of the names or attributes
+            inside *attrs*. Otherwise the order of the definition of the
+            attributes is used.
 
-    :param dict class_body: An optional dictionary of class attributes for the new class.
+        bases (tuple[type, ...]): Classes that the new class will subclass.
 
-    :param attributes_arguments: Passed unmodified to `attr.s`.
+        class_body (dict):
+            An optional dictionary of class attributes for the new class.
 
-    :return: A new class with *attrs*.
-    :rtype: type
+        attributes_arguments: Passed unmodified to `attr.s`.
+
+    Returns:
+        type: A new class with *attrs*.
 
     .. versionadded:: 17.1.0 *bases*
     .. versionchanged:: 18.1.0 If *attrs* is ordered, the order is retained.
     .. versionchanged:: 23.2.0 *class_body*
+    .. versionchanged:: 25.2.0 Class names can now be unicode.
     """
+    # Class identifiers are converted into the normal form NFKC while parsing
+    name = unicodedata.normalize("NFKC", name)
+
     if isinstance(attrs, dict):
         cls_dict = attrs
     elif isinstance(attrs, (list, tuple)):
@@ -3039,14 +3015,19 @@ def make_class(
         True,
     )
 
-    return _attrs(these=cls_dict, **attributes_arguments)(type_)
+    cls = _attrs(these=cls_dict, **attributes_arguments)(type_)
+    # Only add type annotations now or "_attrs()" will complain:
+    cls.__annotations__ = {
+        k: v.type for k, v in cls_dict.items() if v.type is not None
+    }
+    return cls
 
 
 # These are required by within this module so we define them here and merely
 # import into .validators / .converters.
 
 
-@attrs(slots=True, hash=True)
+@attrs(slots=True, unsafe_hash=True)
 class _AndValidator:
     """
     Compose many validators to a single one.
@@ -3065,7 +3046,9 @@ def and_(*validators):
 
     When called on a value, it runs all wrapped validators.
 
-    :param callables validators: Arbitrary number of validators.
+    Args:
+        validators (~collections.abc.Iterable[typing.Callable]):
+            Arbitrary number of validators.
 
     .. versionadded:: 17.1.0
     """
@@ -3087,33 +3070,54 @@ def pipe(*converters):
 
     When called on a value, it runs all wrapped converters, returning the
     *last* value.
 
-    Type annotations will be inferred from the wrapped converters', if
-    they have any.
+    Type annotations will be inferred from the wrapped converters', if they
+    have any.
 
-    :param callables converters: Arbitrary number of converters.
+    converters (~collections.abc.Iterable[typing.Callable]):
+        Arbitrary number of converters.
 
     .. versionadded:: 20.1.0
     """
 
-    def pipe_converter(val):
-        for converter in converters:
-            val = converter(val)
+    return_instance = any(isinstance(c, Converter) for c in converters)
+
+    if return_instance:
+
+        def pipe_converter(val, inst, field):
+            for c in converters:
+                val = (
+                    c(val, inst, field) if isinstance(c, Converter) else c(val)
+                )
+
+            return val
+
+    else:
 
-        return val
+        def pipe_converter(val):
+            for c in converters:
+                val = c(val)
+
+            return val
 
     if not converters:
         # If the converter list is empty, pipe_converter is the identity.
-        A = typing.TypeVar("A")
-        pipe_converter.__annotations__ = {"val": A, "return": A}
+        A = TypeVar("A")
+        pipe_converter.__annotations__.update({"val": A, "return": A})
     else:
         # Get parameter type from first converter.
         t = _AnnotationExtractor(converters[0]).get_first_param_type()
         if t:
             pipe_converter.__annotations__["val"] = t
 
+        last = converters[-1]
+        if not PY_3_11_PLUS and isinstance(last, Converter):
+            last = last.__call__
+
         # Get return type from last converter.
-        rt = _AnnotationExtractor(converters[-1]).get_return_type()
+        rt = _AnnotationExtractor(last).get_return_type()
         if rt:
             pipe_converter.__annotations__["return"] = rt
 
+    if return_instance:
+        return Converter(pipe_converter, takes_self=True, takes_field=True)
     return pipe_converter
diff --git a/tools/third_party/attrs/src/attr/_next_gen.py b/tools/third_party/attrs/src/attr/_next_gen.py
index 1fb9f259b53b85..9290664b2dca92 100644
--- a/tools/third_party/attrs/src/attr/_next_gen.py
+++ b/tools/third_party/attrs/src/attr/_next_gen.py
@@ -5,16 +5,15 @@
 default values.
 """
 
-
 from functools import partial
 
 from . import setters
 from ._funcs import asdict as _asdict
 from ._funcs import astuple as _astuple
 from ._make import (
+    _DEFAULT_ON_SETATTR,
     NOTHING,
     _frozen_setattrs,
-    _ng_default_on_setattr,
     attrib,
     attrs,
 )
@@ -46,44 +45,301 @@ def define(
     match_args=True,
 ):
     r"""
-    Define an *attrs* class.
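The plain-function branch of `pipe()` above is a classic converter composition. A self-contained sketch of that core behavior (hypothetical standalone `pipe`, without the `Converter` handling):

```python
# Compose converters left to right: each converter receives the previous
# converter's result, and the final value is returned.

def pipe(*converters):
    def pipe_converter(val):
        for c in converters:
            val = c(val)
        return val
    return pipe_converter

to_int_then_double = pipe(int, lambda v: v * 2)
```

With no converters at all, `pipe_converter` is simply the identity function, which is why the real implementation annotates it with a single `TypeVar` in that case.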
+ A class decorator that adds :term:`dunder methods` according to + :term:`fields ` specified using :doc:`type annotations `, + `field()` calls, or the *these* argument. + + Since *attrs* patches or replaces an existing class, you cannot use + `object.__init_subclass__` with *attrs* classes, because it runs too early. + As a replacement, you can define ``__attrs_init_subclass__`` on your class. + It will be called by *attrs* classes that subclass it after they're + created. See also :ref:`init-subclass`. + + Args: + slots (bool): + Create a :term:`slotted class ` that's more + memory-efficient. Slotted classes are generally superior to the + default dict classes, but have some gotchas you should know about, + so we encourage you to read the :term:`glossary entry `. + + auto_detect (bool): + Instead of setting the *init*, *repr*, *eq*, and *hash* arguments + explicitly, assume they are set to True **unless any** of the + involved methods for one of the arguments is implemented in the + *current* class (meaning, it is *not* inherited from some base + class). + + So, for example by implementing ``__eq__`` on a class yourself, + *attrs* will deduce ``eq=False`` and will create *neither* + ``__eq__`` *nor* ``__ne__`` (but Python classes come with a + sensible ``__ne__`` by default, so it *should* be enough to only + implement ``__eq__`` in most cases). + + Passing True or False` to *init*, *repr*, *eq*, or *hash* + overrides whatever *auto_detect* would determine. 
+ + auto_exc (bool): + If the class subclasses `BaseException` (which implicitly includes + any subclass of any exception), the following happens to behave + like a well-behaved Python exception class: + + - the values for *eq*, *order*, and *hash* are ignored and the + instances compare and hash by the instance's ids [#]_ , + - all attributes that are either passed into ``__init__`` or have a + default value are additionally available as a tuple in the + ``args`` attribute, + - the value of *str* is ignored leaving ``__str__`` to base + classes. + + .. [#] + Note that *attrs* will *not* remove existing implementations of + ``__hash__`` or the equality methods. It just won't add own + ones. + + on_setattr (~typing.Callable | list[~typing.Callable] | None | ~typing.Literal[attrs.setters.NO_OP]): + A callable that is run whenever the user attempts to set an + attribute (either by assignment like ``i.x = 42`` or by using + `setattr` like ``setattr(i, "x", 42)``). It receives the same + arguments as validators: the instance, the attribute that is being + modified, and the new value. + + If no exception is raised, the attribute is set to the return value + of the callable. + + If a list of callables is passed, they're automatically wrapped in + an `attrs.setters.pipe`. + + If left None, the default behavior is to run converters and + validators whenever an attribute is set. + + init (bool): + Create a ``__init__`` method that initializes the *attrs* + attributes. Leading underscores are stripped for the argument name, + unless an alias is set on the attribute. + + .. seealso:: + `init` shows advanced ways to customize the generated + ``__init__`` method, including executing code before and after. + + repr(bool): + Create a ``__repr__`` method with a human readable representation + of *attrs* attributes. + + str (bool): + Create a ``__str__`` method that is identical to ``__repr__``. This + is usually not necessary except for `Exception`\ s. 
+
+        eq (bool | None):
+            If True or None (default), add ``__eq__`` and ``__ne__`` methods
+            that check two instances for equality.
+
+            .. seealso::
+                `comparison` describes how to customize the comparison behavior
+                going as far as comparing NumPy arrays.
+
+        order (bool | None):
+            If True, add ``__lt__``, ``__le__``, ``__gt__``, and ``__ge__``
+            methods that behave like *eq* above and allow instances to be
+            ordered.
+
+            They compare the instances as if they were tuples of their *attrs*
+            attributes if and only if the types of both classes are
+            *identical*.
+
+            If `None`, mirror the value of *eq*.
+
+            .. seealso:: `comparison`
+
+        unsafe_hash (bool | None):
+            If None (default), the ``__hash__`` method is generated according
+            to how *eq* and *frozen* are set.
+
+            1. If *both* are True, *attrs* will generate a ``__hash__`` for
+               you.
+            2. If *eq* is True and *frozen* is False, ``__hash__`` will be set
+               to None, marking it unhashable (which it is).
+            3. If *eq* is False, ``__hash__`` will be left untouched meaning
+               the ``__hash__`` method of the base class will be used. If the
+               base class is `object`, this means it will fall back to id-based
+               hashing.
+
+            Although not recommended, you can decide for yourself and force
+            *attrs* to create one (for example, if the class is immutable even
+            though you didn't freeze it programmatically) by passing True or
+            False explicitly. Both of these cases are rather special and
+            should be used carefully.
+
+            .. seealso::
+
+                - Our documentation on `hashing`,
+                - Python's documentation on `object.__hash__`,
+                - and the `GitHub issue that led to the default \ behavior
+                  `_ for more
+                  details.
+
+        hash (bool | None):
+            Deprecated alias for *unsafe_hash*. *unsafe_hash* takes precedence.
+
+        cache_hash (bool):
+            Ensure that the object's hash code is computed only once and stored
+            on the object. If this is set to True, hashing must be either
+            explicitly or implicitly enabled for this class.
+            If the hash code
+            is cached, avoid any reassignments of fields involved in hash code
+            computation or mutations of the objects those fields point to after
+            object creation. If such changes occur, the behavior of the
+            object's hash code is undefined.
+
+        frozen (bool):
+            Make instances immutable after initialization. If someone attempts
+            to modify a frozen instance, `attrs.exceptions.FrozenInstanceError`
+            is raised.
+
+            .. note::
+
+                1. This is achieved by installing a custom ``__setattr__``
+                   method on your class, so you can't implement your own.
+
+                2. True immutability is impossible in Python.
+
+                3. This *does* have a minor runtime performance `impact
+                   ` when initializing new instances. In other
+                   words: ``__init__`` is slightly slower with ``frozen=True``.
+
+                4. If a class is frozen, you cannot modify ``self`` in
+                   ``__attrs_post_init__`` or a self-written ``__init__``. You
+                   can circumvent that limitation by using
+                   ``object.__setattr__(self, "attribute_name", value)``.
+
+                5. Subclasses of a frozen class are frozen too.
+
+        kw_only (bool):
+            Make all attributes keyword-only in the generated ``__init__`` (if
+            *init* is False, this parameter is ignored).
+
+        weakref_slot (bool):
+            Make instances weak-referenceable. This has no effect unless
+            *slots* is True.
+
+        field_transformer (~typing.Callable | None):
+            A function that is called with the original class object and all
+            fields right before *attrs* finalizes the class. You can use this,
+            for example, to automatically add converters or validators to
+            fields based on their types.
+
+            .. seealso:: `transform-fields`
+
+        match_args (bool):
+            If True (default), set ``__match_args__`` on the class to support
+            :pep:`634` (*Structural Pattern Matching*). It is a tuple of all
+            non-keyword-only ``__init__`` parameter names on Python 3.10 and
+            later. Ignored on older Python versions.
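(Editor's illustration, not part of the upstream diff.) Note 4 above -- frozen classes must bypass their own ``__setattr__`` via ``object.__setattr__`` -- looks like this in plain Python; the toy `Frozen` class raises `AttributeError` where *attrs* would raise `FrozenInstanceError`:

```python
class Frozen:
    # Sketch of the workaround from note 4: __init__ cannot assign normally
    # because the custom __setattr__ below blocks every assignment, so it
    # goes through object.__setattr__ directly.
    def __init__(self, x):
        object.__setattr__(self, "x", x)

    def __setattr__(self, name, value):
        raise AttributeError("instance is frozen")


f = Frozen(1)
assert f.x == 1
try:
    f.x = 2
except AttributeError:
    pass  # normal assignment is blocked, as for attrs' frozen classes
```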
+ + collect_by_mro (bool): + If True, *attrs* collects attributes from base classes correctly + according to the `method resolution order + `_. If False, *attrs* + will mimic the (wrong) behavior of `dataclasses` and :pep:`681`. + + See also `issue #428 + `_. + + getstate_setstate (bool | None): + .. note:: + + This is usually only interesting for slotted classes and you + should probably just set *auto_detect* to True. + + If True, ``__getstate__`` and ``__setstate__`` are generated and + attached to the class. This is necessary for slotted classes to be + pickleable. If left None, it's True by default for slotted classes + and False for dict classes. + + If *auto_detect* is True, and *getstate_setstate* is left None, and + **either** ``__getstate__`` or ``__setstate__`` is detected + directly on the class (meaning: not inherited), it is set to False + (this is usually what you want). + + auto_attribs (bool | None): + If True, look at type annotations to determine which attributes to + use, like `dataclasses`. If False, it will only look for explicit + :func:`field` class attributes, like classic *attrs*. + + If left None, it will guess: - Differences to the classic `attr.s` that it uses underneath: + 1. If any attributes are annotated and no unannotated + `attrs.field`\ s are found, it assumes *auto_attribs=True*. + 2. Otherwise it assumes *auto_attribs=False* and tries to collect + `attrs.field`\ s. - - Automatically detect whether or not *auto_attribs* should be `True` (c.f. - *auto_attribs* parameter). - - Converters and validators run when attributes are set by default -- if - *frozen* is `False`. - - *slots=True* + If *attrs* decides to look at type annotations, **all** fields + **must** be annotated. If *attrs* encounters a field that is set to + a :func:`field` / `attr.ib` but lacks a type annotation, an + `attrs.exceptions.UnannotatedAttributeError` is raised. Use + ``field_name: typing.Any = field(...)`` if you don't want to set a + type. - .. 
caution:: + .. warning:: - Usually this has only upsides and few visible effects in everyday - programming. But it *can* lead to some surprising behaviors, so please - make sure to read :term:`slotted classes`. - - *auto_exc=True* - - *auto_detect=True* - - *order=False* - - Some options that were only relevant on Python 2 or were kept around for - backwards-compatibility have been removed. + For features that use the attribute name to create decorators + (for example, :ref:`validators `), you still *must* + assign :func:`field` / `attr.ib` to them. Otherwise Python will + either not find the name or try to use the default value to + call, for example, ``validator`` on it. - Please note that these are all defaults and you can change them as you - wish. + Attributes annotated as `typing.ClassVar`, and attributes that are + neither annotated nor set to an `field()` are **ignored**. - :param Optional[bool] auto_attribs: If set to `True` or `False`, it behaves - exactly like `attr.s`. If left `None`, `attr.s` will try to guess: + these (dict[str, object]): + A dictionary of name to the (private) return value of `field()` + mappings. This is useful to avoid the definition of your attributes + within the class body because you can't (for example, if you want + to add ``__repr__`` methods to Django models) or don't want to. - 1. If any attributes are annotated and no unannotated `attrs.fields`\ s - are found, it assumes *auto_attribs=True*. - 2. Otherwise it assumes *auto_attribs=False* and tries to collect - `attrs.fields`\ s. + If *these* is not `None`, *attrs* will *not* search the class body + for attributes and will *not* remove any attributes from it. - For now, please refer to `attr.s` for the rest of the parameters. + The order is deduced from the order of the attributes inside + *these*. + + Arguably, this is a rather obscure feature. .. versionadded:: 20.1.0 .. versionchanged:: 21.3.0 Converters are also run ``on_setattr``. .. 
versionadded:: 22.2.0 *unsafe_hash* as an alias for *hash* (for :pep:`681` compliance). + .. versionchanged:: 24.1.0 + Instances are not compared as tuples of attributes anymore, but using a + big ``and`` condition. This is faster and has more correct behavior for + uncomparable values like `math.nan`. + .. versionadded:: 24.1.0 + If a class has an *inherited* classmethod called + ``__attrs_init_subclass__``, it is executed after the class is created. + .. deprecated:: 24.1.0 *hash* is deprecated in favor of *unsafe_hash*. + .. versionadded:: 24.3.0 + Unless already present, a ``__replace__`` method is automatically + created for `copy.replace` (Python 3.13+ only). + + .. note:: + + The main differences to the classic `attr.s` are: + + - Automatically detect whether or not *auto_attribs* should be `True` + (c.f. *auto_attribs* parameter). + - Converters and validators run when attributes are set by default -- + if *frozen* is `False`. + - *slots=True* + + Usually, this has only upsides and few visible effects in everyday + programming. But it *can* lead to some surprising behaviors, so + please make sure to read :term:`slotted classes`. + + - *auto_exc=True* + - *auto_detect=True* + - *order=False* + - Some options that were only relevant on Python 2 or were kept around + for backwards-compatibility have been removed. + """ def do_it(cls, auto_attribs): @@ -124,7 +380,7 @@ def wrap(cls): # By default, mutable classes convert & validate on setattr. if frozen is False and on_setattr is None: - on_setattr = _ng_default_on_setattr + on_setattr = _DEFAULT_ON_SETATTR # However, if we subclass a frozen class, we inherit the immutability # and disable on_setattr. @@ -146,7 +402,7 @@ def wrap(cls): return do_it(cls, False) # maybe_cls's type depends on the usage of the decorator. It's a class - # if it's used as `@attrs` but ``None`` if used as `@attrs()`. + # if it's used as `@attrs` but `None` if used as `@attrs()`. 
if maybe_cls is None: return wrap @@ -175,13 +431,151 @@ def field( alias=None, ): """ - Identical to `attr.ib`, except keyword-only and with some arguments - removed. + Create a new :term:`field` / :term:`attribute` on a class. + + .. warning:: + + Does **nothing** unless the class is also decorated with + `attrs.define` (or similar)! + + Args: + default: + A value that is used if an *attrs*-generated ``__init__`` is used + and no value is passed while instantiating or the attribute is + excluded using ``init=False``. + + If the value is an instance of `attrs.Factory`, its callable will + be used to construct a new value (useful for mutable data types + like lists or dicts). + + If a default is not set (or set manually to `attrs.NOTHING`), a + value *must* be supplied when instantiating; otherwise a + `TypeError` will be raised. + + .. seealso:: `defaults` + + factory (~typing.Callable): + Syntactic sugar for ``default=attr.Factory(factory)``. + + validator (~typing.Callable | list[~typing.Callable]): + Callable that is called by *attrs*-generated ``__init__`` methods + after the instance has been initialized. They receive the + initialized instance, the :func:`~attrs.Attribute`, and the passed + value. + + The return value is *not* inspected so the validator has to throw + an exception itself. + + If a `list` is passed, its items are treated as validators and must + all pass. + + Validators can be globally disabled and re-enabled using + `attrs.validators.get_disabled` / `attrs.validators.set_disabled`. + + The validator can also be set using decorator notation as shown + below. + + .. seealso:: :ref:`validators` + + repr (bool | ~typing.Callable): + Include this attribute in the generated ``__repr__`` method. If + True, include the attribute; if False, omit it. By default, the + built-in ``repr()`` function is used. To override how the attribute + value is formatted, pass a ``callable`` that takes a single value + and returns a string. 
+            Note that the resulting string is used as-is,
+            which means it will be used directly *instead* of calling
+            ``repr()`` (the default).
+
+        eq (bool | ~typing.Callable):
+            If True (default), include this attribute in the generated
+            ``__eq__`` and ``__ne__`` methods that check two instances for
+            equality. To override how the attribute value is compared, pass a
+            callable that takes a single value and returns the value to be
+            compared.
+
+            .. seealso:: `comparison`
+
+        order (bool | ~typing.Callable):
+            If True (default), include this attribute in the generated
+            ``__lt__``, ``__le__``, ``__gt__`` and ``__ge__`` methods. To
+            override how the attribute value is ordered, pass a callable that
+            takes a single value and returns the value to be ordered.
+
+            .. seealso:: `comparison`
+
+        hash (bool | None):
+            Include this attribute in the generated ``__hash__`` method. If
+            None (default), mirror *eq*'s value. This is the correct behavior
+            according to the Python spec. Setting this value to anything other
+            than None is *discouraged*.
+
+            .. seealso:: `hashing`
+
+        init (bool):
+            Include this attribute in the generated ``__init__`` method.
+
+            It is possible to set this to False and set a default value. In
+            that case this attribute is unconditionally initialized with the
+            specified default value or factory.
+
+            .. seealso:: `init`
+
+        converter (typing.Callable | Converter):
+            A callable that is called by *attrs*-generated ``__init__`` methods
+            to convert the attribute's value to the desired format.
+
+            If a vanilla callable is passed, it is given the passed-in value as
+            the only positional argument. It is possible to receive additional
+            arguments by wrapping the callable in a `Converter`.
+
+            Either way, the returned value will be used as the new value of the
+            attribute. The value is converted before being passed to the
+            validator, if any.
+
+            .. seealso:: :ref:`converters`
+
+        metadata (dict | None):
+            An arbitrary mapping, to be used by third-party code.
+
+            .. seealso:: `extending-metadata`.
+
+        type (type):
+            The type of the attribute. Nowadays, the preferred method to
+            specify the type is using a variable annotation (see :pep:`526`).
+            This argument is provided for backwards-compatibility and for usage
+            with `make_class`. Regardless of the approach used, the type will
+            be stored on ``Attribute.type``.
+
+            Please note that *attrs* doesn't do anything with this metadata by
+            itself. You can use it as part of your own code or for `static type
+            checking `.
+
+        kw_only (bool):
+            Make this attribute keyword-only in the generated ``__init__`` (if
+            ``init`` is False, this parameter is ignored).
+
+        on_setattr (~typing.Callable | list[~typing.Callable] | None | ~typing.Literal[attrs.setters.NO_OP]):
+            Allows overwriting the *on_setattr* setting from `attr.s`. If left
+            None, the *on_setattr* value from `attr.s` is used. Set to
+            `attrs.setters.NO_OP` to run **no** `setattr` hooks for this
+            attribute -- regardless of the setting in `define()`.
+
+        alias (str | None):
+            Override this attribute's parameter name in the generated
+            ``__init__`` method. If left None, defaults to ``name`` stripped
+            of leading underscores. See `private-attributes`.
+
+    .. versionadded:: 20.1.0
+    .. versionchanged:: 21.1.0
+       *eq*, *order*, and *cmp* also accept a custom callable
+    .. versionadded:: 22.2.0 *alias*
     .. versionadded:: 23.1.0
        The *type* parameter has been re-added; mostly for `attrs.make_class`.
        Please note that type checkers ignore this metadata.
-    .. versionadded:: 20.1.0
+
+    .. seealso::
+
+        `attr.ib`
     """
     return attrib(
         default=default,
diff --git a/tools/third_party/attrs/src/attr/converters.py b/tools/third_party/attrs/src/attr/converters.py
index 2bf4c902a66fae..0a79deef04282f 100644
--- a/tools/third_party/attrs/src/attr/converters.py
+++ b/tools/third_party/attrs/src/attr/converters.py
@@ -4,11 +4,10 @@
 Commonly useful converters.
""" - import typing from ._compat import _AnnotationExtractor -from ._make import NOTHING, Factory, pipe +from ._make import NOTHING, Converter, Factory, pipe __all__ = [ @@ -22,21 +21,31 @@ def optional(converter): """ A converter that allows an attribute to be optional. An optional attribute - is one which can be set to ``None``. + is one which can be set to `None`. - Type annotations will be inferred from the wrapped converter's, if it - has any. + Type annotations will be inferred from the wrapped converter's, if it has + any. - :param callable converter: the converter that is used for non-``None`` - values. + Args: + converter (typing.Callable): + the converter that is used for non-`None` values. .. versionadded:: 17.1.0 """ - def optional_converter(val): - if val is None: - return None - return converter(val) + if isinstance(converter, Converter): + + def optional_converter(val, inst, field): + if val is None: + return None + return converter(val, inst, field) + + else: + + def optional_converter(val): + if val is None: + return None + return converter(val) xtr = _AnnotationExtractor(converter) @@ -48,24 +57,35 @@ def optional_converter(val): if rt: optional_converter.__annotations__["return"] = typing.Optional[rt] + if isinstance(converter, Converter): + return Converter(optional_converter, takes_self=True, takes_field=True) + return optional_converter def default_if_none(default=NOTHING, factory=None): """ - A converter that allows to replace ``None`` values by *default* or the - result of *factory*. + A converter that allows to replace `None` values by *default* or the result + of *factory*. - :param default: Value to be used if ``None`` is passed. Passing an instance - of `attrs.Factory` is supported, however the ``takes_self`` option - is *not*. - :param callable factory: A callable that takes no parameters whose result - is used if ``None`` is passed. + Args: + default: + Value to be used if `None` is passed. 
Passing an instance of + `attrs.Factory` is supported, however the ``takes_self`` option is + *not*. - :raises TypeError: If **neither** *default* or *factory* is passed. - :raises TypeError: If **both** *default* and *factory* are passed. - :raises ValueError: If an instance of `attrs.Factory` is passed with - ``takes_self=True``. + factory (typing.Callable): + A callable that takes no parameters whose result is used if `None` + is passed. + + Raises: + TypeError: If **neither** *default* or *factory* is passed. + + TypeError: If **both** *default* and *factory* are passed. + + ValueError: + If an instance of `attrs.Factory` is passed with + ``takes_self=True``. .. versionadded:: 18.2.0 """ @@ -104,41 +124,39 @@ def default_if_none_converter(val): def to_bool(val): """ - Convert "boolean" strings (e.g., from env. vars.) to real booleans. + Convert "boolean" strings (for example, from environment variables) to real + booleans. - Values mapping to :code:`True`: + Values mapping to `True`: - - :code:`True` - - :code:`"true"` / :code:`"t"` - - :code:`"yes"` / :code:`"y"` - - :code:`"on"` - - :code:`"1"` - - :code:`1` + - ``True`` + - ``"true"`` / ``"t"`` + - ``"yes"`` / ``"y"`` + - ``"on"`` + - ``"1"`` + - ``1`` - Values mapping to :code:`False`: + Values mapping to `False`: - - :code:`False` - - :code:`"false"` / :code:`"f"` - - :code:`"no"` / :code:`"n"` - - :code:`"off"` - - :code:`"0"` - - :code:`0` + - ``False`` + - ``"false"`` / ``"f"`` + - ``"no"`` / ``"n"`` + - ``"off"`` + - ``"0"`` + - ``0`` - :raises ValueError: for any other value. + Raises: + ValueError: For any other value. .. 
versionadded:: 21.3.0 """ if isinstance(val, str): val = val.lower() - truthy = {True, "true", "t", "yes", "y", "on", "1", 1} - falsy = {False, "false", "f", "no", "n", "off", "0", 0} - try: - if val in truthy: - return True - if val in falsy: - return False - except TypeError: - # Raised when "val" is not hashable (e.g., lists) - pass - msg = f"Cannot convert value to bool: {val}" + + if val in (True, "true", "t", "yes", "y", "on", "1", 1): + return True + if val in (False, "false", "f", "no", "n", "off", "0", 0): + return False + + msg = f"Cannot convert value to bool: {val!r}" raise ValueError(msg) diff --git a/tools/third_party/attrs/src/attr/converters.pyi b/tools/third_party/attrs/src/attr/converters.pyi index 5abb49f6d5a8c3..12bd0c4f17bdc6 100644 --- a/tools/third_party/attrs/src/attr/converters.pyi +++ b/tools/third_party/attrs/src/attr/converters.pyi @@ -1,13 +1,19 @@ -from typing import Callable, TypeVar, overload +from typing import Callable, Any, overload -from . import _ConverterType - -_T = TypeVar("_T") +from attrs import _ConverterType, _CallableConverterType +@overload +def pipe(*validators: _CallableConverterType) -> _CallableConverterType: ... +@overload def pipe(*validators: _ConverterType) -> _ConverterType: ... +@overload +def optional(converter: _CallableConverterType) -> _CallableConverterType: ... +@overload def optional(converter: _ConverterType) -> _ConverterType: ... @overload -def default_if_none(default: _T) -> _ConverterType: ... +def default_if_none(default: Any) -> _CallableConverterType: ... @overload -def default_if_none(*, factory: Callable[[], _T]) -> _ConverterType: ... -def to_bool(val: str) -> bool: ... +def default_if_none( + *, factory: Callable[[], Any] +) -> _CallableConverterType: ... +def to_bool(val: str | int | bool) -> bool: ... 
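(Editor's illustration, not part of the upstream diff.) The rewritten `to_bool` above replaces the old `truthy`/`falsy` sets with two tuple membership tests, which also removes the need for the `TypeError` guard: tuple membership compares with `==` and never hashes the value, so unhashable inputs such as lists simply fall through to the `ValueError`. A stdlib re-implementation mirroring the hunk:

```python
def to_bool(val):
    # Mirror of the hunk above: lowercase strings first, then test the value
    # against the documented truthy and falsy options via tuple membership.
    if isinstance(val, str):
        val = val.lower()

    if val in (True, "true", "t", "yes", "y", "on", "1", 1):
        return True
    if val in (False, "false", "f", "no", "n", "off", "0", 0):
        return False

    msg = f"Cannot convert value to bool: {val!r}"
    raise ValueError(msg)


assert to_bool("YES") is True
assert to_bool(0) is False
try:
    to_bool([])  # unhashable, but tuple membership handles it fine
except ValueError:
    pass
```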
diff --git a/tools/third_party/attrs/src/attr/filters.py b/tools/third_party/attrs/src/attr/filters.py index a1e40c98db853a..689b1705a60ff1 100644 --- a/tools/third_party/attrs/src/attr/filters.py +++ b/tools/third_party/attrs/src/attr/filters.py @@ -1,7 +1,7 @@ # SPDX-License-Identifier: MIT """ -Commonly useful filters for `attr.asdict`. +Commonly useful filters for `attrs.asdict` and `attrs.astuple`. """ from ._make import Attribute @@ -20,13 +20,16 @@ def _split_what(what): def include(*what): """ - Include *what*. + Create a filter that only allows *what*. - :param what: What to include. - :type what: `list` of classes `type`, field names `str` or - `attrs.Attribute`\\ s + Args: + what (list[type, str, attrs.Attribute]): + What to include. Can be a type, a name, or an attribute. - :rtype: `callable` + Returns: + Callable: + A callable that can be passed to `attrs.asdict`'s and + `attrs.astuple`'s *filter* argument. .. versionchanged:: 23.1.0 Accept strings with field names. """ @@ -44,13 +47,16 @@ def include_(attribute, value): def exclude(*what): """ - Exclude *what*. + Create a filter that does **not** allow *what*. - :param what: What to exclude. - :type what: `list` of classes `type`, field names `str` or - `attrs.Attribute`\\ s. + Args: + what (list[type, str, attrs.Attribute]): + What to exclude. Can be a type, a name, or an attribute. - :rtype: `callable` + Returns: + Callable: + A callable that can be passed to `attrs.asdict`'s and + `attrs.astuple`'s *filter* argument. .. versionchanged:: 23.3.0 Accept field name string as input argument """ diff --git a/tools/third_party/attrs/src/attr/filters.pyi b/tools/third_party/attrs/src/attr/filters.pyi index 8a02fa0fc0631d..974abdcdb51152 100644 --- a/tools/third_party/attrs/src/attr/filters.pyi +++ b/tools/third_party/attrs/src/attr/filters.pyi @@ -1,6 +1,6 @@ -from typing import Any, Union +from typing import Any from . 
import Attribute, _FilterType -def include(*what: Union[type, str, Attribute[Any]]) -> _FilterType[Any]: ... -def exclude(*what: Union[type, str, Attribute[Any]]) -> _FilterType[Any]: ... +def include(*what: type | str | Attribute[Any]) -> _FilterType[Any]: ... +def exclude(*what: type | str | Attribute[Any]) -> _FilterType[Any]: ... diff --git a/tools/third_party/attrs/src/attr/setters.py b/tools/third_party/attrs/src/attr/setters.py index 12ed6750df35b9..78b08398a6713f 100644 --- a/tools/third_party/attrs/src/attr/setters.py +++ b/tools/third_party/attrs/src/attr/setters.py @@ -4,7 +4,6 @@ Commonly used hooks for on_setattr. """ - from . import _config from .exceptions import FrozenAttributeError @@ -33,7 +32,7 @@ def frozen(_, __, ___): .. versionadded:: 20.1.0 """ - raise FrozenAttributeError() + raise FrozenAttributeError def validate(instance, attrib, new_value): @@ -56,18 +55,25 @@ def validate(instance, attrib, new_value): def convert(instance, attrib, new_value): """ - Run *attrib*'s converter -- if it has one -- on *new_value* and return the + Run *attrib*'s converter -- if it has one -- on *new_value* and return the result. .. versionadded:: 20.1.0 """ c = attrib.converter if c: - return c(new_value) + # This can be removed once we drop 3.8 and use attrs.Converter instead. + from ._make import Converter + + if not isinstance(c, Converter): + return c(new_value) + + return c(new_value, instance, attrib) return new_value # Sentinel for disabling class-wide *on_setattr* hooks for certain attributes. -# autodata stopped working, so the docstring is inlined in the API docs. +# Sphinx's autodata stopped working, so the docstring is inlined in the API +# docs. 
NO_OP = object() diff --git a/tools/third_party/attrs/src/attr/setters.pyi b/tools/third_party/attrs/src/attr/setters.pyi index 72f7ce4761c343..73abf36e7d5b0f 100644 --- a/tools/third_party/attrs/src/attr/setters.pyi +++ b/tools/third_party/attrs/src/attr/setters.pyi @@ -1,6 +1,7 @@ from typing import Any, NewType, NoReturn, TypeVar -from . import Attribute, _OnSetAttrType +from . import Attribute +from attrs import _OnSetAttrType _T = TypeVar("_T") diff --git a/tools/third_party/attrs/src/attr/validators.py b/tools/third_party/attrs/src/attr/validators.py index 34d6b761d37857..e7b75525022a9d 100644 --- a/tools/third_party/attrs/src/attr/validators.py +++ b/tools/third_party/attrs/src/attr/validators.py @@ -4,7 +4,6 @@ Commonly useful validators. """ - import operator import re @@ -35,7 +34,7 @@ "min_len", "not_", "optional", - "provides", + "or_", "set_disabled", ] @@ -46,8 +45,8 @@ def set_disabled(disabled): By default, they are run. - :param disabled: If ``True``, disable running all validators. - :type disabled: bool + Args: + disabled (bool): If `True`, disable running all validators. .. warning:: @@ -62,8 +61,8 @@ def get_disabled(): """ Return a bool indicating whether validators are currently disabled or not. - :return: ``True`` if validators are currently disabled. - :rtype: bool + Returns: + bool:`True` if validators are currently disabled. .. versionadded:: 21.3.0 """ @@ -88,7 +87,7 @@ def disabled(): set_run_validators(True) -@attrs(repr=False, slots=True, hash=True) +@attrs(repr=False, slots=True, unsafe_hash=True) class _InstanceOfValidator: type = attrib() @@ -97,12 +96,7 @@ def __call__(self, inst, attr, value): We use a callable class to be able to change the ``__repr__``. 
""" if not isinstance(value, self.type): - msg = "'{name}' must be {type!r} (got {value!r} that is a {actual!r}).".format( - name=attr.name, - type=self.type, - actual=value.__class__, - value=value, - ) + msg = f"'{attr.name}' must be {self.type!r} (got {value!r} that is a {value.__class__!r})." raise TypeError( msg, attr, @@ -116,16 +110,17 @@ def __repr__(self): def instance_of(type): """ - A validator that raises a `TypeError` if the initializer is called - with a wrong type for this particular attribute (checks are performed using + A validator that raises a `TypeError` if the initializer is called with a + wrong type for this particular attribute (checks are performed using `isinstance` therefore it's also valid to pass a tuple of types). - :param type: The type to check for. - :type type: type or tuple of type + Args: + type (type | tuple[type]): The type to check for. - :raises TypeError: With a human readable error message, the attribute - (of type `attrs.Attribute`), the expected type, and the value it - got. + Raises: + TypeError: + With a human readable error message, the attribute (of type + `attrs.Attribute`), the expected type, and the value it got. """ return _InstanceOfValidator(type) @@ -140,9 +135,7 @@ def __call__(self, inst, attr, value): We use a callable class to be able to change the ``__repr__``. """ if not self.match_func(value): - msg = "'{name}' must match regex {pattern!r} ({value!r} doesn't)".format( - name=attr.name, pattern=self.pattern.pattern, value=value - ) + msg = f"'{attr.name}' must match regex {self.pattern.pattern!r} ({value!r} doesn't)" raise ValueError( msg, attr, @@ -156,16 +149,21 @@ def __repr__(self): def matches_re(regex, flags=0, func=None): r""" - A validator that raises `ValueError` if the initializer is called - with a string that doesn't match *regex*. + A validator that raises `ValueError` if the initializer is called with a + string that doesn't match *regex*. 
- :param regex: a regex string or precompiled pattern to match against - :param int flags: flags that will be passed to the underlying re function - (default 0) - :param callable func: which underlying `re` function to call. Valid options - are `re.fullmatch`, `re.search`, and `re.match`; the default ``None`` - means `re.fullmatch`. For performance reasons, the pattern is always - precompiled using `re.compile`. + Args: + regex (str, re.Pattern): + A regex string or precompiled pattern to match against + + flags (int): + Flags that will be passed to the underlying re function (default 0) + + func (typing.Callable): + Which underlying `re` function to call. Valid options are + `re.fullmatch`, `re.search`, and `re.match`; the default `None` + means `re.fullmatch`. For performance reasons, the pattern is + always precompiled using `re.compile`. .. versionadded:: 19.2.0 .. versionchanged:: 21.3.0 *regex* can be a pre-compiled pattern. @@ -174,7 +172,7 @@ def matches_re(regex, flags=0, func=None): if func not in valid_funcs: msg = "'func' must be one of {}.".format( ", ".join( - sorted(e and e.__name__ or "None" for e in set(valid_funcs)) + sorted((e and e.__name__) or "None" for e in set(valid_funcs)) ) ) raise ValueError(msg) @@ -197,57 +195,7 @@ def matches_re(regex, flags=0, func=None): return _MatchesReValidator(pattern, match_func) -@attrs(repr=False, slots=True, hash=True) -class _ProvidesValidator: - interface = attrib() - - def __call__(self, inst, attr, value): - """ - We use a callable class to be able to change the ``__repr__``. 
- """ - if not self.interface.providedBy(value): - msg = "'{name}' must provide {interface!r} which {value!r} doesn't.".format( - name=attr.name, interface=self.interface, value=value - ) - raise TypeError( - msg, - attr, - self.interface, - value, - ) - - def __repr__(self): - return f"" - - -def provides(interface): - """ - A validator that raises a `TypeError` if the initializer is called - with an object that does not provide the requested *interface* (checks are - performed using ``interface.providedBy(value)`` (see `zope.interface - `_). - - :param interface: The interface to check for. - :type interface: ``zope.interface.Interface`` - - :raises TypeError: With a human readable error message, the attribute - (of type `attrs.Attribute`), the expected interface, and the - value it got. - - .. deprecated:: 23.1.0 - """ - import warnings - - warnings.warn( - "attrs's zope-interface support is deprecated and will be removed in, " - "or after, April 2024.", - DeprecationWarning, - stacklevel=2, - ) - return _ProvidesValidator(interface) - - -@attrs(repr=False, slots=True, hash=True) +@attrs(repr=False, slots=True, unsafe_hash=True) class _OptionalValidator: validator = attrib() @@ -264,11 +212,13 @@ def __repr__(self): def optional(validator): """ A validator that makes an attribute optional. An optional attribute is one - which can be set to ``None`` in addition to satisfying the requirements of + which can be set to `None` in addition to satisfying the requirements of the sub-validator. - :param Callable | tuple[Callable] | list[Callable] validator: A validator - (or validators) that is used for non-``None`` values. + Args: + validator + (typing.Callable | tuple[typing.Callable] | list[typing.Callable]): + A validator (or validators) that is used for non-`None` values. .. versionadded:: 15.1.0 .. versionchanged:: 17.1.0 *validator* can be a list of validators. 
@@ -280,9 +230,10 @@ def optional(validator): return _OptionalValidator(validator) -@attrs(repr=False, slots=True, hash=True) +@attrs(repr=False, slots=True, unsafe_hash=True) class _InValidator: options = attrib() + _original_options = attrib(hash=False) def __call__(self, inst, attr, value): try: @@ -291,41 +242,54 @@ def __call__(self, inst, attr, value): in_options = False if not in_options: - msg = f"'{attr.name}' must be in {self.options!r} (got {value!r})" + msg = f"'{attr.name}' must be in {self._original_options!r} (got {value!r})" raise ValueError( msg, attr, - self.options, + self._original_options, value, ) def __repr__(self): - return f"" + return f"" def in_(options): """ - A validator that raises a `ValueError` if the initializer is called - with a value that does not belong in the options provided. The check is - performed using ``value in options``. + A validator that raises a `ValueError` if the initializer is called with a + value that does not belong in the *options* provided. + + The check is performed using ``value in options``, so *options* has to + support that operation. - :param options: Allowed options. - :type options: list, tuple, `enum.Enum`, ... + To keep the validator hashable, dicts, lists, and sets are transparently + transformed into a `tuple`. - :raises ValueError: With a human readable error message, the attribute (of - type `attrs.Attribute`), the expected options, and the value it - got. + Args: + options: Allowed options. + + Raises: + ValueError: + With a human readable error message, the attribute (of type + `attrs.Attribute`), the expected options, and the value it got. .. versionadded:: 17.1.0 .. versionchanged:: 22.1.0 The ValueError was incomplete until now and only contained the human readable error message. Now it contains all the information that has been promised since 17.1.0. + .. versionchanged:: 24.1.0 + *options* that are a list, dict, or a set are now transformed into a + tuple to keep the validator hashable. 
""" - return _InValidator(options) + repr_options = options + if isinstance(options, (list, dict, set)): + options = tuple(options) + + return _InValidator(options, repr_options) -@attrs(repr=False, slots=False, hash=True) +@attrs(repr=False, slots=False, unsafe_hash=True) class _IsCallableValidator: def __call__(self, inst, attr, value): """ @@ -350,19 +314,20 @@ def __repr__(self): def is_callable(): """ A validator that raises a `attrs.exceptions.NotCallableError` if the - initializer is called with a value for this particular attribute - that is not callable. + initializer is called with a value for this particular attribute that is + not callable. .. versionadded:: 19.1.0 - :raises attrs.exceptions.NotCallableError: With a human readable error - message containing the attribute (`attrs.Attribute`) name, - and the value it got. + Raises: + attrs.exceptions.NotCallableError: + With a human readable error message containing the attribute + (`attrs.Attribute`) name, and the value it got. """ return _IsCallableValidator() -@attrs(repr=False, slots=True, hash=True) +@attrs(repr=False, slots=True, unsafe_hash=True) class _DeepIterable: member_validator = attrib(validator=is_callable()) iterable_validator = attrib( @@ -395,20 +360,23 @@ def deep_iterable(member_validator, iterable_validator=None): """ A validator that performs deep validation of an iterable. - :param member_validator: Validator(s) to apply to iterable members - :param iterable_validator: Validator to apply to iterable itself - (optional) + Args: + member_validator: Validator to apply to iterable members. - .. versionadded:: 19.1.0 + iterable_validator: + Validator to apply to iterable itself (optional). + + Raises + TypeError: if any sub-validators fail - :raises TypeError: if any sub-validators fail + .. 
versionadded:: 19.1.0 """ if isinstance(member_validator, (list, tuple)): member_validator = and_(*member_validator) return _DeepIterable(member_validator, iterable_validator) -@attrs(repr=False, slots=True, hash=True) +@attrs(repr=False, slots=True, unsafe_hash=True) class _DeepMapping: key_validator = attrib(validator=is_callable()) value_validator = attrib(validator=is_callable()) @@ -426,23 +394,25 @@ def __call__(self, inst, attr, value): self.value_validator(inst, attr, value[key]) def __repr__(self): - return ( - "" - ).format(key=self.key_validator, value=self.value_validator) + return f"" def deep_mapping(key_validator, value_validator, mapping_validator=None): """ A validator that performs deep validation of a dictionary. - :param key_validator: Validator to apply to dictionary keys - :param value_validator: Validator to apply to dictionary values - :param mapping_validator: Validator to apply to top-level mapping - attribute (optional) + Args: + key_validator: Validator to apply to dictionary keys. + + value_validator: Validator to apply to dictionary values. + + mapping_validator: + Validator to apply to top-level mapping attribute (optional). .. versionadded:: 19.1.0 - :raises TypeError: if any sub-validators fail + Raises: + TypeError: if any sub-validators fail """ return _DeepMapping(key_validator, value_validator, mapping_validator) @@ -467,10 +437,13 @@ def __repr__(self): def lt(val): """ - A validator that raises `ValueError` if the initializer is called - with a number larger or equal to *val*. + A validator that raises `ValueError` if the initializer is called with a + number larger or equal to *val*. - :param val: Exclusive upper bound for values + The validator uses `operator.lt` to compare the values. + + Args: + val: Exclusive upper bound for values. .. versionadded:: 21.3.0 """ @@ -479,10 +452,13 @@ def lt(val): def le(val): """ - A validator that raises `ValueError` if the initializer is called - with a number greater than *val*. 
+ A validator that raises `ValueError` if the initializer is called with a + number greater than *val*. - :param val: Inclusive upper bound for values + The validator uses `operator.le` to compare the values. + + Args: + val: Inclusive upper bound for values. .. versionadded:: 21.3.0 """ @@ -491,10 +467,13 @@ def le(val): def ge(val): """ - A validator that raises `ValueError` if the initializer is called - with a number smaller than *val*. + A validator that raises `ValueError` if the initializer is called with a + number smaller than *val*. - :param val: Inclusive lower bound for values + The validator uses `operator.ge` to compare the values. + + Args: + val: Inclusive lower bound for values .. versionadded:: 21.3.0 """ @@ -503,10 +482,13 @@ def ge(val): def gt(val): """ - A validator that raises `ValueError` if the initializer is called - with a number smaller or equal to *val*. + A validator that raises `ValueError` if the initializer is called with a + number smaller or equal to *val*. - :param val: Exclusive lower bound for values + The validator uses `operator.gt` to compare the values. + + Args: + val: Exclusive lower bound for values .. versionadded:: 21.3.0 """ @@ -534,7 +516,8 @@ def max_len(length): A validator that raises `ValueError` if the initializer is called with a string or iterable that is longer than *length*. - :param int length: Maximum length of the string or iterable + Args: + length (int): Maximum length of the string or iterable .. versionadded:: 21.3.0 """ @@ -562,14 +545,15 @@ def min_len(length): A validator that raises `ValueError` if the initializer is called with a string or iterable that is shorter than *length*. - :param int length: Minimum length of the string or iterable + Args: + length (int): Minimum length of the string or iterable ..
versionadded:: 22.1.0 """ return _MinLengthValidator(length) -@attrs(repr=False, slots=True, hash=True) +@attrs(repr=False, slots=True, unsafe_hash=True) class _SubclassOfValidator: type = attrib() @@ -592,21 +576,22 @@ def __repr__(self): def _subclass_of(type): """ - A validator that raises a `TypeError` if the initializer is called - with a wrong type for this particular attribute (checks are performed using + A validator that raises a `TypeError` if the initializer is called with a + wrong type for this particular attribute (checks are performed using `issubclass` therefore it's also valid to pass a tuple of types). - :param type: The type to check for. - :type type: type or tuple of types + Args: + type (type | tuple[type, ...]): The type(s) to check for. - :raises TypeError: With a human readable error message, the attribute - (of type `attrs.Attribute`), the expected type, and the value it - got. + Raises: + TypeError: + With a human readable error message, the attribute (of type + `attrs.Attribute`), the expected type, and the value it got. """ return _SubclassOfValidator(type) -@attrs(repr=False, slots=True, hash=True) +@attrs(repr=False, slots=True, unsafe_hash=True) class _NotValidator: validator = attrib() msg = attrib( @@ -640,12 +625,7 @@ def __call__(self, inst, attr, value): ) def __repr__(self): - return ( - "<not_ validator wrapping {what!r}, capturing {exc_types!r}>" - ).format( - what=self.validator, - exc_types=self.exc_types, - ) + return f"<not_ validator wrapping {self.validator!r}, capturing {self.exc_types!r}>" def not_(validator, *, msg=None, exc_types=(ValueError, TypeError)): @@ -658,19 +638,22 @@ def not_(validator, *, msg=None, exc_types=(ValueError, TypeError)): Intended to be used with existing validators to compose logic without needing to create inverted variants, for example, ``not_(in_(...))``. - :param validator: A validator to be logically inverted. - :param msg: Message to raise if validator fails. - Formatted with keys ``exc_types`` and ``validator``. - :type msg: str - :param exc_types: Exception type(s) to capture.
- Other types raised by child validators will not be intercepted and - pass through. + Args: + validator: A validator to be logically inverted. + + msg (str): + Message to raise if validator fails. Formatted with keys + ``exc_types`` and ``validator``. - :raises ValueError: With a human readable error message, - the attribute (of type `attrs.Attribute`), - the validator that failed to raise an exception, - the value it got, - and the expected exception types. + exc_types (tuple[type, ...]): + Exception type(s) to capture. Other types raised by child + validators will not be intercepted and pass through. + + Raises: + ValueError: + With a human readable error message, the attribute (of type + `attrs.Attribute`), the validator that failed to raise an + exception, the value it got, and the expected exception types. .. versionadded:: 22.2.0 """ @@ -679,3 +662,49 @@ def not_(validator, *, msg=None, exc_types=(ValueError, TypeError)): except TypeError: exc_types = (exc_types,) return _NotValidator(validator, msg, exc_types) + + +@attrs(repr=False, slots=True, unsafe_hash=True) +class _OrValidator: + validators = attrib() + + def __call__(self, inst, attr, value): + for v in self.validators: + try: + v(inst, attr, value) + except Exception: # noqa: BLE001, PERF203, S112 + continue + else: + return + + msg = f"None of {self.validators!r} satisfied for value {value!r}" + raise ValueError(msg) + + def __repr__(self): + return f"<or validator wrapping {self.validators!r}>" + + +def or_(*validators): + """ + A validator that composes multiple validators into one. + + When called on a value, it runs all wrapped validators until one of them is + satisfied. + + Args: + validators (~collections.abc.Iterable[typing.Callable]): + Arbitrary number of validators. + + Raises: + ValueError: + If no validator is satisfied. Raised with a human-readable error + message listing all the wrapped validators and the value that + failed all of them. + + ..
versionadded:: 24.1.0 + """ + vals = [] + for v in validators: + vals.extend(v.validators if isinstance(v, _OrValidator) else [v]) + + return _OrValidator(tuple(vals)) diff --git a/tools/third_party/attrs/src/attr/validators.pyi b/tools/third_party/attrs/src/attr/validators.pyi index d194a75abcacfa..a0fdda7c8773f7 100644 --- a/tools/third_party/attrs/src/attr/validators.pyi +++ b/tools/third_party/attrs/src/attr/validators.pyi @@ -1,3 +1,4 @@ +from types import UnionType from typing import ( Any, AnyStr, @@ -5,20 +6,15 @@ from typing import ( Container, ContextManager, Iterable, - List, Mapping, Match, - Optional, Pattern, - Tuple, - Type, TypeVar, - Union, overload, ) -from . import _ValidatorType -from . import _ValidatorArgType +from attrs import _ValidatorType +from attrs import _ValidatorArgType _T = TypeVar("_T") _T1 = TypeVar("_T1") @@ -36,42 +32,43 @@ def disabled() -> ContextManager[None]: ... # To be more precise on instance_of use some overloads. # If there are more than 3 items in the tuple then we fall back to Any @overload -def instance_of(type: Type[_T]) -> _ValidatorType[_T]: ... +def instance_of(type: type[_T]) -> _ValidatorType[_T]: ... @overload -def instance_of(type: Tuple[Type[_T]]) -> _ValidatorType[_T]: ... +def instance_of(type: tuple[type[_T]]) -> _ValidatorType[_T]: ... @overload def instance_of( - type: Tuple[Type[_T1], Type[_T2]] -) -> _ValidatorType[Union[_T1, _T2]]: ... + type: tuple[type[_T1], type[_T2]], +) -> _ValidatorType[_T1 | _T2]: ... @overload def instance_of( - type: Tuple[Type[_T1], Type[_T2], Type[_T3]] -) -> _ValidatorType[Union[_T1, _T2, _T3]]: ... + type: tuple[type[_T1], type[_T2], type[_T3]], +) -> _ValidatorType[_T1 | _T2 | _T3]: ... @overload -def instance_of(type: Tuple[type, ...]) -> _ValidatorType[Any]: ... -def provides(interface: Any) -> _ValidatorType[Any]: ... +def instance_of(type: tuple[type, ...]) -> _ValidatorType[Any]: ... +@overload +def instance_of(type: UnionType) -> _ValidatorType[Any]: ... 
def optional( - validator: Union[ - _ValidatorType[_T], List[_ValidatorType[_T]], Tuple[_ValidatorType[_T]] - ] -) -> _ValidatorType[Optional[_T]]: ... + validator: ( + _ValidatorType[_T] + | list[_ValidatorType[_T]] + | tuple[_ValidatorType[_T]] + ), +) -> _ValidatorType[_T | None]: ... def in_(options: Container[_T]) -> _ValidatorType[_T]: ... def and_(*validators: _ValidatorType[_T]) -> _ValidatorType[_T]: ... def matches_re( - regex: Union[Pattern[AnyStr], AnyStr], + regex: Pattern[AnyStr] | AnyStr, flags: int = ..., - func: Optional[ - Callable[[AnyStr, AnyStr, int], Optional[Match[AnyStr]]] - ] = ..., + func: Callable[[AnyStr, AnyStr, int], Match[AnyStr] | None] | None = ..., ) -> _ValidatorType[AnyStr]: ... def deep_iterable( member_validator: _ValidatorArgType[_T], - iterable_validator: Optional[_ValidatorType[_I]] = ..., + iterable_validator: _ValidatorType[_I] | None = ..., ) -> _ValidatorType[_I]: ... def deep_mapping( key_validator: _ValidatorType[_K], value_validator: _ValidatorType[_V], - mapping_validator: Optional[_ValidatorType[_M]] = ..., + mapping_validator: _ValidatorType[_M] | None = ..., ) -> _ValidatorType[_M]: ... def is_callable() -> _ValidatorType[_T]: ... def lt(val: _T) -> _ValidatorType[_T]: ... @@ -83,6 +80,7 @@ def min_len(length: int) -> _ValidatorType[_T]: ... def not_( validator: _ValidatorType[_T], *, - msg: Optional[str] = None, - exc_types: Union[Type[Exception], Iterable[Type[Exception]]] = ..., + msg: str | None = None, + exc_types: type[Exception] | Iterable[type[Exception]] = ..., ) -> _ValidatorType[_T]: ... +def or_(*validators: _ValidatorType[_T]) -> _ValidatorType[_T]: ... 
diff --git a/tools/third_party/attrs/src/attrs/__init__.py b/tools/third_party/attrs/src/attrs/__init__.py index 0c2481561a93a9..e8023ff6c5783a 100644 --- a/tools/third_party/attrs/src/attrs/__init__.py +++ b/tools/third_party/attrs/src/attrs/__init__.py @@ -4,7 +4,9 @@ NOTHING, Attribute, AttrsInstance, + Converter, Factory, + NothingType, _make_getattr, assoc, cmp_using, @@ -26,6 +28,12 @@ __all__ = [ + "NOTHING", + "Attribute", + "AttrsInstance", + "Converter", + "Factory", + "NothingType", "__author__", "__copyright__", "__description__", @@ -39,23 +47,19 @@ "asdict", "assoc", "astuple", - "Attribute", - "AttrsInstance", "cmp_using", "converters", "define", "evolve", "exceptions", - "Factory", "field", - "fields_dict", "fields", + "fields_dict", "filters", "frozen", "has", "make_class", "mutable", - "NOTHING", "resolve_types", "setters", "validate", diff --git a/tools/third_party/attrs/src/attrs/__init__.pyi b/tools/third_party/attrs/src/attrs/__init__.pyi index 9372cfea16e897..648fa7a344433d 100644 --- a/tools/third_party/attrs/src/attrs/__init__.pyi +++ b/tools/third_party/attrs/src/attrs/__init__.pyi @@ -1,12 +1,12 @@ +import sys + from typing import ( Any, Callable, - Dict, Mapping, - Optional, Sequence, - Tuple, - Type, + overload, + TypeVar, ) # Because we need to type our own stuff, we have to make everything from @@ -20,48 +20,244 @@ from attr import __title__ as __title__ from attr import __url__ as __url__ from attr import __version__ as __version__ from attr import __version_info__ as __version_info__ -from attr import _FilterType from attr import assoc as assoc from attr import Attribute as Attribute from attr import AttrsInstance as AttrsInstance from attr import cmp_using as cmp_using from attr import converters as converters -from attr import define as define +from attr import Converter as Converter from attr import evolve as evolve from attr import exceptions as exceptions from attr import Factory as Factory -from attr import field as field from 
attr import fields as fields from attr import fields_dict as fields_dict from attr import filters as filters -from attr import frozen as frozen from attr import has as has from attr import make_class as make_class -from attr import mutable as mutable from attr import NOTHING as NOTHING from attr import resolve_types as resolve_types from attr import setters as setters from attr import validate as validate from attr import validators as validators +from attr import attrib, asdict as asdict, astuple as astuple +from attr import NothingType as NothingType + +if sys.version_info >= (3, 11): + from typing import dataclass_transform +else: + from typing_extensions import dataclass_transform + +_T = TypeVar("_T") +_C = TypeVar("_C", bound=type) + +_EqOrderType = bool | Callable[[Any], Any] +_ValidatorType = Callable[[Any, "Attribute[_T]", _T], Any] +_CallableConverterType = Callable[[Any], Any] +_ConverterType = _CallableConverterType | Converter[Any, Any] +_ReprType = Callable[[Any], str] +_ReprArgType = bool | _ReprType +_OnSetAttrType = Callable[[Any, "Attribute[Any]", Any], Any] +_OnSetAttrArgType = _OnSetAttrType | list[_OnSetAttrType] | setters._NoOpType +_FieldTransformer = Callable[ + [type, list["Attribute[Any]"]], list["Attribute[Any]"] +] +# FIXME: in reality, if multiple validators are passed they must be in a list +# or tuple, but those are invariant and so would prevent subtypes of +# _ValidatorType from working when passed in a list or tuple. +_ValidatorArgType = _ValidatorType[_T] | Sequence[_ValidatorType[_T]] + +@overload +def field( + *, + default: None = ..., + validator: None = ..., + repr: _ReprArgType = ..., + hash: bool | None = ..., + init: bool = ..., + metadata: Mapping[Any, Any] | None = ..., + converter: None = ..., + factory: None = ..., + kw_only: bool = ..., + eq: bool | None = ..., + order: bool | None = ..., + on_setattr: _OnSetAttrArgType | None = ..., + alias: str | None = ..., + type: type | None = ..., +) -> Any: ... 
+ +# This form catches an explicit None or no default and infers the type from the +# other arguments. +@overload +def field( + *, + default: None = ..., + validator: _ValidatorArgType[_T] | None = ..., + repr: _ReprArgType = ..., + hash: bool | None = ..., + init: bool = ..., + metadata: Mapping[Any, Any] | None = ..., + converter: _ConverterType + | list[_ConverterType] + | tuple[_ConverterType] + | None = ..., + factory: Callable[[], _T] | None = ..., + kw_only: bool = ..., + eq: _EqOrderType | None = ..., + order: _EqOrderType | None = ..., + on_setattr: _OnSetAttrArgType | None = ..., + alias: str | None = ..., + type: type | None = ..., +) -> _T: ... + +# This form catches an explicit default argument. +@overload +def field( + *, + default: _T, + validator: _ValidatorArgType[_T] | None = ..., + repr: _ReprArgType = ..., + hash: bool | None = ..., + init: bool = ..., + metadata: Mapping[Any, Any] | None = ..., + converter: _ConverterType + | list[_ConverterType] + | tuple[_ConverterType] + | None = ..., + factory: Callable[[], _T] | None = ..., + kw_only: bool = ..., + eq: _EqOrderType | None = ..., + order: _EqOrderType | None = ..., + on_setattr: _OnSetAttrArgType | None = ..., + alias: str | None = ..., + type: type | None = ..., +) -> _T: ... + +# This form covers type=non-Type: e.g. forward references (str), Any +@overload +def field( + *, + default: _T | None = ..., + validator: _ValidatorArgType[_T] | None = ..., + repr: _ReprArgType = ..., + hash: bool | None = ..., + init: bool = ..., + metadata: Mapping[Any, Any] | None = ..., + converter: _ConverterType + | list[_ConverterType] + | tuple[_ConverterType] + | None = ..., + factory: Callable[[], _T] | None = ..., + kw_only: bool = ..., + eq: _EqOrderType | None = ..., + order: _EqOrderType | None = ..., + on_setattr: _OnSetAttrArgType | None = ..., + alias: str | None = ..., + type: type | None = ..., +) -> Any: ... 
+@overload +@dataclass_transform(field_specifiers=(attrib, field)) +def define( + maybe_cls: _C, + *, + these: dict[str, Any] | None = ..., + repr: bool = ..., + unsafe_hash: bool | None = ..., + hash: bool | None = ..., + init: bool = ..., + slots: bool = ..., + frozen: bool = ..., + weakref_slot: bool = ..., + str: bool = ..., + auto_attribs: bool = ..., + kw_only: bool = ..., + cache_hash: bool = ..., + auto_exc: bool = ..., + eq: bool | None = ..., + order: bool | None = ..., + auto_detect: bool = ..., + getstate_setstate: bool | None = ..., + on_setattr: _OnSetAttrArgType | None = ..., + field_transformer: _FieldTransformer | None = ..., + match_args: bool = ..., +) -> _C: ... +@overload +@dataclass_transform(field_specifiers=(attrib, field)) +def define( + maybe_cls: None = ..., + *, + these: dict[str, Any] | None = ..., + repr: bool = ..., + unsafe_hash: bool | None = ..., + hash: bool | None = ..., + init: bool = ..., + slots: bool = ..., + frozen: bool = ..., + weakref_slot: bool = ..., + str: bool = ..., + auto_attribs: bool = ..., + kw_only: bool = ..., + cache_hash: bool = ..., + auto_exc: bool = ..., + eq: bool | None = ..., + order: bool | None = ..., + auto_detect: bool = ..., + getstate_setstate: bool | None = ..., + on_setattr: _OnSetAttrArgType | None = ..., + field_transformer: _FieldTransformer | None = ..., + match_args: bool = ..., +) -> Callable[[_C], _C]: ... -# TODO: see definition of attr.asdict/astuple -def asdict( - inst: AttrsInstance, - recurse: bool = ..., - filter: Optional[_FilterType[Any]] = ..., - dict_factory: Type[Mapping[Any, Any]] = ..., - retain_collection_types: bool = ..., - value_serializer: Optional[ - Callable[[type, Attribute[Any], Any], Any] - ] = ..., - tuple_keys: bool = ..., -) -> Dict[str, Any]: ... 
+mutable = define -# TODO: add support for returning NamedTuple from the mypy plugin -def astuple( - inst: AttrsInstance, - recurse: bool = ..., - filter: Optional[_FilterType[Any]] = ..., - tuple_factory: Type[Sequence[Any]] = ..., - retain_collection_types: bool = ..., -) -> Tuple[Any, ...]: ... +@overload +@dataclass_transform(frozen_default=True, field_specifiers=(attrib, field)) +def frozen( + maybe_cls: _C, + *, + these: dict[str, Any] | None = ..., + repr: bool = ..., + unsafe_hash: bool | None = ..., + hash: bool | None = ..., + init: bool = ..., + slots: bool = ..., + frozen: bool = ..., + weakref_slot: bool = ..., + str: bool = ..., + auto_attribs: bool = ..., + kw_only: bool = ..., + cache_hash: bool = ..., + auto_exc: bool = ..., + eq: bool | None = ..., + order: bool | None = ..., + auto_detect: bool = ..., + getstate_setstate: bool | None = ..., + on_setattr: _OnSetAttrArgType | None = ..., + field_transformer: _FieldTransformer | None = ..., + match_args: bool = ..., +) -> _C: ... +@overload +@dataclass_transform(frozen_default=True, field_specifiers=(attrib, field)) +def frozen( + maybe_cls: None = ..., + *, + these: dict[str, Any] | None = ..., + repr: bool = ..., + unsafe_hash: bool | None = ..., + hash: bool | None = ..., + init: bool = ..., + slots: bool = ..., + frozen: bool = ..., + weakref_slot: bool = ..., + str: bool = ..., + auto_attribs: bool = ..., + kw_only: bool = ..., + cache_hash: bool = ..., + auto_exc: bool = ..., + eq: bool | None = ..., + order: bool | None = ..., + auto_detect: bool = ..., + getstate_setstate: bool | None = ..., + on_setattr: _OnSetAttrArgType | None = ..., + field_transformer: _FieldTransformer | None = ..., + match_args: bool = ..., +) -> Callable[[_C], _C]: ... 
diff --git a/tools/third_party/attrs/tests/dataclass_transform_example.py b/tools/third_party/attrs/tests/dataclass_transform_example.py index c65df14026da81..fb28110e6d57be 100644 --- a/tools/third_party/attrs/tests/dataclass_transform_example.py +++ b/tools/third_party/attrs/tests/dataclass_transform_example.py @@ -15,7 +15,6 @@ class Define: @attr.define() class DefineConverter: - # mypy plugin adapts the "int" method signature, pyright does not with_converter: int = attr.field(converter=int) diff --git a/tools/third_party/attrs/tests/strategies.py b/tools/third_party/attrs/tests/strategies.py index 783058f837f6cd..c745e3f6163eeb 100644 --- a/tools/third_party/attrs/tests/strategies.py +++ b/tools/third_party/attrs/tests/strategies.py @@ -3,6 +3,7 @@ """ Testing strategies for Hypothesis-based tests. """ + import functools import keyword import string @@ -13,8 +14,6 @@ import attr -from attr._compat import PY_3_8_PLUS - from .utils import make_class @@ -75,7 +74,7 @@ def _create_hyp_nested_strategy(draw, simple_class_strategy): bare_attrs = st.builds(attr.ib, default=st.none()) int_attrs = st.integers().map(lambda i: attr.ib(default=i)) str_attrs = st.text().map(lambda s: attr.ib(default=s)) -float_attrs = st.floats().map(lambda f: attr.ib(default=f)) +float_attrs = st.floats(allow_nan=False).map(lambda f: attr.ib(default=f)) dict_attrs = st.dictionaries(keys=st.text(), values=st.integers()).map( lambda d: attr.ib(default=d) ) @@ -145,7 +144,7 @@ class HypClass: be generated, and if `slots=False` is passed in, no slotted classes will be generated. The same applies to `frozen` and `weakref_slot`. - By default, some attributes will be private (i.e. prefixed with an + By default, some attributes will be private (those prefixed with an underscore). If `private_attrs=True` is passed in, all attributes will be private, and if `private_attrs=False`, no attributes will be private. 
""" @@ -189,9 +188,7 @@ def init(self, *args, **kwargs): cls_dict["__init__"] = init bases = (object,) - if cached_property or ( - PY_3_8_PLUS and cached_property is None and cached_property_flag - ): + if cached_property or (cached_property is None and cached_property_flag): class BaseWithCachedProperty: @functools.cached_property diff --git a/tools/third_party/attrs/tests/test_abc.py b/tools/third_party/attrs/tests/test_abc.py index a70b317a3cee1e..5751b284819c34 100644 --- a/tools/third_party/attrs/tests/test_abc.py +++ b/tools/third_party/attrs/tests/test_abc.py @@ -7,10 +7,12 @@ import attrs -from attr._compat import PY310, PY_3_12_PLUS +from attr._compat import PY_3_10_PLUS, PY_3_12_PLUS -@pytest.mark.skipif(not PY310, reason="abc.update_abstractmethods is 3.10+") +@pytest.mark.skipif( + not PY_3_10_PLUS, reason="abc.update_abstractmethods is 3.10+" +) class TestUpdateAbstractMethods: def test_abc_implementation(self, slots): """ diff --git a/tools/third_party/attrs/tests/test_annotations.py b/tools/third_party/attrs/tests/test_annotations.py index d27d9e3743fe93..5c6296634f9cd1 100644 --- a/tools/third_party/attrs/tests/test_annotations.py +++ b/tools/third_party/attrs/tests/test_annotations.py @@ -11,7 +11,9 @@ import pytest import attr +import attrs +from attr._compat import PY_3_14_PLUS from attr._make import _is_class_var from attr.exceptions import UnannotatedAttributeError @@ -60,7 +62,7 @@ class C: x: int = attr.ib(type=int) assert ( - "Type annotation and type argument cannot both be present", + "Type annotation and type argument cannot both be present for 'x'.", ) == e.value.args def test_typing_annotations(self): @@ -106,7 +108,7 @@ def test_auto_attribs(self, slots): class C: cls_var: typing.ClassVar[int] = 23 a: int - x: typing.List[int] = attr.Factory(list) + x: typing.List[int] = attrs.Factory(list) y: int = 2 z: int = attr.ib(default=3) foo: typing.Any = None @@ -120,15 +122,15 @@ class C: attr.resolve_types(C) - assert int == 
attr.fields(C).a.type + assert int is attr.fields(C).a.type assert attr.Factory(list) == attr.fields(C).x.default - assert typing.List[int] == attr.fields(C).x.type + assert typing.List[int] is attr.fields(C).x.type - assert int == attr.fields(C).y.type + assert int is attr.fields(C).y.type assert 2 == attr.fields(C).y.default - assert int == attr.fields(C).z.type + assert int is attr.fields(C).z.type assert typing.Any == attr.fields(C).foo.type @@ -306,8 +308,11 @@ def test_pipe_empty(self): """ p = attr.converters.pipe() + assert "val" in p.__annotations__ + t = p.__annotations__["val"] + assert isinstance(t, typing.TypeVar) assert p.__annotations__ == {"val": t, "return": t} @@ -403,7 +408,7 @@ class C: cls_var2: "ClassVar[int]" = 23 cls_var3: "t.ClassVar[int]" = 23 a: "int" - x: "typing.List[int]" = attr.Factory(list) + x: "typing.List[int]" = attrs.Factory(list) y: "int" = 2 z: "int" = attr.ib(default=3) foo: "typing.Any" = None @@ -496,7 +501,7 @@ def __eq__(self, other): @attr.s(auto_attribs=True) class C: - x: typing.Any = NonComparable() + x: typing.Any = NonComparable() # noqa: RUF009 def test_basic_resolve(self): """ @@ -534,7 +539,7 @@ class C: attr.resolve_types(C, globals) - assert attr.fields(C).x.type == Annotated[float, "test"] + assert Annotated[float, "test"] is attr.fields(C).x.type @attr.define class D: @@ -542,7 +547,7 @@ class D: attr.resolve_types(D, globals, include_extras=False) - assert attr.fields(D).x.type == float + assert float is attr.fields(D).x.type def test_resolve_types_auto_attrib(self, slots): """ @@ -583,6 +588,8 @@ def test_self_reference(self, slots): """ References to self class using quotes can be resolved. """ + if PY_3_14_PLUS and not slots: + pytest.xfail("References are changing a lot in 3.14.") @attr.s(slots=slots, auto_attribs=True) class A: @@ -598,6 +605,8 @@ def test_forward_reference(self, slots): """ Forward references can be resolved. 
""" + if PY_3_14_PLUS and not slots: + pytest.xfail("Forward references are changing a lot in 3.14.") @attr.s(slots=slots, auto_attribs=True) class A: @@ -658,8 +667,8 @@ class B(A): attr.resolve_types(A) attr.resolve_types(B) - assert int == attr.fields(A).n.type - assert int == attr.fields(B).n.type + assert int is attr.fields(A).n.type + assert int is attr.fields(B).n.type def test_resolve_twice(self): """ @@ -672,9 +681,12 @@ class A: n: "int" attr.resolve_types(A) - assert int == attr.fields(A).n.type + + assert int is attr.fields(A).n.type + attr.resolve_types(A) - assert int == attr.fields(A).n.type + + assert int is attr.fields(A).n.type @pytest.mark.parametrize( diff --git a/tools/third_party/attrs/tests/test_cmp.py b/tools/third_party/attrs/tests/test_cmp.py index 07bfc5234ade51..8edfbd86b88881 100644 --- a/tools/third_party/attrs/tests/test_cmp.py +++ b/tools/third_party/attrs/tests/test_cmp.py @@ -4,10 +4,10 @@ Tests for methods from `attrib._cmp`. """ - import pytest from attr._cmp import cmp_using +from attr._compat import PY_3_13_PLUS # Test parameters. @@ -54,6 +54,9 @@ cmp_data = eq_data + order_data cmp_ids = eq_ids + order_ids +# Compiler strips indents from docstrings in Python 3.13+ +indent = "" if PY_3_13_PLUS else " " * 8 + class TestEqOrder: """ @@ -325,7 +328,7 @@ def test_ne(self): method = self.cls.__ne__ assert method.__doc__.strip() == ( "Check equality and either forward a NotImplemented or\n" - " return the result negated." + f"{'' if PY_3_13_PLUS else ' ' * 4}return the result negated." ) assert method.__name__ == "__ne__" @@ -393,7 +396,7 @@ def test_ne(self): method = self.cls.__ne__ assert method.__doc__.strip() == ( "Check equality and either forward a NotImplemented or\n" - " return the result negated." + f"{'' if PY_3_13_PLUS else ' ' * 4}return the result negated." 
) assert method.__name__ == "__ne__" @@ -465,7 +468,7 @@ def test_ne(self): method = self.cls.__ne__ assert method.__doc__.strip() == ( "Check equality and either forward a NotImplemented or\n" - " return the result negated." + f"{'' if PY_3_13_PLUS else ' ' * 4}return the result negated." ) assert method.__name__ == "__ne__" diff --git a/tools/third_party/attrs/tests/test_compat.py b/tools/third_party/attrs/tests/test_compat.py index c8015b596e2b07..a7cbf3e07caabb 100644 --- a/tools/third_party/attrs/tests/test_compat.py +++ b/tools/third_party/attrs/tests/test_compat.py @@ -2,6 +2,8 @@ import types +from typing import Protocol + import pytest import attr @@ -59,6 +61,5 @@ def test_attrsinstance_subclass_protocol(): It's possible to subclass AttrsInstance and Protocol at once. """ - class Foo(attr.AttrsInstance, attr._compat.Protocol): - def attribute(self) -> int: - ... + class Foo(attr.AttrsInstance, Protocol): + def attribute(self) -> int: ... diff --git a/tools/third_party/attrs/tests/test_config.py b/tools/third_party/attrs/tests/test_config.py index 6c78fd295b5382..9b62e9d9ce5ba3 100644 --- a/tools/third_party/attrs/tests/test_config.py +++ b/tools/third_party/attrs/tests/test_config.py @@ -4,7 +4,6 @@ Tests for `attr._config`. """ - import pytest from attr import _config diff --git a/tools/third_party/attrs/tests/test_converters.py b/tools/third_party/attrs/tests/test_converters.py index 7607e555066c7e..5726ae210a1214 100644 --- a/tools/third_party/attrs/tests/test_converters.py +++ b/tools/third_party/attrs/tests/test_converters.py @@ -4,15 +4,134 @@ Tests for `attr.converters`. 
""" +import pickle import pytest import attr -from attr import Factory, attrib +from attr import Converter, Factory, attrib +from attr._compat import _AnnotationExtractor from attr.converters import default_if_none, optional, pipe, to_bool +class TestConverter: + @pytest.mark.parametrize("takes_self", [True, False]) + @pytest.mark.parametrize("takes_field", [True, False]) + def test_pickle(self, takes_self, takes_field): + """ + Wrapped converters can be pickled. + """ + c = Converter(int, takes_self=takes_self, takes_field=takes_field) + + new_c = pickle.loads(pickle.dumps(c)) + + assert c == new_c + assert takes_self == new_c.takes_self + assert takes_field == new_c.takes_field + assert c.__call__.__name__ == new_c.__call__.__name__ + + @pytest.mark.parametrize( + "scenario", + [ + ((False, False), "__attr_converter_le_name(le_value)"), + ( + (True, True), + "__attr_converter_le_name(le_value, self, attr_dict['le_name'])", + ), + ( + (True, False), + "__attr_converter_le_name(le_value, self)", + ), + ( + (False, True), + "__attr_converter_le_name(le_value, attr_dict['le_name'])", + ), + ], + ) + def test_fmt_converter_call(self, scenario): + """ + _fmt_converter_call determines the arguments to the wrapped converter + according to `takes_self` and `takes_field`. + """ + (takes_self, takes_field), expect = scenario + + c = Converter(None, takes_self=takes_self, takes_field=takes_field) + + assert expect == c._fmt_converter_call("le_name", "le_value") + + def test_works_as_adapter(self): + """ + Converter instances work as adapters and pass the correct arguments to + the wrapped converter callable. 
+ """ + taken = None + instance = object() + field = object() + + def save_args(*args): + nonlocal taken + taken = args + return args[0] + + Converter(save_args)(42, instance, field) + + assert (42,) == taken + + Converter(save_args, takes_self=True)(42, instance, field) + + assert (42, instance) == taken + + Converter(save_args, takes_field=True)(42, instance, field) + + assert (42, field) == taken + + Converter(save_args, takes_self=True, takes_field=True)( + 42, instance, field + ) + + assert (42, instance, field) == taken + + def test_annotations_if_last_in_pipe(self): + """ + If the wrapped converter has annotations, they are copied to the + Converter __call__. + """ + + def wrapped(_, __, ___) -> float: + pass + + c = Converter(wrapped) + + assert float is c.__call__.__annotations__["return"] + + # Doesn't overwrite globally. + + c2 = Converter(int) + + assert float is c.__call__.__annotations__["return"] + assert None is c2.__call__.__annotations__.get("return") + + def test_falsey_converter(self): + """ + Passing a false-y instance still produces a valid converter. + """ + + class MyConv: + def __bool__(self): + return False + + def __call__(self, value): + return value * 2 + + @attr.s + class C: + a = attrib(converter=MyConv()) + + c = C(21) + assert 42 == c.a + + class TestOptional: """ Tests for `optional`. @@ -43,6 +162,14 @@ def test_fail(self): with pytest.raises(ValueError): c("not_an_int") + def test_converter_instance(self): + """ + Works when passed a Converter instance as argument. + """ + c = optional(Converter(to_bool)) + + assert True is c("yes", None, None) + class TestDefaultIfNone: def test_missing_default(self): @@ -105,9 +232,13 @@ def test_success(self): """ Succeeds if all wrapped converters succeed. 
""" - c = pipe(str, to_bool, bool) + c = pipe(str, Converter(to_bool), bool) - assert True is c("True") is c(True) + assert ( + True + is c.converter("True", None, None) + is c.converter(True, None, None) + ) def test_fail(self): """ @@ -144,6 +275,71 @@ def test_empty(self): assert o is pipe()(o) + def test_wrapped_annotation(self): + """ + The return type of the wrapped converter is copied into its __call__ + and ultimately into pipe's wrapped converter. + """ + + def last(value) -> bool: + return bool(value) + + @attr.s + class C: + x = attr.ib(converter=[Converter(int), Converter(last)]) + + i = C(5) + + assert True is i.x + assert ( + bool + is _AnnotationExtractor( + attr.fields(C).x.converter.__call__ + ).get_return_type() + ) + + +class TestOptionalPipe: + def test_optional(self): + """ + Nothing happens if None. + """ + c = optional(pipe(str, Converter(to_bool), bool)) + + assert None is c.converter(None, None, None) + + def test_pipe(self): + """ + A value is given, run it through all wrapped converters. + """ + c = optional(pipe(str, Converter(to_bool), bool)) + + assert ( + True + is c.converter("True", None, None) + is c.converter(True, None, None) + ) + + def test_instance(self): + """ + Should work when set as an attrib. + """ + + @attr.s + class C: + x = attrib( + converter=optional(pipe(str, Converter(to_bool), bool)), + default=None, + ) + + c1 = C() + + assert None is c1.x + + c2 = C("True") + + assert True is c2.x + class TestToBool: def test_unhashable(self): diff --git a/tools/third_party/attrs/tests/test_dunders.py b/tools/third_party/attrs/tests/test_dunders.py index d0d289d84c9f15..b98929c982163e 100644 --- a/tools/third_party/attrs/tests/test_dunders.py +++ b/tools/third_party/attrs/tests/test_dunders.py @@ -4,7 +4,6 @@ Tests for dunder methods from `attrib._make`. 
""" - import copy import inspect import pickle @@ -20,8 +19,8 @@ NOTHING, Factory, _add_repr, - _is_slot_cls, - _make_init, + _compile_and_eval, + _make_init_script, fields, make_class, ) @@ -65,16 +64,16 @@ class OrderCallableCSlots: # HashC is hashable by explicit definition while HashCSlots is hashable # implicitly. The "Cached" versions are the same, except with hash code # caching enabled -HashC = simple_class(hash=True) -HashCSlots = simple_class(hash=None, eq=True, frozen=True, slots=True) -HashCCached = simple_class(hash=True, cache_hash=True) +HashC = simple_class(unsafe_hash=True) +HashCSlots = simple_class(unsafe_hash=None, eq=True, frozen=True, slots=True) +HashCCached = simple_class(unsafe_hash=True, cache_hash=True) HashCSlotsCached = simple_class( - hash=None, eq=True, frozen=True, slots=True, cache_hash=True + unsafe_hash=None, eq=True, frozen=True, slots=True, cache_hash=True ) # the cached hash code is stored slightly differently in this case # so it needs to be tested separately HashCFrozenNotSlotsCached = simple_class( - frozen=True, slots=False, hash=True, cache_hash=True + frozen=True, slots=False, unsafe_hash=True, cache_hash=True ) @@ -87,22 +86,27 @@ def _add_init(cls, frozen): """ has_pre_init = bool(getattr(cls, "__attrs_pre_init__", False)) - cls.__init__ = _make_init( + script, globs, annots = _make_init_script( cls, cls.__attrs_attrs__, has_pre_init, - len(inspect.signature(cls.__attrs_pre_init__).parameters) > 1 - if has_pre_init - else False, + ( + len(inspect.signature(cls.__attrs_pre_init__).parameters) > 1 + if has_pre_init + else False + ), getattr(cls, "__attrs_post_init__", False), frozen, - _is_slot_cls(cls), + "__slots__" in cls.__dict__, cache_hash=False, base_attr_map={}, is_exc=False, cls_on_setattr=None, attrs_init=False, ) + _compile_and_eval(script, globs, filename="__init__") + cls.__init__ = globs["__init__"] + cls.__init__.__annotations__ = annots return cls @@ -442,17 +446,17 @@ def test_str_no_repr(self): # these 
are for use in TestAddHash.test_cache_hash_serialization # they need to be out here so they can be un-pickled -@attr.attrs(hash=True, cache_hash=False) +@attr.attrs(unsafe_hash=True, cache_hash=False) class HashCacheSerializationTestUncached: foo_value = attr.ib() -@attr.attrs(hash=True, cache_hash=True) +@attr.attrs(unsafe_hash=True, cache_hash=True) class HashCacheSerializationTestCached: foo_value = attr.ib() -@attr.attrs(slots=True, hash=True, cache_hash=True) +@attr.attrs(slots=True, unsafe_hash=True, cache_hash=True) class HashCacheSerializationTestCachedSlots: foo_value = attr.ib() @@ -480,12 +484,12 @@ def test_enforces_type(self): exc_args = ("Invalid value for hash. Must be True, False, or None.",) with pytest.raises(TypeError) as e: - make_class("C", {}, hash=1), + make_class("C", {}, unsafe_hash=1) assert exc_args == e.value.args with pytest.raises(TypeError) as e: - make_class("C", {"a": attr.ib(hash=1)}), + make_class("C", {"a": attr.ib(hash=1)}) assert exc_args == e.value.args @@ -500,13 +504,18 @@ def test_enforce_no_cache_hash_without_hash(self): "enabled.", ) with pytest.raises(TypeError) as e: - make_class("C", {}, hash=False, cache_hash=True) + make_class("C", {}, unsafe_hash=False, cache_hash=True) assert exc_args == e.value.args # unhashable case with pytest.raises(TypeError) as e: make_class( - "C", {}, hash=None, eq=True, frozen=False, cache_hash=True + "C", + {}, + unsafe_hash=None, + eq=True, + frozen=False, + cache_hash=True, ) assert exc_args == e.value.args @@ -520,7 +529,7 @@ def test_enforce_no_cached_hash_without_init(self): " init must be True.", ) with pytest.raises(TypeError) as e: - make_class("C", {}, init=False, hash=True, cache_hash=True) + make_class("C", {}, init=False, unsafe_hash=True, cache_hash=True) assert exc_args == e.value.args @given(booleans(), booleans()) @@ -532,7 +541,7 @@ def test_hash_attribute(self, slots, cache_hash): "C", {"a": attr.ib(hash=False), "b": attr.ib()}, slots=slots, - hash=True, + 
unsafe_hash=True, cache_hash=cache_hash, ) @@ -628,13 +637,13 @@ def __hash__(self): Uncached = make_class( "Uncached", {"hash_counter": attr.ib(factory=HashCounter)}, - hash=True, + unsafe_hash=True, cache_hash=False, ) Cached = make_class( "Cached", {"hash_counter": attr.ib(factory=HashCounter)}, - hash=True, + unsafe_hash=True, cache_hash=True, ) @@ -659,7 +668,7 @@ def test_copy_hash_cleared(self, cache_hash, frozen, slots): # Give it an explicit hash if we don't have an implicit one if not frozen: - kwargs["hash"] = True + kwargs["unsafe_hash"] = True @attr.s(**kwargs) class C: @@ -710,7 +719,7 @@ def test_copy_two_arg_reduce(self, frozen): __reduce__ generated when cache_hash=True works in that case. """ - @attr.s(frozen=frozen, cache_hash=True, hash=True) + @attr.s(frozen=frozen, cache_hash=True, unsafe_hash=True) class C: x = attr.ib() @@ -835,6 +844,29 @@ class C: assert [] == i.a assert isinstance(i.b, D) + def test_factory_takes_self(self): + """ + If takes_self on factories is True, self is passed. + """ + C = make_class( + "C", + { + "x": attr.ib( + default=Factory((lambda self: self), takes_self=True) + ) + }, + ) + + i = C() + + assert i is i.x + + def test_factory_hashable(self): + """ + Factory is hashable. + """ + assert hash(Factory(None, False)) == hash(Factory(None, False)) + def test_validator(self): """ If a validator is passed, call it with the preliminary instance, the @@ -941,7 +973,7 @@ def test_false(self): assert False is bool(NOTHING) -@attr.s(hash=True, order=True) +@attr.s(unsafe_hash=True, order=True) class C: pass @@ -950,7 +982,7 @@ class C: OriginalC = C -@attr.s(hash=True, order=True) +@attr.s(unsafe_hash=True, order=True) class C: pass @@ -958,9 +990,11 @@ class C: CopyC = C -@attr.s(hash=True, order=True) +@attr.s(unsafe_hash=True, order=True) class C: - """A different class, to generate different methods.""" + """ + A different class, to generate different methods. 
+ """ a = attr.ib() @@ -972,37 +1006,37 @@ def test_filenames(self): """ assert ( OriginalC.__init__.__code__.co_filename - == "" + == "" ) assert ( OriginalC.__eq__.__code__.co_filename - == "" + == "" ) assert ( OriginalC.__hash__.__code__.co_filename - == "" + == "" ) assert ( CopyC.__init__.__code__.co_filename - == "" + == "" ) assert ( CopyC.__eq__.__code__.co_filename - == "" + == "" ) assert ( CopyC.__hash__.__code__.co_filename - == "" + == "" ) assert ( C.__init__.__code__.co_filename - == "" + == "" ) assert ( C.__eq__.__code__.co_filename - == "" + == "" ) assert ( C.__hash__.__code__.co_filename - == "" + == "" ) diff --git a/tools/third_party/attrs/tests/test_filters.py b/tools/third_party/attrs/tests/test_filters.py index 6d237fdc3d13d6..08314fa884015e 100644 --- a/tools/third_party/attrs/tests/test_filters.py +++ b/tools/third_party/attrs/tests/test_filters.py @@ -4,7 +4,6 @@ Tests for `attr.filters`. """ - import pytest import attr diff --git a/tools/third_party/attrs/tests/test_funcs.py b/tools/third_party/attrs/tests/test_funcs.py index 044aaab2c94a8e..67c9bc9d4f05dc 100644 --- a/tools/third_party/attrs/tests/test_funcs.py +++ b/tools/third_party/attrs/tests/test_funcs.py @@ -600,9 +600,7 @@ def test_unknown(self, C): AttrsAttributeNotFoundError. """ # No generated class will have a four letter attribute. - with pytest.raises( - AttrsAttributeNotFoundError - ) as e, pytest.deprecated_call(): + with pytest.raises(AttrsAttributeNotFoundError) as e: assoc(C(), aaaa=2) assert (f"aaaa is not an attrs attribute on {C!r}.",) == e.value.args @@ -783,26 +781,13 @@ class Cls2: obj1a, param1=obj2b ) - def test_inst_kw(self): - """ - If `inst` is passed per kw argument, a warning is raised. - See #1109 - """ - - @attr.s - class C: - pass - - with pytest.warns(DeprecationWarning) as wi: - evolve(inst=C()) - - assert __file__ == wi.list[0].filename - def test_no_inst(self): """ Missing inst argument raises a TypeError like Python would. 
""" - with pytest.raises(TypeError, match=r"evolve\(\) missing 1"): + with pytest.raises( + TypeError, match=r"evolve\(\) takes 1 positional argument" + ): evolve(x=1) def test_too_many_pos_args(self): diff --git a/tools/third_party/attrs/tests/test_functional.py b/tools/third_party/attrs/tests/test_functional.py index 341ee50a82ae67..7b0317d19da765 100644 --- a/tools/third_party/attrs/tests/test_functional.py +++ b/tools/third_party/attrs/tests/test_functional.py @@ -4,7 +4,7 @@ End-to-end tests. """ - +import copy import inspect import pickle @@ -17,6 +17,7 @@ import attr +from attr._compat import PY_3_13_PLUS from attr._make import NOTHING, Attribute from attr.exceptions import FrozenInstanceError @@ -335,7 +336,7 @@ def test_metaclass_preserved(self, cls): """ Metaclass data is preserved. """ - assert Meta == type(cls) + assert Meta is type(cls) def test_default_decorator(self): """ @@ -380,12 +381,12 @@ class C: def test_hash_by_id(self): """ - With dict classes, hashing by ID is active for hash=False even on - Python 3. This is incorrect behavior but we have to retain it for - backward compatibility. + With dict classes, hashing by ID is active for hash=False. This is + incorrect behavior but we have to retain it for + backwards-compatibility. """ - @attr.s(hash=False) + @attr.s(unsafe_hash=False) class HashByIDBackwardCompat: x = attr.ib() @@ -393,13 +394,13 @@ class HashByIDBackwardCompat: HashByIDBackwardCompat(1) ) - @attr.s(hash=False, eq=False) + @attr.s(unsafe_hash=False, eq=False) class HashByID: x = attr.ib() assert hash(HashByID(1)) != hash(HashByID(1)) - @attr.s(hash=True) + @attr.s(unsafe_hash=True) class HashByValues: x = attr.ib() @@ -422,17 +423,22 @@ class C: class D(C): pass - def test_hash_false_eq_false(self, slots): + def test_unsafe_hash_false_eq_false(self, slots): """ - hash=False and eq=False make a class hashable by ID. + unsafe_hash=False and eq=False make a class hashable by ID. 
""" - @attr.s(hash=False, eq=False, slots=slots) + @attr.s(unsafe_hash=False, eq=False, slots=slots) class C: pass assert hash(C()) != hash(C()) + def test_hash_deprecated(self): + """ + Using the hash argument is deprecated. + """ + def test_eq_false(self, slots): """ eq=False makes a class hashable by ID. @@ -744,3 +750,58 @@ class Hashable: pass assert hash(Hashable()) + + def test_init_subclass(self, slots): + """ + __attrs_init_subclass__ is called on subclasses. + """ + REGISTRY = [] + + @attr.s(slots=slots) + class Base: + @classmethod + def __attrs_init_subclass__(cls): + REGISTRY.append(cls) + + @attr.s(slots=slots) + class ToRegister(Base): + pass + + assert [ToRegister] == REGISTRY + + +@pytest.mark.skipif(not PY_3_13_PLUS, reason="requires Python 3.13+") +class TestReplace: + def test_replaces(self): + """ + copy.replace() is added by default and works like `attrs.evolve`. + """ + inst = C1(1, 2) + + assert C1(1, 42) == copy.replace(inst, y=42) + assert C1(42, 2) == copy.replace(inst, x=42) + + def test_already_has_one(self): + """ + If the object already has a __replace__, it's left alone. + """ + sentinel = object() + + @attr.s + class C: + x = attr.ib() + + __replace__ = sentinel + + assert sentinel == C.__replace__ + + def test_invalid_field_name(self): + """ + Invalid field names raise a TypeError. + + This is consistent with dataclasses. 
+ """ + inst = C1(1, 2) + + with pytest.raises(TypeError): + copy.replace(inst, z=42) diff --git a/tools/third_party/attrs/tests/test_hooks.py b/tools/third_party/attrs/tests/test_hooks.py index 9c37a98cdc0193..930c750afbdfda 100644 --- a/tools/third_party/attrs/tests/test_hooks.py +++ b/tools/third_party/attrs/tests/test_hooks.py @@ -4,6 +4,8 @@ from datetime import datetime +import pytest + import attr @@ -30,7 +32,7 @@ class C: y = attr.ib(type=int) z: float = attr.ib() - assert results == [("x", None), ("y", int), ("z", float)] + assert [("x", None), ("y", int), ("z", float)] == results def test_hook_applied_auto_attrib(self): """ @@ -49,7 +51,7 @@ class C: x: int y: str = attr.ib() - assert results == [("x", int), ("y", str)] + assert [("x", int), ("y", str)] == results def test_hook_applied_modify_attrib(self): """ @@ -66,7 +68,8 @@ class C: y: float c = C(x="3", y="3.14") - assert c == C(x=3, y=3.14) + + assert C(x=3, y=3.14) == c def test_hook_remove_field(self): """ @@ -82,7 +85,7 @@ class C: x: int y: float - assert attr.asdict(C(2.7)) == {"y": 2.7} + assert {"y": 2.7} == attr.asdict(C(2.7)) def test_hook_add_field(self): """ @@ -98,7 +101,7 @@ def hook(cls, attribs): class C: x: int - assert attr.asdict(C(1, 2)) == {"x": 1, "new": 2} + assert {"x": 1, "new": 2} == attr.asdict(C(1, 2)) def test_hook_override_alias(self): """ @@ -118,13 +121,73 @@ class NameCase: 1, 2, 3 ) + def test_hook_reorder_fields(self): + """ + It is possible to reorder fields via the hook. + """ + + def hook(cls, attribs): + return sorted(attribs, key=lambda x: x.metadata["field_order"]) + + @attr.s(field_transformer=hook) + class C: + x: int = attr.ib(metadata={"field_order": 1}) + y: int = attr.ib(metadata={"field_order": 0}) + + assert {"x": 0, "y": 1} == attr.asdict(C(1, 0)) + + def test_hook_reorder_fields_before_order_check(self): + """ + It is possible to reorder fields via the hook before order-based errors are raised. + + Regression test for #1147. 
+ """ + + def hook(cls, attribs): + return sorted(attribs, key=lambda x: x.metadata["field_order"]) + + @attr.s(field_transformer=hook) + class C: + x: int = attr.ib(metadata={"field_order": 1}, default=0) + y: int = attr.ib(metadata={"field_order": 0}) + + assert {"x": 0, "y": 1} == attr.asdict(C(1)) + + def test_hook_conflicting_defaults_after_reorder(self): + """ + Raises `ValueError` if attributes with defaults are followed by + mandatory attributes after the hook reorders fields. + + Regression test for #1147. + """ + + def hook(cls, attribs): + return sorted(attribs, key=lambda x: x.metadata["field_order"]) + + with pytest.raises(ValueError) as e: + + @attr.s(field_transformer=hook) + class C: + x: int = attr.ib(metadata={"field_order": 1}) + y: int = attr.ib(metadata={"field_order": 0}, default=0) + + assert ( + "No mandatory attributes allowed after an attribute with a " + "default value or factory. Attribute in question: Attribute" + "(name='x', default=NOTHING, validator=None, repr=True, " + "eq=True, eq_key=None, order=True, order_key=None, " + "hash=None, init=True, " + "metadata=mappingproxy({'field_order': 1}), type='int', converter=None, " + "kw_only=False, inherited=False, on_setattr=None, alias=None)", + ) == e.value.args + def test_hook_with_inheritance(self): """ The hook receives all fields from base classes. """ def hook(cls, attribs): - assert [a.name for a in attribs] == ["x", "y"] + assert ["x", "y"] == [a.name for a in attribs] # Remove Base' "x" return attribs[1:] @@ -136,7 +199,7 @@ class Base: class Sub(Base): y: int - assert attr.asdict(Sub(2)) == {"y": 2} + assert {"y": 2} == attr.asdict(Sub(2)) def test_attrs_attrclass(self): """ @@ -151,9 +214,25 @@ class C: x: int fields_type = type(attr.fields(C)) - assert fields_type.__name__ == "CAttributes" + assert "CAttributes" == fields_type.__name__ assert issubclass(fields_type, tuple) + def test_hook_generator(self): + """ + field_transfromers can be a generators. 
+ + Regression test for #1416. + """ + + def hook(cls, attribs): + yield from attribs + + @attr.s(auto_attribs=True, field_transformer=hook) + class Base: + x: int + + assert ["x"] == [a.name for a in attr.fields(Base)] + class TestAsDictHook: def test_asdict(self): @@ -187,12 +266,12 @@ class Parent: ) result = attr.asdict(inst, value_serializer=hook) - assert result == { + assert { "a": {"x": 1, "y": ["2020-07-01T00:00:00"]}, "b": [{"x": 2, "y": ["2020-07-02T00:00:00"]}], "c": {"spam": {"x": 3, "y": ["2020-07-03T00:00:00"]}}, "d": {"eggs": "2020-07-04T00:00:00"}, - } + } == result def test_asdict_calls(self): """ @@ -217,7 +296,7 @@ class Parent: inst = Parent(a=Child(1), b=[Child(2)], c={"spam": Child(3)}) attr.asdict(inst, value_serializer=hook) - assert calls == [ + assert [ (inst, "a", inst.a), (inst.a, "x", inst.a.x), (inst, "b", inst.b), @@ -225,4 +304,4 @@ class Parent: (inst, "c", inst.c), (None, None, "spam"), (inst.c["spam"], "x", inst.c["spam"].x), - ] + ] == calls diff --git a/tools/third_party/attrs/tests/test_make.py b/tools/third_party/attrs/tests/test_make.py index 19f7a4cd412c9a..80c00662b51e40 100644 --- a/tools/third_party/attrs/tests/test_make.py +++ b/tools/third_party/attrs/tests/test_make.py @@ -4,13 +4,13 @@ Tests for `attr._make`. 
""" - import copy import functools import gc import inspect import itertools import sys +import unicodedata from operator import attrgetter from typing import Generic, TypeVar @@ -23,7 +23,7 @@ import attr from attr import _config -from attr._compat import PY310 +from attr._compat import PY_3_10_PLUS, PY_3_14_PLUS from attr._make import ( Attribute, Factory, @@ -201,7 +201,7 @@ def test_empty(self): class C: pass - assert _Attributes(((), [], {})) == _transform_attrs( + assert _Attributes((), [], {}) == _transform_attrs( C, None, False, False, True, None ) @@ -536,7 +536,7 @@ class C: ("repr", "__repr__"), ("eq", "__eq__"), ("order", "__le__"), - ("hash", "__hash__"), + ("unsafe_hash", "__hash__"), ("init", "__init__"), ], ) @@ -552,7 +552,7 @@ def test_respects_add_arguments(self, arg_name, method_name): "repr": True, "eq": True, "order": True, - "hash": True, + "unsafe_hash": True, "init": True, } am_args[arg_name] = False @@ -602,11 +602,13 @@ def test_repr_fake_qualname(self, slots_outer, slots_inner): Setting repr_ns overrides a potentially guessed namespace. """ - @attr.s(slots=slots_outer) - class C: - @attr.s(repr_ns="C", slots=slots_inner) - class D: - pass + with pytest.deprecated_call(match="The `repr_ns` argument"): + + @attr.s(slots=slots_outer) + class C: + @attr.s(repr_ns="C", slots=slots_inner) + class D: + pass assert "C.D()" == repr(C.D()) @@ -694,6 +696,25 @@ def __attrs_pre_init__(self2, y): assert 12 == getattr(c, "z", None) + @pytest.mark.usefixtures("with_and_without_validation") + def test_pre_init_kw_only_work_with_defaults(self): + """ + Default values together with kw_only don't break __attrs__pre_init__. 
+ """ + val = None + + @attr.define + class KWOnlyAndDefault: + kw_and_default: int = attr.field(kw_only=True, default=3) + + def __attrs_pre_init__(self, *, kw_and_default): + nonlocal val + val = kw_and_default + + inst = KWOnlyAndDefault() + + assert 3 == val == inst.kw_and_default + @pytest.mark.usefixtures("with_and_without_validation") def test_post_init(self): """ @@ -1080,6 +1101,48 @@ def test_attr_args(self): assert repr(C(1)).startswith(" bool: + if name in ("__module__", "__qualname__"): + return False + return orig_hasattr(obj, name) + + monkeypatch.setitem( + _ClassBuilder.__init__.__globals__["__builtins__"], + "hasattr", + our_hasattr, + ) + b = _ClassBuilder( C, these=None, @@ -1727,7 +1849,6 @@ class C: has_custom_setattr=False, field_transformer=None, ) - b._cls = {} # no __module__; no __qualname__ def fake_meth(self): pass @@ -1735,6 +1856,8 @@ def fake_meth(self): fake_meth.__module__ = "42" fake_meth.__qualname__ = "23" + b._cls = {} # No module and qualname + rv = b._add_method_dunders(fake_meth) assert "42" == rv.__module__ == fake_meth.__module__ @@ -1773,11 +1896,33 @@ class C2(C): assert [C2] == C.__subclasses__() + @pytest.mark.xfail(PY_3_14_PLUS, reason="Currently broken on nightly.") + def test_no_references_to_original_when_using_cached_property(self): + """ + When subclassing a slotted class and using cached property, there are + no stray references to the original class. + """ + + @attr.s(slots=True) + class C: + pass + + @attr.s(slots=True) + class C2(C): + @functools.cached_property + def value(self) -> int: + return 0 + + # The original C2 is in a reference cycle, so force a collect: + gc.collect() + + assert [C2] == C.__subclasses__() + def _get_copy_kwargs(include_slots=True): """ Generate a list of compatible attr.s arguments for the `copy` tests. 
""" - options = ["frozen", "hash", "cache_hash"] + options = ["frozen", "unsafe_hash", "cache_hash"] if include_slots: options.extend(["slots", "weakref_slot"]) @@ -1786,10 +1931,10 @@ def _get_copy_kwargs(include_slots=True): for args in itertools.product([True, False], repeat=len(options)): kwargs = dict(zip(options, args)) - kwargs["hash"] = kwargs["hash"] or None + kwargs["unsafe_hash"] = kwargs["unsafe_hash"] or None if kwargs["cache_hash"] and not ( - kwargs["frozen"] or kwargs["hash"] + kwargs["frozen"] or kwargs["unsafe_hash"] ): continue @@ -2095,7 +2240,6 @@ class TestDocs: "__init__", "__repr__", "__eq__", - "__ne__", "__lt__", "__le__", "__gt__", @@ -2148,7 +2292,7 @@ def test_determine_detects_non_presence_correctly(self, C): def test_make_all_by_default(self, slots, frozen): """ If nothing is there to be detected, imply init=True, repr=True, - hash=None, eq=True, order=True. + unsafe_hash=None, eq=True, order=True. """ @attr.s(auto_detect=True, slots=slots, frozen=frozen) @@ -2201,11 +2345,11 @@ def test_hash_uses_eq(self, slots, frozen): to generate the hash code. """ - @attr.s(slots=slots, frozen=frozen, hash=True) + @attr.s(slots=slots, frozen=frozen, unsafe_hash=True) class C: x = attr.ib(eq=str) - @attr.s(slots=slots, frozen=frozen, hash=True) + @attr.s(slots=slots, frozen=frozen, unsafe_hash=True) class D: x = attr.ib() @@ -2328,10 +2472,10 @@ def __repr__(self): def test_override_hash(self, slots, frozen): """ - If hash=True is passed, ignore __hash__. + If unsafe_hash=True is passed, ignore __hash__. 
""" - @attr.s(hash=True, auto_detect=True, slots=slots, frozen=frozen) + @attr.s(unsafe_hash=True, auto_detect=True, slots=slots, frozen=frozen) class C: x = attr.ib() @@ -2473,7 +2617,7 @@ def __setstate__(self, state): C, "__getstate__", None ) - @pytest.mark.skipif(PY310, reason="Pre-3.10 only.") + @pytest.mark.skipif(PY_3_10_PLUS, reason="Pre-3.10 only.") def test_match_args_pre_310(self): """ __match_args__ is not created on Python versions older than 3.10. @@ -2486,7 +2630,9 @@ class C: assert None is getattr(C, "__match_args__", None) -@pytest.mark.skipif(not PY310, reason="Structural pattern matching is 3.10+") +@pytest.mark.skipif( + not PY_3_10_PLUS, reason="Structural pattern matching is 3.10+" +) class TestMatchArgs: """ Tests for match_args and __match_args__ generation. diff --git a/tools/third_party/attrs/tests/test_mypy.yml b/tools/third_party/attrs/tests/test_mypy.yml index 0d0757233b4cfa..41c5029f33d413 100644 --- a/tools/third_party/attrs/tests/test_mypy.yml +++ b/tools/third_party/attrs/tests/test_mypy.yml @@ -589,7 +589,7 @@ x: Optional[T] @classmethod def clsmeth(cls) -> None: - reveal_type(cls) # N: Revealed type is "Type[main.A[T`1]]" + reveal_type(cls) # N: Revealed type is "type[main.A[T`1]]" - case: testAttrsForwardReference main: | @@ -645,7 +645,7 @@ b: str = attr.ib() @classmethod def new(cls) -> A: - reveal_type(cls) # N: Revealed type is "Type[main.A]" + reveal_type(cls) # N: Revealed type is "type[main.A]" return cls(6, 'hello') @classmethod def bad(cls) -> A: @@ -680,7 +680,7 @@ @classmethod def foo(cls, x: Union[int, str]) -> Union[int, str]: - reveal_type(cls) # N: Revealed type is "Type[main.A]" + reveal_type(cls) # N: Revealed type is "type[main.A]" reveal_type(cls.other()) # N: Revealed type is "builtins.str" return x @@ -767,7 +767,7 @@ return 'hello' - case: testAttrsUsingBadConverter - regex: true + skip: sys.version_info[:2] < (3, 10) main: | import attr from typing import overload @@ -787,14 +787,14 @@ bad_overloaded: int 
= attr.ib(converter=bad_overloaded_converter) reveal_type(A) out: | - main:15: error: Cannot determine __init__ type from converter \[misc\] - main:15: error: Argument "converter" has incompatible type \"Callable\[\[\], str\]\"; expected (\"Callable\[\[Any\], Any\] \| None\"|\"Optional\[Callable\[\[Any\], Any\]\]\") \[arg-type\] - main:16: error: Cannot determine __init__ type from converter \[misc\] - main:16: error: Argument "converter" has incompatible type overloaded function; expected (\"Callable\[\[Any\], Any\] \| None\"|\"Optional\[Callable\[\[Any\], Any\]\]\") \[arg-type\] - main:17: note: Revealed type is "def (bad: Any, bad_overloaded: Any\) -> main.A" + main:15: error: Cannot determine __init__ type from converter [misc] + main:15: error: Argument "converter" has incompatible type "Callable[[], str]"; expected "Callable[[Any], Any] | Converter[Any, Any] | list[Callable[[Any], Any] | Converter[Any, Any]] | tuple[Callable[[Any], Any] | Converter[Any, Any]] | None" [arg-type] + main:16: error: Cannot determine __init__ type from converter [misc] + main:16: error: Argument "converter" has incompatible type overloaded function; expected "Callable[[Any], Any] | Converter[Any, Any] | list[Callable[[Any], Any] | Converter[Any, Any]] | tuple[Callable[[Any], Any] | Converter[Any, Any]] | None" [arg-type] + main:17: note: Revealed type is "def (bad: Any, bad_overloaded: Any) -> main.A" - case: testAttrsUsingBadConverterReprocess - regex: true + skip: sys.version_info[:2] < (3, 10) main: | import attr from typing import overload @@ -815,11 +815,11 @@ bad_overloaded: int = attr.ib(converter=bad_overloaded_converter) reveal_type(A) out: | - main:16: error: Cannot determine __init__ type from converter \[misc\] - main:16: error: Argument \"converter\" has incompatible type \"Callable\[\[\], str\]\"; expected (\"Callable\[\[Any\], Any\] \| None\"|\"Optional\[Callable\[\[Any\], Any\]\]\") \[arg-type\] - main:17: error: Cannot determine __init__ type from converter 
\[misc\] - main:17: error: Argument "converter" has incompatible type overloaded function; expected (\"Callable\[\[Any\], Any\] \| None\"|\"Optional\[Callable\[\[Any\], Any\]\]\") \[arg-type\] - main:18: note: Revealed type is "def (bad: Any, bad_overloaded: Any\) -> main.A" + main:16: error: Cannot determine __init__ type from converter [misc] + main:16: error: Argument "converter" has incompatible type "Callable[[], str]"; expected "Callable[[Any], Any] | Converter[Any, Any] | list[Callable[[Any], Any] | Converter[Any, Any]] | tuple[Callable[[Any], Any] | Converter[Any, Any]] | None" [arg-type] + main:17: error: Cannot determine __init__ type from converter [misc] + main:17: error: Argument "converter" has incompatible type overloaded function; expected "Callable[[Any], Any] | Converter[Any, Any] | list[Callable[[Any], Any] | Converter[Any, Any]] | tuple[Callable[[Any], Any] | Converter[Any, Any]] | None" [arg-type] + main:18: note: Revealed type is "def (bad: Any, bad_overloaded: Any) -> main.A" - case: testAttrsUsingUnsupportedConverter main: | @@ -874,6 +874,66 @@ o = C("1", "2", "3") o = C(1, 2, "3") +- case: testThreeArgConverterTypes + main: | + from typing import Any + from attrs import AttrsInstance, Attribute, Converter + + def my_converter(value: Any) -> str: + """A converter that only takes the value.""" + return str(value) + + def my_converter_with_self(value: Any, self: AttrsInstance) -> str: + """This converter takes the value and the self.""" + return str(value) + + + def my_converter_with_field(value: Any, field: Attribute) -> str: + """This converter takes the value and the field.""" + return str(value) + + reveal_type(Converter(my_converter)) + Converter(my_converter_with_self) + Converter(my_converter_with_field) + + reveal_type(Converter(my_converter_with_self, takes_self=True)) + Converter(my_converter, takes_self=True) + Converter(my_converter_with_field, takes_self=True) + + reveal_type(Converter(my_converter_with_field, takes_field=True)) 
+ Converter(my_converter, takes_field=True) + Converter(my_converter_with_self, takes_field=True) + out: | + main:17: note: Revealed type is "attr.Converter[Any, builtins.str]" + main:18: error: Argument 1 to "Converter" has incompatible type "Callable[[Any, AttrsInstance], str]"; expected "Callable[[Any], str]" [arg-type] + main:19: error: Argument 1 to "Converter" has incompatible type "Callable[[Any, Attribute[Any]], str]"; expected "Callable[[Any], str]" [arg-type] + main:21: note: Revealed type is "attr.Converter[Any, builtins.str]" + main:22: error: No overload variant of "Converter" matches argument types "Callable[[Any], str]", "bool" [call-overload] + main:22: note: Possible overload variants: + main:22: note: def [In, Out] Converter(self, converter: Callable[[In], Out]) -> Converter[In, Out] + main:22: note: def [In, Out] Converter(self, converter: Callable[[In, AttrsInstance, Attribute[Any]], Out], *, takes_self: Literal[True], takes_field: Literal[True]) -> Converter[In, Out] + main:22: note: def [In, Out] Converter(self, converter: Callable[[In, Attribute[Any]], Out], *, takes_field: Literal[True]) -> Converter[In, Out] + main:22: note: def [In, Out] Converter(self, converter: Callable[[In, AttrsInstance], Out], *, takes_self: Literal[True]) -> Converter[In, Out] + main:23: error: No overload variant of "Converter" matches argument types "Callable[[Any, Attribute[Any]], str]", "bool" [call-overload] + main:23: note: Possible overload variants: + main:23: note: def [In, Out] Converter(self, converter: Callable[[In], Out]) -> Converter[In, Out] + main:23: note: def [In, Out] Converter(self, converter: Callable[[In, AttrsInstance, Attribute[Any]], Out], *, takes_self: Literal[True], takes_field: Literal[True]) -> Converter[In, Out] + main:23: note: def [In, Out] Converter(self, converter: Callable[[In, Attribute[Any]], Out], *, takes_field: Literal[True]) -> Converter[In, Out] + main:23: note: def [In, Out] Converter(self, converter: Callable[[In, 
AttrsInstance], Out], *, takes_self: Literal[True]) -> Converter[In, Out] + main:25: note: Revealed type is "attr.Converter[Any, builtins.str]" + main:26: error: No overload variant of "Converter" matches argument types "Callable[[Any], str]", "bool" [call-overload] + main:26: note: Possible overload variants: + main:26: note: def [In, Out] Converter(self, converter: Callable[[In], Out]) -> Converter[In, Out] + main:26: note: def [In, Out] Converter(self, converter: Callable[[In, AttrsInstance, Attribute[Any]], Out], *, takes_self: Literal[True], takes_field: Literal[True]) -> Converter[In, Out] + main:26: note: def [In, Out] Converter(self, converter: Callable[[In, Attribute[Any]], Out], *, takes_field: Literal[True]) -> Converter[In, Out] + main:26: note: def [In, Out] Converter(self, converter: Callable[[In, AttrsInstance], Out], *, takes_self: Literal[True]) -> Converter[In, Out] + main:27: error: No overload variant of "Converter" matches argument types "Callable[[Any, AttrsInstance], str]", "bool" [call-overload] + main:27: note: Possible overload variants: + main:27: note: def [In, Out] Converter(self, converter: Callable[[In], Out]) -> Converter[In, Out] + main:27: note: def [In, Out] Converter(self, converter: Callable[[In, AttrsInstance, Attribute[Any]], Out], *, takes_self: Literal[True], takes_field: Literal[True]) -> Converter[In, Out] + main:27: note: def [In, Out] Converter(self, converter: Callable[[In, Attribute[Any]], Out], *, takes_field: Literal[True]) -> Converter[In, Out] + main:27: note: def [In, Out] Converter(self, converter: Callable[[In, AttrsInstance], Out], *, takes_self: Literal[True]) -> Converter[In, Out] + - case: testAttrsCmpWithSubclasses regex: true main: | @@ -1411,4 +1471,17 @@ reveal_type(A) # N: Revealed type is "def () -> main.A" if has(A): - reveal_type(A) # N: Revealed type is "Type[attr.AttrsInstance]" + reveal_type(A) # N: Revealed type is "type[attr.AttrsInstance]" + +- case: testNothingType + regex: true + main: | + 
from typing import Optional + from attrs import NOTHING, NothingType + + def takes_nothing(arg: Optional[NothingType]) -> None: + return None + + takes_nothing(NOTHING) + takes_nothing(None) + takes_nothing(1) # E: Argument 1 to "takes_nothing" has incompatible type "Literal\[1\]"; expected "(Optional\[Literal\[_Nothing.NOTHING\]\]|Literal\[_Nothing.NOTHING\] \| None)" \[arg-type\] diff --git a/tools/third_party/attrs/tests/test_next_gen.py b/tools/third_party/attrs/tests/test_next_gen.py index 7d053d2143911f..41e534df0ce29a 100644 --- a/tools/third_party/attrs/tests/test_next_gen.py +++ b/tools/third_party/attrs/tests/test_next_gen.py @@ -14,6 +14,8 @@ import attr as _attr # don't use it by accident import attrs +from attr._compat import PY_3_11_PLUS + @attrs.define class C: @@ -36,7 +38,7 @@ def test_field_type(self): A = attrs.make_class("A", classFields) - assert int == attrs.fields(A).testint.type + assert int is attrs.fields(A).testint.type def test_no_slots(self): """ @@ -226,7 +228,7 @@ def test_auto_detect_eq(self): @attrs.define class C: def __eq__(self, o): - raise ValueError() + raise ValueError with pytest.raises(ValueError): C() == C() @@ -316,7 +318,7 @@ class MyException(Exception): with pytest.raises(MyException) as ei: try: - raise ValueError() + raise ValueError except ValueError: raise MyException("foo") from None @@ -332,7 +334,7 @@ class MyException(Exception): attrs.mutable, ], ) - def test_setting_traceback_on_exception(self, decorator): + def test_setting_exception_mutable_attributes(self, decorator): """ contextlib.contextlib (re-)sets __traceback__ on raised exceptions. 
@@ -348,12 +350,22 @@ def do_nothing(): yield with do_nothing(), pytest.raises(MyException) as ei: - raise MyException() + raise MyException assert isinstance(ei.value, MyException) # this should not raise an exception either ei.value.__traceback__ = ei.value.__traceback__ + ei.value.__cause__ = ValueError("cause") + ei.value.__context__ = TypeError("context") + ei.value.__suppress_context__ = True + ei.value.__suppress_context__ = False + ei.value.__notes__ = [] + del ei.value.__notes__ + + if PY_3_11_PLUS: + ei.value.add_note("note") + del ei.value.__notes__ def test_converts_and_validates_by_default(self): """ diff --git a/tools/third_party/attrs/tests/test_packaging.py b/tools/third_party/attrs/tests/test_packaging.py index 046ae4c39dd060..8090b4b81ba242 100644 --- a/tools/third_party/attrs/tests/test_packaging.py +++ b/tools/third_party/attrs/tests/test_packaging.py @@ -1,6 +1,7 @@ # SPDX-License-Identifier: MIT -import sys + +from importlib import metadata import pytest @@ -8,42 +9,12 @@ import attrs -if sys.version_info < (3, 8): - import importlib_metadata as metadata -else: - from importlib import metadata - - @pytest.fixture(name="mod", params=(attr, attrs)) def _mod(request): return request.param class TestLegacyMetadataHack: - def test_title(self, mod): - """ - __title__ returns attrs. - """ - with pytest.deprecated_call() as ws: - assert "attrs" == mod.__title__ - - assert ( - f"Accessing {mod.__name__}.__title__ is deprecated" - in ws.list[0].message.args[0] - ) - - def test_copyright(self, mod): - """ - __copyright__ returns the correct blurp. - """ - with pytest.deprecated_call() as ws: - assert "Copyright (c) 2015 Hynek Schlawack" == mod.__copyright__ - - assert ( - f"Accessing {mod.__name__}.__copyright__ is deprecated" - in ws.list[0].message.args[0] - ) - def test_version(self, mod, recwarn): """ __version__ returns the correct version and doesn't warn. 
@@ -52,67 +23,6 @@ def test_version(self, mod, recwarn): assert [] == recwarn.list - def test_description(self, mod): - """ - __description__ returns the correct description. - """ - with pytest.deprecated_call() as ws: - assert "Classes Without Boilerplate" == mod.__description__ - - assert ( - f"Accessing {mod.__name__}.__description__ is deprecated" - in ws.list[0].message.args[0] - ) - - @pytest.mark.parametrize("name", ["__uri__", "__url__"]) - def test_uri(self, mod, name): - """ - __uri__ & __url__ returns the correct project URL. - """ - with pytest.deprecated_call() as ws: - assert "https://www.attrs.org/" == getattr(mod, name) - - assert ( - f"Accessing {mod.__name__}.{name} is deprecated" - in ws.list[0].message.args[0] - ) - - def test_author(self, mod): - """ - __author__ returns Hynek. - """ - with pytest.deprecated_call() as ws: - assert "Hynek Schlawack" == mod.__author__ - - assert ( - f"Accessing {mod.__name__}.__author__ is deprecated" - in ws.list[0].message.args[0] - ) - - def test_email(self, mod): - """ - __email__ returns Hynek's email address. - """ - with pytest.deprecated_call() as ws: - assert "hs@ox.cx" == mod.__email__ - - assert ( - f"Accessing {mod.__name__}.__email__ is deprecated" - in ws.list[0].message.args[0] - ) - - def test_license(self, mod): - """ - __license__ returns MIT. - """ - with pytest.deprecated_call() as ws: - assert "MIT" == mod.__license__ - - assert ( - f"Accessing {mod.__name__}.__license__ is deprecated" - in ws.list[0].message.args[0] - ) - def test_does_not_exist(self, mod): """ Asking for unsupported dunders raises an AttributeError. @@ -125,7 +35,7 @@ def test_does_not_exist(self, mod): def test_version_info(self, recwarn, mod): """ - ___version_info__ is not deprected, therefore doesn't raise a warning + ___version_info__ is not deprecated, therefore doesn't raise a warning and parses correctly. 
""" assert isinstance(mod.__version_info__, attr.VersionInfo) diff --git a/tools/third_party/attrs/tests/test_pattern_matching.py b/tools/third_party/attrs/tests/test_pattern_matching.py index 3855d6a379c24b..fc8546b3b9118e 100644 --- a/tools/third_party/attrs/tests/test_pattern_matching.py +++ b/tools/third_party/attrs/tests/test_pattern_matching.py @@ -1,6 +1,5 @@ # SPDX-License-Identifier: MIT -# Keep this file SHORT, until Black can handle it. import pytest import attr diff --git a/tools/third_party/attrs/tests/test_pyright.py b/tools/third_party/attrs/tests/test_pyright.py index 800d6099fab0fe..b974114ce46686 100644 --- a/tools/third_party/attrs/tests/test_pyright.py +++ b/tools/third_party/attrs/tests/test_pyright.py @@ -10,8 +10,6 @@ import pytest -import attrs - pytestmark = [ pytest.mark.skipif( @@ -20,21 +18,16 @@ ] -@attrs.frozen -class PyrightDiagnostic: - severity: str - message: str - - -def parse_pyright_output(test_file: Path) -> set[PyrightDiagnostic]: +def parse_pyright_output(test_file: Path) -> set[tuple[str, str]]: pyright = subprocess.run( # noqa: PLW1510 ["pyright", "--outputjson", str(test_file)], capture_output=True ) pyright_result = json.loads(pyright.stdout) + # We use tuples instead of proper classes to get nicer diffs from pytest. 
return { - PyrightDiagnostic(d["severity"], d["message"]) + (d["severity"], d["message"]) for d in pyright_result["generalDiagnostics"] } @@ -49,41 +42,38 @@ def test_pyright_baseline(): diagnostics = parse_pyright_output(test_file) - # Expected diagnostics as per pyright 1.1.311 expected_diagnostics = { - PyrightDiagnostic( - severity="information", - message='Type of "Define.__init__" is' - ' "(self: Define, a: str, b: int) -> None"', + ( + "information", + 'Type of "Define.__init__" is "(self: Define, a: str, b: int) -> None"', ), - PyrightDiagnostic( - severity="information", - message='Type of "DefineConverter.__init__" is ' + ( + "information", + 'Type of "DefineConverter.__init__" is ' '"(self: DefineConverter, with_converter: str | Buffer | ' 'SupportsInt | SupportsIndex | SupportsTrunc) -> None"', ), - PyrightDiagnostic( - severity="error", - message='Cannot assign member "a" for type ' - '"Frozen"\n\xa0\xa0"Frozen" is frozen\n\xa0\xa0\xa0\xa0Member "__set__" is unknown', + ( + "error", + 'Cannot assign to attribute "a" for class ' + '"Frozen"\n\xa0\xa0Attribute "a" is read-only', ), - PyrightDiagnostic( - severity="information", - message='Type of "d.a" is "Literal[\'new\']"', + ( + "information", + 'Type of "d.a" is "Literal[\'new\']"', ), - PyrightDiagnostic( - severity="error", - message='Cannot assign member "a" for type ' - '"FrozenDefine"\n\xa0\xa0"FrozenDefine" is frozen\n\xa0\xa0\xa0\xa0' - 'Member "__set__" is unknown', + ( + "error", + 'Cannot assign to attribute "a" for class ' + '"FrozenDefine"\n\xa0\xa0Attribute "a" is read-only', ), - PyrightDiagnostic( - severity="information", - message='Type of "d2.a" is "Literal[\'new\']"', + ( + "information", + 'Type of "d2.a" is "Literal[\'new\']"', ), - PyrightDiagnostic( - severity="information", - message='Type of "af.__init__" is "(_a: int) -> None"', + ( + "information", + 'Type of "af.__init__" is "(_a: int) -> None"', ), } @@ -110,9 +100,9 @@ def test_pyright_attrsinstance_compat(tmp_path): 
diagnostics = parse_pyright_output(test_pyright_attrsinstance_compat_path) expected_diagnostics = { - PyrightDiagnostic( - severity="information", - message='Type of "attrs.AttrsInstance" is "type[AttrsInstance]"', - ), + ( + "information", + 'Type of "attrs.AttrsInstance" is "type[AttrsInstance]"', + ) } assert diagnostics == expected_diagnostics diff --git a/tools/third_party/attrs/tests/test_setattr.py b/tools/third_party/attrs/tests/test_setattr.py index c7b90daee68b91..9b7c5e170b5946 100644 --- a/tools/third_party/attrs/tests/test_setattr.py +++ b/tools/third_party/attrs/tests/test_setattr.py @@ -109,16 +109,34 @@ def test_pipe(self): used. They can be supplied using the pipe functions or by passing a list to on_setattr. """ + taken = None + + def takes_all(val, instance, attrib): + nonlocal taken + taken = val, instance, attrib + + return val s = [setters.convert, lambda _, __, nv: nv + 1] @attr.s class Piped: - x1 = attr.ib(converter=int, on_setattr=setters.pipe(*s)) + x1 = attr.ib( + converter=[ + attr.Converter( + takes_all, takes_field=True, takes_self=True + ), + int, + ], + on_setattr=setters.pipe(*s), + ) x2 = attr.ib(converter=int, on_setattr=s) p = Piped("41", "22") + assert ("41", p) == taken[:-1] + assert "x1" == taken[-1].name + assert 41 == p.x1 assert 22 == p.x2 @@ -417,3 +435,19 @@ def test_docstring(self): "Method generated by attrs for class WithOnSetAttrHook." == WithOnSetAttrHook.__setattr__.__doc__ ) + + def test_setattr_converter_piped(self): + """ + If a converter is used, it is piped through the on_setattr hooks. 
+ + Regression test for https://github.com/python-attrs/attrs/issues/1327 + """ + + @attr.define # converter on setattr is implied in NG + class C: + x = attr.field(converter=[int]) + + c = C("1") + c.x = "2" + + assert 2 == c.x diff --git a/tools/third_party/attrs/tests/test_slots.py b/tools/third_party/attrs/tests/test_slots.py index 26365ab0d2bcd8..9af18e5ee871fc 100644 --- a/tools/third_party/attrs/tests/test_slots.py +++ b/tools/third_party/attrs/tests/test_slots.py @@ -3,6 +3,7 @@ """ Unit tests for slots-related functionality. """ + import functools import pickle import weakref @@ -14,7 +15,7 @@ import attr import attrs -from attr._compat import PY_3_8_PLUS, PYPY +from attr._compat import PY_3_14_PLUS, PYPY # Pympler doesn't work on PyPy. @@ -50,7 +51,7 @@ def my_super(self): return super().__repr__() -@attr.s(slots=True, hash=True) +@attr.s(slots=True, unsafe_hash=True) class C1Slots: x = attr.ib(validator=attr.validators.instance_of(int)) y = attr.ib() @@ -133,7 +134,7 @@ def test_inheritance_from_nonslots(): the benefits of slotted classes, but it should still work. """ - @attr.s(slots=True, hash=True) + @attr.s(slots=True, unsafe_hash=True) class C2Slots(C1): z = attr.ib() @@ -196,7 +197,7 @@ def staticmethod(): these={"x": attr.ib(), "y": attr.ib(), "z": attr.ib()}, init=False, slots=True, - hash=True, + unsafe_hash=True, )(SimpleOrdinaryClass) c2 = C2Slots(x=1, y=2, z="test") @@ -229,11 +230,11 @@ def test_inheritance_from_slots(): Inheriting from an attrs slotted class works. 
""" - @attr.s(slots=True, hash=True) + @attr.s(slots=True, unsafe_hash=True) class C2Slots(C1Slots): z = attr.ib() - @attr.s(slots=True, hash=True) + @attr.s(slots=True, unsafe_hash=True) class C2(C1): z = attr.ib() @@ -275,13 +276,13 @@ def test_inheritance_from_slots_with_attribute_override(): class HasXSlot: __slots__ = ("x",) - @attr.s(slots=True, hash=True) + @attr.s(slots=True, unsafe_hash=True) class C2Slots(C1Slots): # y re-defined here but it shouldn't get a slot y = attr.ib() z = attr.ib() - @attr.s(slots=True, hash=True) + @attr.s(slots=True, unsafe_hash=True) class NonAttrsChild(HasXSlot): # Parent class has slot for "x" already, so we skip it x = attr.ib() @@ -337,7 +338,12 @@ def test_bare_inheritance_from_slots(): """ @attr.s( - init=False, eq=False, order=False, hash=False, repr=False, slots=True + init=False, + eq=False, + order=False, + unsafe_hash=False, + repr=False, + slots=True, ) class C1BareSlots: x = attr.ib(validator=attr.validators.instance_of(int)) @@ -354,7 +360,7 @@ def classmethod(cls): def staticmethod(): return "staticmethod" - @attr.s(init=False, eq=False, order=False, hash=False, repr=False) + @attr.s(init=False, eq=False, order=False, unsafe_hash=False, repr=False) class C1Bare: x = attr.ib(validator=attr.validators.instance_of(int)) y = attr.ib() @@ -370,11 +376,11 @@ def classmethod(cls): def staticmethod(): return "staticmethod" - @attr.s(slots=True, hash=True) + @attr.s(slots=True, unsafe_hash=True) class C2Slots(C1BareSlots): z = attr.ib() - @attr.s(slots=True, hash=True) + @attr.s(slots=True, unsafe_hash=True) class C2(C1Bare): z = attr.ib() @@ -716,7 +722,6 @@ def f(self): assert B(17).f == 289 -@pytest.mark.skipif(not PY_3_8_PLUS, reason="cached_property is 3.8+") def test_slots_cached_property_allows_call(): """ cached_property in slotted class allows call. 
@@ -733,7 +738,6 @@ def f(self): assert A(11).f == 11 -@pytest.mark.skipif(not PY_3_8_PLUS, reason="cached_property is 3.8+") def test_slots_cached_property_class_does_not_have__dict__(): """ slotted class with cached property has no __dict__ attribute. @@ -751,7 +755,6 @@ def f(self): assert "__dict__" not in dir(A) -@pytest.mark.skipif(not PY_3_8_PLUS, reason="cached_property is 3.8+") def test_slots_cached_property_works_on_frozen_isntances(): """ Infers type of cached property. @@ -768,7 +771,9 @@ def f(self) -> int: assert A(x=1).f == 1 -@pytest.mark.skipif(not PY_3_8_PLUS, reason="cached_property is 3.8+") +@pytest.mark.xfail( + PY_3_14_PLUS, reason="3.14 returns weird annotation for cached_properties" +) def test_slots_cached_property_infers_type(): """ Infers type of cached property. @@ -785,7 +790,6 @@ def f(self) -> int: assert A.__annotations__ == {"x": int, "f": int} -@pytest.mark.skipif(not PY_3_8_PLUS, reason="cached_property is 3.8+") def test_slots_cached_property_with_empty_getattr_raises_attribute_error_of_requested(): """ Ensures error information is not lost. @@ -806,7 +810,44 @@ def f(self): a.z -@pytest.mark.skipif(not PY_3_8_PLUS, reason="cached_property is 3.8+") +def test_slots_cached_property_raising_attributeerror(): + """ + Ensures AttributeError raised by a property is preserved by __getattr__() + implementation.
+ + Regression test for issue https://github.com/python-attrs/attrs/issues/1230 + """ + + @attr.s(slots=True) + class A: + x = attr.ib() + + @functools.cached_property + def f(self): + return self.p + + @property + def p(self): + raise AttributeError("I am a property") + + @functools.cached_property + def g(self): + return self.q + + @property + def q(self): + return 2 + + a = A(1) + with pytest.raises(AttributeError, match=r"^I am a property$"): + a.p + with pytest.raises(AttributeError, match=r"^I am a property$"): + a.f + + assert a.g == 2 + assert a.q == 2 + + def test_slots_cached_property_with_getattr_calls_getattr_for_missing_attributes(): """ Ensure __getattr__ implementation is maintained for non cached_properties. @@ -828,7 +869,6 @@ def __getattr__(self, item): assert a.z == "z" -@pytest.mark.skipif(not PY_3_8_PLUS, reason="cached_property is 3.8+") def test_slots_getattr_in_superclass__is_called_for_missing_attributes_when_cached_property_present(): """ Ensure __getattr__ implementation is maintained in subclass. @@ -852,7 +892,6 @@ def f(self): assert b.z == "z" -@pytest.mark.skipif(not PY_3_8_PLUS, reason="cached_property is 3.8+") def test_slots_getattr_in_subclass_gets_superclass_cached_property(): """ Ensure super() in __getattr__ is not broken through cached_property re-write. @@ -883,7 +922,6 @@ def __getattr__(self, item): assert b.z == "z" -@pytest.mark.skipif(not PY_3_8_PLUS, reason="cached_property is 3.8+") def test_slots_sub_class_with_independent_cached_properties_both_work(): """ Subclassing shouldn't break cached properties. @@ -907,7 +945,6 @@ def g(self): assert B(1).g == 2 -@pytest.mark.skipif(not PY_3_8_PLUS, reason="cached_property is 3.8+") def test_slots_with_multiple_cached_property_subclasses_works(): """ Multiple sub-classes shouldn't break cached properties. 
@@ -943,7 +980,24 @@ class AB(A, B): assert ab.h == "h" -@pytest.mark.skipif(not PY_3_8_PLUS, reason="cached_property is 3.8+") +def test_slotted_cached_property_can_access_super(): + """ + A cached property in a slotted subclass can access super(). + """ + + @attr.s(slots=True) + class A: + x = attr.ib(kw_only=True) + + @attr.s(slots=True) + class B(A): + @functools.cached_property + def f(self): + return super().x * 2 + + assert B(x=1).f == 2 + + def test_slots_sub_class_avoids_duplicated_slots(): """ Duplicating the slots is a waste of memory. @@ -967,7 +1021,6 @@ def f(self): assert B.__slots__ == () -@pytest.mark.skipif(not PY_3_8_PLUS, reason="cached_property is 3.8+") def test_slots_sub_class_with_actual_slot(): """ A sub-class can have an explicit attrs field that replaces a cached property. @@ -989,7 +1042,6 @@ class B(A): assert B.__slots__ == () -@pytest.mark.skipif(not PY_3_8_PLUS, reason="cached_property is 3.8+") def test_slots_cached_property_is_not_called_at_construction(): """ A cached property function should only be called at property access point. @@ -1010,7 +1062,6 @@ def f(self): assert call_count == 0 -@pytest.mark.skipif(not PY_3_8_PLUS, reason="cached_property is 3.8+") def test_slots_cached_property_repeat_call_only_once(): """ A cached property function should be called only once, on repeated attribute access. @@ -1033,7 +1084,6 @@ def f(self): assert call_count == 1 -@pytest.mark.skipif(not PY_3_8_PLUS, reason="cached_property is 3.8+") def test_slots_cached_property_called_independent_across_instances(): """ A cached property value should be specific to the given instance. @@ -1054,7 +1104,6 @@ def f(self): assert obj_2.f == 2 -@pytest.mark.skipif(not PY_3_8_PLUS, reason="cached_property is 3.8+") def test_slots_cached_properties_work_independently(): """ Multiple cached properties should work independently.
diff --git a/tools/third_party/attrs/tests/test_validators.py b/tools/third_party/attrs/tests/test_validators.py index 4327f825188d58..45e7050d36ff2f 100644 --- a/tools/third_party/attrs/tests/test_validators.py +++ b/tools/third_party/attrs/tests/test_validators.py @@ -4,7 +4,6 @@ Tests for `attr.validators`. """ - import re import pytest @@ -30,25 +29,12 @@ min_len, not_, optional, - provides, + or_, ) from .utils import simple_attr -@pytest.fixture(scope="module") -def zope_interface(): - """Provides ``zope.interface`` if available, skipping the test if not.""" - try: - import zope.interface - except ImportError: - raise pytest.skip( - "zope-related tests skipped when zope.interface is not installed" - ) - - return zope.interface - - class TestDisableValidators: @pytest.fixture(autouse=True) def _reset_default(self): @@ -314,79 +300,6 @@ class C: assert C.__attrs_attrs__[0].validator == C.__attrs_attrs__[1].validator -@pytest.fixture(scope="module") -def ifoo(zope_interface): - """Provides a test ``zope.interface.Interface`` in ``zope`` tests.""" - - class IFoo(zope_interface.Interface): - """ - An interface. - """ - - def f(): - """ - A function called f. - """ - - return IFoo - - -class TestProvides: - """ - Tests for `provides`. - """ - - def test_in_all(self): - """ - Verify that this validator is in ``__all__``. - """ - assert provides.__name__ in validator_module.__all__ - - def test_success(self, zope_interface, ifoo): - """ - Nothing happens if value provides requested interface. - """ - - @zope_interface.implementer(ifoo) - class C: - def f(self): - pass - - with pytest.deprecated_call(): - v = provides(ifoo) - - v(None, simple_attr("x"), C()) - - def test_fail(self, ifoo): - """ - Raises `TypeError` if interfaces isn't provided by value. 
- """ - value = object() - a = simple_attr("x") - - with pytest.deprecated_call(): - v = provides(ifoo) - - with pytest.raises(TypeError) as e: - v(None, a, value) - - assert ( - f"'x' must provide {ifoo!r} which {value!r} doesn't.", - a, - ifoo, - value, - ) == e.value.args - - def test_repr(self, ifoo): - """ - Returned validator has a useful `__repr__`. - """ - with pytest.deprecated_call(): - v = provides(ifoo) - - assert (f"") == repr(v) - - @pytest.mark.parametrize( "validator", [ @@ -443,14 +356,14 @@ def test_repr(self, validator): if isinstance(validator, list): repr_s = ( - ">]) or None>" - ).format(func=repr(always_pass)) + ) elif isinstance(validator, tuple): repr_s = ( - ">)) or None>" - ).format(func=repr(always_pass)) + ) else: repr_s = ( "") == repr(v) + def test_is_hashable(self): + """ + `in_` is hashable, so fields using it can be used with the include and + exclude filters. + """ + + @attr.s + class C: + x: int = attr.ib(validator=attr.validators.in_({1, 2})) + + i = C(2) + + attr.asdict(i, filter=attr.filters.include(lambda val: True)) + attr.asdict(i, filter=attr.filters.exclude(lambda val: True)) + @pytest.fixture( name="member_validator", @@ -1145,8 +1074,7 @@ def test_repr(self): v = not_(wrapped) assert ( - f"" + f"" ) == repr(v) def test_success_because_fails(self): @@ -1180,8 +1108,8 @@ def always_passes(inst, attr, value): assert ( ( - "not_ validator child '{!r}' did not raise a captured error" - ).format(always_passes), + f"not_ validator child '{always_passes!r}' did not raise a captured error" + ), a, always_passes, input_value, @@ -1276,8 +1204,8 @@ def test_composable_with_instance_of_fail(self): assert ( ( - "not_ validator child '{!r}' did not raise a captured error" - ).format(instance_of((int, float))), + f"not_ validator child '{instance_of((int, float))!r}' did not raise a captured error" + ), a, wrapped, input_value, @@ -1348,3 +1276,38 @@ def test_bad_exception_args(self): "'exc_types' must be a subclass of " "(got )." 
) == e.value.args[0] + + +class TestOr: + def test_in_all(self): + """ + Verify that this validator is in ``__all__``. + """ + assert or_.__name__ in validator_module.__all__ + + def test_success(self): + """ + Succeeds if at least one of the wrapped validators succeeds. + """ + v = or_(instance_of(str), always_pass) + + v(None, simple_attr("test"), 42) + + def test_fail(self): + """ + Fails if all wrapped validators fail. + """ + v = or_(instance_of(str), always_fail) + + with pytest.raises(ValueError): + v(None, simple_attr("test"), 42) + + def test_repr(self): + """ + Returned validator has a useful `__repr__`. + """ + v = or_(instance_of(int), instance_of(str)) + assert ( + "<or validator wrapping (<instance_of validator for type " + "<class 'int'>>, <instance_of validator for type <class 'str'>>)>" + ) == repr(v) diff --git a/tools/third_party/attrs/tests/typing_example.py b/tools/third_party/attrs/tests/typing_example.py index 2124912c8d58e5..82a5c253b671cb 100644 --- a/tools/third_party/attrs/tests/typing_example.py +++ b/tools/third_party/attrs/tests/typing_example.py @@ -133,40 +133,52 @@ class AliasExample: attr.fields(AliasExample).without_alias.alias attr.fields(AliasExample)._with_alias.alias + # Converters -# XXX: Currently converters can only be functions so none of this works -# although the stubs should be correct.
-# @attr.s -# class ConvCOptional: -# x: Optional[int] = attr.ib(converter=attr.converters.optional(int)) + +@attr.s +class ConvCOptional: + x: int | None = attr.ib(converter=attr.converters.optional(int)) -# ConvCOptional(1) -# ConvCOptional(None) +ConvCOptional(1) +ConvCOptional(None) +# XXX: Fails with E: Unsupported converter, only named functions, types and lambdas are currently supported [misc] +# See https://github.com/python/mypy/issues/15736 +# +# @attr.s +# class ConvCPipe: +# x: str = attr.ib(converter=attr.converters.pipe(int, str)) +# +# +# ConvCPipe(3.4) +# ConvCPipe("09") +# +# # @attr.s # class ConvCDefaultIfNone: # x: int = attr.ib(converter=attr.converters.default_if_none(42)) - - +# +# # ConvCDefaultIfNone(1) # ConvCDefaultIfNone(None) -# @attr.s -# class ConvCToBool: -# x: int = attr.ib(converter=attr.converters.to_bool) +@attr.s +class ConvCToBool: + x: int = attr.ib(converter=attr.converters.to_bool) -# ConvCToBool(1) -# ConvCToBool(True) -# ConvCToBool("on") -# ConvCToBool("yes") -# ConvCToBool(0) -# ConvCToBool(False) -# ConvCToBool("n") +ConvCToBool(1) +ConvCToBool(True) +ConvCToBool("on") +ConvCToBool("yes") +ConvCToBool(0) +ConvCToBool(False) +ConvCToBool("n") # Validators @@ -216,6 +228,9 @@ class Validated: k: int | str | C = attr.ib( validator=attrs.validators.instance_of((int, C, str)) ) + kk: int | str | C = attr.ib( + validator=attrs.validators.instance_of(int | C | str) + ) l: Any = attr.ib( validator=attr.validators.not_(attr.validators.in_("abc")) @@ -254,7 +269,7 @@ class Validated2: @attrs.define class Validated3: - num: int = attr.field(validator=attr.validators.ge(0)) + num: int = attrs.field(validator=attrs.validators.ge(0)) with attr.validators.disabled(): diff --git a/tools/third_party/attrs/tests/utils.py b/tools/third_party/attrs/tests/utils.py index 9e678f05f17c62..eefcbd242501ab 100644 --- a/tools/third_party/attrs/tests/utils.py +++ b/tools/third_party/attrs/tests/utils.py @@ -4,7 +4,6 @@ Common helper functions for 
tests. """ - from attr import Attribute from attr._make import NOTHING, _default_init_alias_for, make_class @@ -13,7 +12,7 @@ def simple_class( eq=False, order=False, repr=False, - hash=False, + unsafe_hash=False, str=False, slots=False, frozen=False, @@ -28,7 +27,7 @@ def simple_class( eq=eq or order, order=order, repr=repr, - hash=hash, + unsafe_hash=unsafe_hash, init=True, slots=slots, str=str, diff --git a/tools/third_party/attrs/tox.ini b/tools/third_party/attrs/tox.ini index 54724faaafbf6c..585c3c7c9e0cf3 100644 --- a/tools/third_party/attrs/tox.ini +++ b/tools/third_party/attrs/tox.ini @@ -2,25 +2,22 @@ min_version = 4 env_list = pre-commit, - py3{7,8,9,10,11,12}-tests, - py3{8,9,10,11,12}-mypy, - pypy3, + py3{8,9,10,11,12,13}-tests, + py3{10,11,12,13}-mypy, + pypy3-tests, pyright, - docs, + docs-{sponsors,doctests}, changelog, coverage-report -[testenv:.pkg] +[pkgenv] pass_env = SETUPTOOLS_SCM_PRETEND_VERSION [testenv] package = wheel wheel_build_env = .pkg -pass_env = - FORCE_COLOR - NO_COLOR extras = tests: tests mypy: tests-mypy @@ -29,22 +26,26 @@ commands = mypy: mypy tests/typing_example.py mypy: mypy src/attrs/__init__.pyi src/attr/__init__.pyi src/attr/_typing_compat.pyi src/attr/_version_info.pyi src/attr/converters.pyi src/attr/exceptions.pyi src/attr/filters.pyi src/attr/setters.pyi src/attr/validators.pyi -[testenv:py3{7,10,12}-tests] +[testenv:pypy3-tests] +extras = tests +commands = pytest tests/test_functional.py + +[testenv:py3{8,10,13}-tests] extras = cov # Python 3.6+ has a number of compile-time warnings on invalid string escapes. -# PYTHONWARNINGS=d and --no-compile below make them visible during the tox run. +# PYTHONWARNINGS=d makes them visible during the tox run. 
set_env = COVERAGE_PROCESS_START={toxinidir}/pyproject.toml PYTHONWARNINGS=d -install_command = python -Im pip install --no-compile {opts} {packages} commands_pre = python -c 'import pathlib; pathlib.Path("{env_site_packages_dir}/cov.pth").write_text("import coverage; coverage.process_startup()")' -commands = coverage run -m pytest {posargs:-n auto} - +# We group xdist execution by file because otherwise the Mypy tests have race conditions. +commands = coverage run -m pytest {posargs:-n auto --dist loadfile} [testenv:coverage-report] # Keep base_python in-sync with .python-version-default -base_python = py312 -depends = py3{7,10,11} +base_python = py313 +# Keep depends in-sync with testenv above that has cov extra. +depends = py3{8,10,13}-tests skip_install = true deps = coverage[toml]>=5.3 commands = @@ -52,19 +53,30 @@ commands = coverage report -[testenv:docs] -# Keep base_python in-sync with ci.yml/docs and .readthedocs.yaml. -base_python = py312 +[testenv:codspeed] +extras = benchmark +pass_env = + CODSPEED_TOKEN + CODSPEED_ENV + ARCH + PYTHONHASHSEED + PYTHONMALLOC +commands = pytest --codspeed -n auto bench/test_benchmarks.py + + +[testenv:docs-{build,doctests,linkcheck}] +# Keep base_python in sync with .readthedocs.yaml. 
+base_python = py313 extras = docs commands = - sphinx-build -n -T -W -b html -d {envtmpdir}/doctrees docs docs/_build/html - sphinx-build -n -T -W -b doctest -d {envtmpdir}/doctrees docs docs/_build/html - + build: sphinx-build -n -T -W -b html -d {envtmpdir}/doctrees docs {posargs:docs/_build/}html + doctests: sphinx-build -n -T -W -b doctest -d {envtmpdir}/doctrees docs {posargs:docs/_build/}html + linkcheck: sphinx-build -W -b linkcheck -d {envtmpdir}/doctrees docs docs/_build/html [testenv:docs-watch] package = editable -base_python = {[testenv:docs]base_python} -extras = {[testenv:docs]extras} +base_python = {[testenv:docs-build]base_python} +extras = {[testenv:docs-build]extras} deps = watchfiles commands = watchfiles \ @@ -73,29 +85,31 @@ commands = src \ docs - -[testenv:docs-linkcheck] -package = editable -base_python = {[testenv:docs]base_python} -extras = {[testenv:docs]extras} -commands = sphinx-build -W -b linkcheck -d {envtmpdir}/doctrees docs docs/_build/html +[testenv:docs-sponsors] +description = Ensure sponsor logos are up to date. 
+deps = cogapp +commands = cog -rP README.md docs/index.md [testenv:pre-commit] skip_install = true -deps = pre-commit +deps = pre-commit-uv commands = pre-commit run --all-files [testenv:changelog] +# See https://github.com/sphinx-contrib/sphinxcontrib-towncrier/issues/92 +# Pin also present in pyproject.toml deps = towncrier skip_install = true -commands = towncrier build --version main --draft +commands = + towncrier --version + towncrier build --version main --draft [testenv:pyright] extras = tests -deps = pyright +deps = pyright>=1.1.380 commands = pytest tests/test_pyright.py -vv @@ -109,6 +123,5 @@ allowlist_externals = commands = rm -rf attrs.docset attrs.tgz docs/_build sphinx-build -n -T -W -b html -d {envtmpdir}/doctrees docs docs/_build/html - doc2dash --index-page index.html --icon docs/_static/docset-icon.png --online-redirect-url https://www.attrs.org/en/latest/ docs/_build/html - cp docs/_static/docset-icon@2x.png attrs.docset/icon@2x.png + doc2dash --index-page index.html --icon docs/_static/docset-icon.png --icon-2x docs/_static/docset-icon@2x.png --online-redirect-url https://www.attrs.org/en/latest/ docs/_build/html tar --exclude='.DS_Store' -cvzf attrs.tgz attrs.docset