diff --git a/packages/pynumaflow/Makefile b/packages/pynumaflow/Makefile index 29a454be..1c02a581 100644 --- a/packages/pynumaflow/Makefile +++ b/packages/pynumaflow/Makefile @@ -34,3 +34,42 @@ proto: poetry run python3 -m grpc_tools.protoc -Ipynumaflow/proto/sideinput=pynumaflow/proto/sideinput -Ipynumaflow/proto/common=pynumaflow/proto/common --pyi_out=. --python_out=. --grpc_python_out=. pynumaflow/proto/sideinput/*.proto poetry run python3 -m grpc_tools.protoc -Ipynumaflow/proto/sourcer=pynumaflow/proto/sourcer -Ipynumaflow/proto/common=pynumaflow/proto/common --pyi_out=. --python_out=. --grpc_python_out=. pynumaflow/proto/sourcer/*.proto poetry run python3 -m grpc_tools.protoc -Ipynumaflow/proto/accumulator=pynumaflow/proto/accumulator -Ipynumaflow/proto/common=pynumaflow/proto/common --pyi_out=. --python_out=. --grpc_python_out=. pynumaflow/proto/accumulator/*.proto + + +# ============================================================================ +# Documentation targets +# ============================================================================ + +.PHONY: docs docs-serve docs-build docs-deploy-dev docs-deploy-version docs-list docs-set-default docs-delete docs-setup + +docs: docs-serve ## Alias for docs-serve + +docs-serve: ## Serve documentation locally with hot-reload (http://localhost:8000) + poetry run mkdocs serve + +docs-build: ## Build documentation locally + poetry run mkdocs build + +docs-deploy-dev: ## Deploy dev docs to docs-site branch + poetry run mike deploy -b docs-site dev --push + +docs-deploy-version: ## Deploy versioned docs (usage: make docs-deploy-version VERSION=0.11) +ifndef VERSION + $(error VERSION is required. 
Usage: make docs-deploy-version VERSION=0.11) +endif + poetry run mike deploy -b docs-site $(VERSION) latest --update-aliases --push + +docs-list: ## List all deployed documentation versions + poetry run mike list -b docs-site + +docs-set-default: ## Set the default documentation version to 'latest' + poetry run mike set-default -b docs-site latest --push + +docs-delete: ## Delete a documentation version (usage: make docs-delete VERSION=0.10) +ifndef VERSION + $(error VERSION is required. Usage: make docs-delete VERSION=0.10) +endif + poetry run mike delete -b docs-site $(VERSION) --push + +docs-setup: ## Install documentation dependencies + poetry install --with docs --no-root \ No newline at end of file diff --git a/packages/pynumaflow/README.md b/packages/pynumaflow/README.md index 6a25f150..0a3f9626 100644 --- a/packages/pynumaflow/README.md +++ b/packages/pynumaflow/README.md @@ -23,7 +23,7 @@ To build the package locally, run the following command from the root of the pro ```bash make setup -```` +``` To run unit tests: ```bash @@ -57,6 +57,7 @@ There are different types of gRPC server mechanisms which can be used to serve t These have different functionalities and are used for different use cases. Currently we support the following server types: + - Sync Server - Asynchronous Server - MultiProcessing Server @@ -64,15 +65,13 @@ Currently we support the following server types: Not all of the above are supported for all UDFs, UDSource and UDSinks. For each of the UDFs, UDSource and UDSinks, there are separate classes for each of the server types. -This helps in keeping the interface simple and easy to use, and the user can start the specific server type based -on the use case. +This helps in keeping the interface simple and easy to use, and the user can start the specific server type based on the use case. #### SyncServer Synchronous Server is the simplest server type. It is a multithreaded server which can be used for simple UDFs and UDSinks. 
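To make the synchronous contract concrete, here is a rough sketch of the kind of handler a sync server drives. The `Datum` and `Message` classes below are illustrative stand-ins defined inline; in real code the equivalent types come from `pynumaflow.mapper`.

```python
from dataclasses import dataclass, field

# Illustrative stand-ins for the mapper's Datum/Message types (assumption:
# real code imports the actual types from pynumaflow.mapper instead).
@dataclass
class Datum:
    value: bytes

@dataclass
class Message:
    value: bytes
    keys: list = field(default_factory=list)

def handler(keys: list, datum: Datum) -> list:
    # A sync server invokes this once per message and waits for the
    # return value before dispatching the next message.
    text = datum.value.decode().upper()
    return [Message(value=text.encode(), keys=keys)]

result = handler(["user-1"], Datum(value=b"hello"))
```

Because the call is blocking, handler latency directly bounds per-server throughput, which is why the async and multiprocess variants below exist.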
-Here the server will invoke the handler function for each message. The messaging is synchronous and the server will wait -for the handler to return before processing the next message. +Here the server will invoke the handler function for each message. The messaging is synchronous and the server will wait for the handler to return before processing the next message. ``` grpc_server = MapServer(handler) ``` @@ -83,13 +82,13 @@ grpc_server = MapServer(handler) Asynchronous Server is a multithreaded server which can be used for UDFs which are asynchronous. Here we utilize the asynchronous capabilities of Python to process multiple messages in parallel. The server will invoke the handler function for each message. The messaging is asynchronous and the server will not wait for the handler to return before processing the next message. Thus this server type is useful for UDFs which are asynchronous. The handler function for such a server should be an async function. -``` +```py grpc_server = MapAsyncServer(handler) ``` #### MultiProcessServer -MultiProcess Server is a multi process server which can be used for UDFs which are CPU intensive. Here we utilize the multi process capabilities of Python to process multiple messages in parallel by forking multiple servers in different processes. +MultiProcess Server is a multi-process server which can be used for CPU-intensive UDFs. Here we utilize the multiprocessing capabilities of Python to process multiple messages in parallel by forking multiple servers in different processes. The server will invoke the handler function for each message. Individually at the server level the messaging is synchronous and the server will wait for the handler to return before processing the next message. But since we have multiple servers running in parallel, the overall messaging also executes in parallel. 
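The "synchronous workers, parallel overall" idea can be sketched with the standard library. This uses a thread pool purely for brevity; the actual MultiProcess server forks separate OS processes rather than threads, and the `handler`/`serve_parallel` names are illustrative, not pynumaflow API.

```python
from concurrent.futures import ThreadPoolExecutor

def handler(msg: bytes) -> bytes:
    # Stand-in for a synchronous per-message UDF handler.
    return msg.upper()

def serve_parallel(messages: list, workers: int = 2) -> list:
    # Each worker runs the handler synchronously, waiting for one message
    # to finish before taking the next; with several workers side by side,
    # overall processing still happens in parallel.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(handler, messages))

results = serve_parallel([b"a", b"b", b"c"])
```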
This could be an alternative to creating multiple replicas of the same UDF container, as it uses the multiprocessing capabilities of the system to process multiple messages in parallel within the same container. @@ -140,7 +139,8 @@ should follow the same signature. For using the class-based handlers, the user can inherit from the base handler class for each of the functionalities and implement the handler function. The base handler class for each of the functionalities has the same signature as the handler function for the respective server type. -The list of base handler classes for each of the functionalities is given below - +The list of base handler classes for each of the functionalities is given below: + - UDFs - Map - Mapper @@ -159,5 +159,5 @@ The list of base handler classes for each of the functionalities is given below - SideInput - SideInput -More details about the signature of the handler function for each of the server types is given in the +More details about the signature of the handler function for each of the server types are given in the documentation of the respective server type. diff --git a/packages/pynumaflow/docs/api/accumulator.md b/packages/pynumaflow/docs/api/accumulator.md new file mode 100644 index 00000000..26268849 --- /dev/null +++ b/packages/pynumaflow/docs/api/accumulator.md @@ -0,0 +1,15 @@ +# Accumulator + +This module offers tools for accumulating and processing data while managing state. 
With it, you can: + +- Accumulate data over time +- Maintain state across messages +- Process accumulated data + +## Classes + +::: pynumaflow.accumulator + options: + show_root_heading: false + show_root_full_path: false + members_order: source diff --git a/packages/pynumaflow/docs/api/batchmapper.md b/packages/pynumaflow/docs/api/batchmapper.md new file mode 100644 index 00000000..006b3820 --- /dev/null +++ b/packages/pynumaflow/docs/api/batchmapper.md @@ -0,0 +1,11 @@ +# Batch Mapper + +The Batch Mapper module offers tools for building BatchMap UDFs, allowing you to process multiple messages simultaneously. This enables more efficient handling of workloads such as bulk API requests or batch database operations by grouping messages and processing them together in a single operation. + +## Classes + +::: pynumaflow.batchmapper + options: + show_root_heading: false + show_root_full_path: false + members_order: source diff --git a/packages/pynumaflow/docs/api/index.md b/packages/pynumaflow/docs/api/index.md new file mode 100644 index 00000000..dad7fb6d --- /dev/null +++ b/packages/pynumaflow/docs/api/index.md @@ -0,0 +1,18 @@ +# API Reference + +This section provides detailed API documentation for all pynumaflow modules. 
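A recurring distinction across the modules below is whether a UDF hands results back in one batch (Map, BatchMap, Reduce) or streams them incrementally (MapStream, ReduceStream). A plain-Python analogy of that difference (not the pynumaflow API):

```python
from typing import Iterator

def map_batch(values: list) -> list:
    # Map-style: the full result list is returned only when the handler finishes.
    return [v * 2 for v in values]

def map_stream(values: list) -> Iterator[int]:
    # MapStream-style: each result is yielded as soon as it is ready, so
    # downstream consumers see output before the input is exhausted.
    for v in values:
        yield v * 2

batch = map_batch([1, 2, 3])
streamed = list(map_stream([1, 2, 3]))
```

Both forms produce the same results; the streaming form simply lowers the latency to the first result.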
+ +## Modules + +| Module | Description | +|--------|-------------| +| [Sourcer](sourcer.md) | User Defined Source for custom data sources | +| [Source Transformer](sourcetransformer.md) | Transform data at ingestion | +| [Mapper](mapper.md) | Map UDF for transforming messages one at a time | +| [Map Streamer](mapstreamer.md) | MapStream UDF for streaming results as they're produced | +| [Batch Mapper](batchmapper.md) | BatchMap UDF for processing messages in batches | +| [Sinker](sinker.md) | User Defined Sink for custom data destinations | +| [Reducer](reducer.md) | Reduce UDF for aggregating messages by key and time window | +| [Reduce Streamer](reducestreamer.md) | Stream reduce results incrementally | +| [Accumulator](accumulator.md) | Accumulate and process data with state | +| [Side Input](sideinput.md) | Inject external data into UDFs | diff --git a/packages/pynumaflow/docs/api/mapper.md b/packages/pynumaflow/docs/api/mapper.md new file mode 100644 index 00000000..8508e62e --- /dev/null +++ b/packages/pynumaflow/docs/api/mapper.md @@ -0,0 +1,16 @@ +# Mapper + +The Mapper module provides classes and functions for implementing Map UDFs that transform messages one at a time. +Map is the most common UDF type. It receives one message at a time and can return: + +- One message (1:1 transformation) +- Multiple messages (fan-out) +- No messages (filter/drop) + +## Classes + +::: pynumaflow.mapper + options: + show_root_heading: false + show_root_full_path: false + members_order: source diff --git a/packages/pynumaflow/docs/api/mapstreamer.md b/packages/pynumaflow/docs/api/mapstreamer.md new file mode 100644 index 00000000..bfa42527 --- /dev/null +++ b/packages/pynumaflow/docs/api/mapstreamer.md @@ -0,0 +1,12 @@ +# Map Streamer + +The Map Streamer module provides classes and functions for implementing MapStream UDFs that stream results as they're produced. 
+Unlike regular Map which returns all messages at once, Map Stream yields messages one at a time as they're ready, reducing latency for downstream consumers. + +## Classes + +::: pynumaflow.mapstreamer + options: + show_root_heading: false + show_root_full_path: false + members_order: source diff --git a/packages/pynumaflow/docs/api/reducer.md b/packages/pynumaflow/docs/api/reducer.md new file mode 100644 index 00000000..7b9fce19 --- /dev/null +++ b/packages/pynumaflow/docs/api/reducer.md @@ -0,0 +1,12 @@ +# Reducer + +The Reducer module provides classes and functions for implementing Reduce UDFs that aggregate messages by key within time windows. +It's used for operations like counting, summing, or computing statistics over groups of messages. + +## Classes + +::: pynumaflow.reducer + options: + show_root_heading: false + show_root_full_path: false + members_order: source diff --git a/packages/pynumaflow/docs/api/reducestreamer.md b/packages/pynumaflow/docs/api/reducestreamer.md new file mode 100644 index 00000000..112b0448 --- /dev/null +++ b/packages/pynumaflow/docs/api/reducestreamer.md @@ -0,0 +1,12 @@ +# Reduce Streamer + +The Reduce Streamer module provides classes and functions for implementing ReduceStream UDFs that emit results incrementally during reduction. +Unlike regular Reduce which outputs only when the window closes, Reduce Stream emits results as they're computed. This is useful for early alerts or real-time dashboards. + +## Classes + +::: pynumaflow.reducestreamer + options: + show_root_heading: false + show_root_full_path: false + members_order: source diff --git a/packages/pynumaflow/docs/api/sideinput.md b/packages/pynumaflow/docs/api/sideinput.md new file mode 100644 index 00000000..b79ccf63 --- /dev/null +++ b/packages/pynumaflow/docs/api/sideinput.md @@ -0,0 +1,11 @@ +# Side Input + +Side Input allows you to inject external data into your UDFs. 
This is useful for configuration, lookup tables, or any data that UDFs need but isn't part of the main data stream. + +## Classes + +::: pynumaflow.sideinput + options: + show_root_heading: false + show_root_full_path: false + members_order: source diff --git a/packages/pynumaflow/docs/api/sinker.md b/packages/pynumaflow/docs/api/sinker.md new file mode 100644 index 00000000..87c769ce --- /dev/null +++ b/packages/pynumaflow/docs/api/sinker.md @@ -0,0 +1,11 @@ +# Sinker + +The Sinker module provides classes and functions for implementing User Defined Sinks that write processed data to external systems (database, Kafka topic, etc.). + +## Classes + +::: pynumaflow.sinker + options: + show_root_heading: false + show_root_full_path: false + members_order: source diff --git a/packages/pynumaflow/docs/api/sourcer.md b/packages/pynumaflow/docs/api/sourcer.md new file mode 100644 index 00000000..0cd9a265 --- /dev/null +++ b/packages/pynumaflow/docs/api/sourcer.md @@ -0,0 +1,11 @@ +# Sourcer + +The Sourcer module provides classes and functions for implementing User Defined Sources that produce messages for Numaflow pipelines. + +## Classes + +::: pynumaflow.sourcer + options: + show_root_heading: false + show_root_full_path: false + members_order: source diff --git a/packages/pynumaflow/docs/api/sourcetransformer.md b/packages/pynumaflow/docs/api/sourcetransformer.md new file mode 100644 index 00000000..0df14a85 --- /dev/null +++ b/packages/pynumaflow/docs/api/sourcetransformer.md @@ -0,0 +1,18 @@ +# Source Transformer + +The Source Transformer module provides classes and functions for implementing Source Transform UDFs that transform data immediately after it's read from a source. 
+Source Transform is useful for: + +- Parsing/deserializing data at ingestion +- Filtering messages early +- Assigning event times +- Adding metadata +- Routing messages with tags + +## Classes + +::: pynumaflow.sourcetransformer + options: + show_root_heading: false + show_root_full_path: false + members_order: source diff --git a/packages/pynumaflow/docs/changelog.md b/packages/pynumaflow/docs/changelog.md new file mode 100644 index 00000000..b8bded4e --- /dev/null +++ b/packages/pynumaflow/docs/changelog.md @@ -0,0 +1,111 @@ +# Changelog + +All notable changes to pynumaflow will be documented in this page. + +The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/), +and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html). + +## [0.11.0] - Latest + +### Added + +- Accumulator functionality for stateful data accumulation +- ReduceStream for streaming reduce results +- Improved type hints throughout the codebase + +### Changed + +- Updated dependencies to latest versions +- Enhanced error handling in gRPC servers + +### Fixed + +- Various bug fixes and performance improvements + +--- + +## [0.10.0] + +### Added + +- BatchMap support for processing messages in batches +- MultiProcess server support for Map and SourceTransform +- Improved async server implementations + +### Changed + +- Refactored server architecture for better performance +- Updated protobuf definitions + +--- + +## [0.9.0] + +### Added + +- MapStream functionality +- Side Input support +- Enhanced metadata in Datum objects + +### Changed + +- Improved connection handling +- Better error messages + +--- + +## [0.8.0] + +### Added + +- User Defined Source support +- Source Transform functionality +- Headers support in messages + +### Changed + +- Updated gRPC communication protocol +- Enhanced logging + +--- + +## [0.7.0] + +### Added + +- Async server support for Map and Reduce +- User Defined Sink functionality +- Tagging support for message 
routing + +### Changed + +- Improved memory management +- Better type annotations + +--- + +## [0.6.0] + +### Added + +- Basic Map and Reduce UDF support +- gRPC server implementation +- Initial documentation + +--- + +## Upgrade Guide + +### Upgrading to 0.11.0 + +No breaking changes. New features are additive. + +### Upgrading to 0.10.0 + +If using custom server configurations, review the new server options. + +--- + +## Release Notes + +For detailed release notes, see the [GitHub Releases](https://github.com/numaproj/numaflow-python/releases) page. diff --git a/packages/pynumaflow/docs/contributing.md b/packages/pynumaflow/docs/contributing.md new file mode 100644 index 00000000..48953bf4 --- /dev/null +++ b/packages/pynumaflow/docs/contributing.md @@ -0,0 +1,232 @@ +# Contributing + +Thank you for your interest in contributing to pynumaflow! This guide will help you get started. + +## Development Setup + +### Prerequisites + +- Python 3.9 or higher +- [Poetry](https://python-poetry.org/) for dependency management +- Git + +### Clone and Install + +```bash +# Clone the repository +git clone https://github.com/numaproj/numaflow-python.git +cd numaflow-python/packages/pynumaflow + +# Install dependencies +make setup +``` + +This will install all development dependencies including testing and linting tools. + +### Verify Installation + +```bash +# Run tests +make test + +# Run linting +make lint +``` + +## Code Style + +We use [Black](https://black.readthedocs.io/) for code formatting and [Ruff](https://github.com/astral-sh/ruff) for linting. 
+ +### Format Code + +```bash +make format +``` + +### Lint Code + +```bash +make lint +``` + +### Pre-commit Hooks + +We recommend setting up pre-commit hooks to automatically format and lint code before commits: + +```bash +pre-commit install +``` + +## Testing + +### Run All Tests + +```bash +make test +``` + +### Run Specific Tests + +```bash +poetry run pytest tests/test_mapper.py -v +``` + +### Test Coverage + +```bash +poetry run pytest tests/ --cov=pynumaflow --cov-report=html +``` + +## Project Structure + +``` +packages/pynumaflow/ +├── pynumaflow/ # Main package +│ ├── mapper/ # Map UDF implementation +│ ├── reducer/ # Reduce UDF implementation +│ ├── mapstreamer/ # MapStream implementation +│ ├── batchmapper/ # BatchMap implementation +│ ├── sourcer/ # Source implementation +│ ├── sinker/ # Sink implementation +│ ├── sourcetransformer/# SourceTransform implementation +│ ├── sideinput/ # SideInput implementation +│ ├── reducestreamer/ # ReduceStream implementation +│ ├── accumulator/ # Accumulator implementation +│ ├── proto/ # Protocol buffer definitions +│ ├── shared/ # Shared utilities +│ └── types.py # Common type definitions +├── tests/ # Test files +├── examples/ # Example implementations +├── docs/ # Documentation source +├── pyproject.toml # Project configuration +└── Makefile # Development commands +``` + +## Making Changes + +### 1. Create a Branch + +```bash +git checkout -b feature/your-feature-name +``` + +### 2. Make Your Changes + +- Write clear, documented code +- Follow the existing code style +- Add tests for new functionality +- Update documentation as needed + +### 3. Test Your Changes + +```bash +make test +make lint +``` + +### 4. Commit Your Changes + +Write clear commit messages: + +```bash +git commit -m "Add feature: description of your change" +``` + +### 5. Push and Create PR + +```bash +git push origin feature/your-feature-name +``` + +Then create a Pull Request on GitHub. 
+ +## Protocol Buffers + +If you need to modify protocol buffer definitions: + +1. Edit the `.proto` files in `pynumaflow/proto/` +2. Regenerate Python files: + +```bash +make proto +``` + +## Documentation + +### Local Preview + +```bash +# Install docs dependencies +poetry install --with docs + +# Serve locally +make docs-serve +``` + +Visit `http://localhost:8000` to preview. + +### Writing Documentation + +- Documentation source files are in `docs/` +- Use Markdown with MkDocs extensions +- Include code examples +- Keep explanations clear and concise + +## Pull Request Guidelines + +### Before Submitting + +- [ ] Tests pass (`make test`) +- [ ] Code is formatted (`make lint`) +- [ ] Documentation is updated (if applicable) +- [ ] Commit messages are clear + +### PR Description + +Include: + +- What the change does +- Why the change is needed +- How to test the change +- Any breaking changes + +### Review Process + +1. A maintainer will review your PR +2. Address any feedback +3. Once approved, your PR will be merged + +## Reporting Issues + +### Bug Reports + +Include: + +- Python version +- pynumaflow version +- Steps to reproduce +- Expected behavior +- Actual behavior +- Error messages (if any) + +### Feature Requests + +Include: + +- Use case description +- Proposed solution +- Alternatives considered + +## Code of Conduct + +Please be respectful and constructive in all interactions. We're all here to build great software together. + +## Getting Help + +- [GitHub Issues](https://github.com/numaproj/numaflow-python/issues) +- [Numaflow Slack](https://numaproj.slack.com) +- [Numaflow Documentation](https://numaflow.numaproj.io/) + +## License + +By contributing, you agree that your contributions will be licensed under the Apache 2.0 License. 
diff --git a/packages/pynumaflow/docs/index.md b/packages/pynumaflow/docs/index.md new file mode 120000 index 00000000..32d46ee8 --- /dev/null +++ b/packages/pynumaflow/docs/index.md @@ -0,0 +1 @@ +../README.md \ No newline at end of file diff --git a/packages/pynumaflow/mkdocs.yml b/packages/pynumaflow/mkdocs.yml new file mode 100644 index 00000000..b326e483 --- /dev/null +++ b/packages/pynumaflow/mkdocs.yml @@ -0,0 +1,131 @@ +site_name: pynumaflow +site_description: Python SDK for Numaflow - Build real-time data processing pipelines +site_url: https://numaproj.github.io/numaflow-python/ +strict: false # Set to false to allow build with minor docstring warnings + +theme: + name: material + palette: + - media: "(prefers-color-scheme: light)" + scheme: default + primary: indigo + accent: indigo + toggle: + icon: material/brightness-7 + name: Switch to dark mode + - media: "(prefers-color-scheme: dark)" + scheme: slate + primary: indigo + accent: indigo + toggle: + icon: material/brightness-4 + name: Switch to light mode + features: + - content.code.copy + - content.code.annotate + - content.tabs.link + - navigation.tabs + - navigation.tabs.sticky + - navigation.sections + - navigation.expand + - navigation.top + - navigation.tracking + - navigation.indexes + - search.suggest + - search.highlight + - toc.follow + icon: + repo: fontawesome/brands/github + +repo_name: numaproj/numaflow-python +repo_url: https://github.com/numaproj/numaflow-python +edit_uri: edit/main/packages/pynumaflow/docs/ + +extra: + version: + provider: mike + default: latest + social: + - icon: fontawesome/brands/github + link: https://github.com/numaproj/numaflow-python + - icon: fontawesome/brands/slack + link: https://numaproj.slack.com + +nav: + - Home: index.md + - API Reference: + - api/index.md + - Sourcer: api/sourcer.md + - Source Transformer: api/sourcetransformer.md + - Mapper: api/mapper.md + - Map Streamer: api/mapstreamer.md + - Batch Mapper: api/batchmapper.md + - Sinker: 
api/sinker.md + - Reducer: api/reducer.md + - Reduce Streamer: api/reducestreamer.md + - Accumulator: api/accumulator.md + - Side Input: api/sideinput.md + +markdown_extensions: + - tables + - toc: + permalink: true + - admonition + - pymdownx.details + - pymdownx.superfences: + custom_fences: + - name: mermaid + class: mermaid + format: !!python/name:pymdownx.superfences.fence_code_format + - pymdownx.highlight: + anchor_linenums: true + line_spans: __span + pygments_lang_class: true + - pymdownx.inlinehilite + - pymdownx.snippets + - pymdownx.tabbed: + alternate_style: true + - attr_list + - md_in_html + - pymdownx.emoji: + emoji_index: !!python/name:material.extensions.emoji.twemoji + emoji_generator: !!python/name:material.extensions.emoji.to_svg + +plugins: + - search + - mike: + alias_type: symlink + canonical_version: latest + - mkdocstrings: + default_handler: python + handlers: + python: + paths: [.] + import: + - https://docs.python.org/3/objects.inv + - https://grpc.github.io/grpc/python/objects.inv + options: + members_order: source + separate_signature: true + show_signature_annotations: true + signature_crossrefs: true + docstring_style: google + docstring_options: + ignore_init_summary: true + returns_named_value: false + trim_doctest_flags: true + docstring_section_style: table + merge_init_into_class: true + show_root_heading: true + show_root_full_path: false + show_symbol_type_heading: true + show_symbol_type_toc: true + filters: + - "!^_" # Exclude private/dunder members (starting with _) + - exclude: + glob: + - "*.tmp" + - "*.bak" + +watch: + - pynumaflow diff --git a/packages/pynumaflow/poetry.lock b/packages/pynumaflow/poetry.lock index a58a57eb..5cf7fb15 100644 --- a/packages/pynumaflow/poetry.lock +++ b/packages/pynumaflow/poetry.lock @@ -15,6 +15,41 @@ files = [ [package.extras] dev = ["pytest", "pytest-cov"] +[[package]] +name = "babel" +version = "2.17.0" +description = "Internationalization utilities" +optional = false +python-versions 
= ">=3.8" +groups = ["docs"] +files = [ + {file = "babel-2.17.0-py3-none-any.whl", hash = "sha256:4d0b53093fdfb4b21c92b5213dba5a1b23885afa8383709427046b21c366e5f2"}, + {file = "babel-2.17.0.tar.gz", hash = "sha256:0c54cffb19f690cdcc52a3b50bcbf71e07a808d1c80d549f2459b9d2cf0afb9d"}, +] + +[package.extras] +dev = ["backports.zoneinfo ; python_version < \"3.9\"", "freezegun (>=1.0,<2.0)", "jinja2 (>=3.0)", "pytest (>=6.0)", "pytest-cov", "pytz", "setuptools", "tzdata ; sys_platform == \"win32\""] + +[[package]] +name = "backrefs" +version = "6.1" +description = "A wrapper around re and regex that adds additional back references." +optional = false +python-versions = ">=3.9" +groups = ["docs"] +files = [ + {file = "backrefs-6.1-py310-none-any.whl", hash = "sha256:2a2ccb96302337ce61ee4717ceacfbf26ba4efb1d55af86564b8bbaeda39cac1"}, + {file = "backrefs-6.1-py311-none-any.whl", hash = "sha256:e82bba3875ee4430f4de4b6db19429a27275d95a5f3773c57e9e18abc23fd2b7"}, + {file = "backrefs-6.1-py312-none-any.whl", hash = "sha256:c64698c8d2269343d88947c0735cb4b78745bd3ba590e10313fbf3f78c34da5a"}, + {file = "backrefs-6.1-py313-none-any.whl", hash = "sha256:4c9d3dc1e2e558965202c012304f33d4e0e477e1c103663fd2c3cc9bb18b0d05"}, + {file = "backrefs-6.1-py314-none-any.whl", hash = "sha256:13eafbc9ccd5222e9c1f0bec563e6d2a6d21514962f11e7fc79872fd56cbc853"}, + {file = "backrefs-6.1-py39-none-any.whl", hash = "sha256:a9e99b8a4867852cad177a6430e31b0f6e495d65f8c6c134b68c14c3c95bf4b0"}, + {file = "backrefs-6.1.tar.gz", hash = "sha256:3bba1749aafe1db9b915f00e0dd166cba613b6f788ffd63060ac3485dc9be231"}, +] + +[package.extras] +extras = ["regex"] + [[package]] name = "black" version = "23.12.1" @@ -74,18 +109,158 @@ files = [ {file = "cachetools-5.5.1.tar.gz", hash = "sha256:70f238fbba50383ef62e55c6aff6d9673175fe59f7c6782c7a0b9e38f4a9df95"}, ] +[[package]] +name = "cairocffi" +version = "1.7.1" +description = "cffi-based cairo bindings for Python" +optional = false +python-versions = ">=3.8" +groups = 
["docs"] +files = [ + {file = "cairocffi-1.7.1-py3-none-any.whl", hash = "sha256:9803a0e11f6c962f3b0ae2ec8ba6ae45e957a146a004697a1ac1bbf16b073b3f"}, + {file = "cairocffi-1.7.1.tar.gz", hash = "sha256:2e48ee864884ec4a3a34bfa8c9ab9999f688286eb714a15a43ec9d068c36557b"}, +] + +[package.dependencies] +cffi = ">=1.1.0" + +[package.extras] +doc = ["sphinx", "sphinx_rtd_theme"] +test = ["numpy", "pikepdf", "pytest", "ruff"] +xcb = ["xcffib (>=1.4.0)"] + +[[package]] +name = "cairosvg" +version = "2.8.2" +description = "A Simple SVG Converter based on Cairo" +optional = false +python-versions = ">=3.9" +groups = ["docs"] +files = [ + {file = "cairosvg-2.8.2-py3-none-any.whl", hash = "sha256:eab46dad4674f33267a671dce39b64be245911c901c70d65d2b7b0821e852bf5"}, + {file = "cairosvg-2.8.2.tar.gz", hash = "sha256:07cbf4e86317b27a92318a4cac2a4bb37a5e9c1b8a27355d06874b22f85bef9f"}, +] + +[package.dependencies] +cairocffi = "*" +cssselect2 = "*" +defusedxml = "*" +pillow = "*" +tinycss2 = "*" + +[package.extras] +doc = ["sphinx", "sphinx_rtd_theme"] +test = ["flake8", "isort", "pytest"] + [[package]] name = "certifi" version = "2024.12.14" description = "Python package for providing Mozilla's CA Bundle." optional = false python-versions = ">=3.6" -groups = ["main"] +groups = ["main", "docs"] files = [ {file = "certifi-2024.12.14-py3-none-any.whl", hash = "sha256:1275f7a45be9464efc1173084eaa30f866fe2e47d389406136d332ed4967ec56"}, {file = "certifi-2024.12.14.tar.gz", hash = "sha256:b650d30f370c2b724812bee08008be0c4163b163ddaec3f2546c1caf65f191db"}, ] +[[package]] +name = "cffi" +version = "2.0.0" +description = "Foreign Function Interface for Python calling C code." 
+optional = false +python-versions = ">=3.9" +groups = ["docs"] +files = [ + {file = "cffi-2.0.0-cp310-cp310-macosx_10_13_x86_64.whl", hash = "sha256:0cf2d91ecc3fcc0625c2c530fe004f82c110405f101548512cce44322fa8ac44"}, + {file = "cffi-2.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f73b96c41e3b2adedc34a7356e64c8eb96e03a3782b535e043a986276ce12a49"}, + {file = "cffi-2.0.0-cp310-cp310-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:53f77cbe57044e88bbd5ed26ac1d0514d2acf0591dd6bb02a3ae37f76811b80c"}, + {file = "cffi-2.0.0-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:3e837e369566884707ddaf85fc1744b47575005c0a229de3327f8f9a20f4efeb"}, + {file = "cffi-2.0.0-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:5eda85d6d1879e692d546a078b44251cdd08dd1cfb98dfb77b670c97cee49ea0"}, + {file = "cffi-2.0.0-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:9332088d75dc3241c702d852d4671613136d90fa6881da7d770a483fd05248b4"}, + {file = "cffi-2.0.0-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:fc7de24befaeae77ba923797c7c87834c73648a05a4bde34b3b7e5588973a453"}, + {file = "cffi-2.0.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:cf364028c016c03078a23b503f02058f1814320a56ad535686f90565636a9495"}, + {file = "cffi-2.0.0-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e11e82b744887154b182fd3e7e8512418446501191994dbf9c9fc1f32cc8efd5"}, + {file = "cffi-2.0.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:8ea985900c5c95ce9db1745f7933eeef5d314f0565b27625d9a10ec9881e1bfb"}, + {file = "cffi-2.0.0-cp310-cp310-win32.whl", hash = "sha256:1f72fb8906754ac8a2cc3f9f5aaa298070652a0ffae577e0ea9bd480dc3c931a"}, + {file = "cffi-2.0.0-cp310-cp310-win_amd64.whl", hash = "sha256:b18a3ed7d5b3bd8d9ef7a8cb226502c6bf8308df1525e1cc676c3680e7176739"}, + {file = "cffi-2.0.0-cp311-cp311-macosx_10_13_x86_64.whl", hash = 
"sha256:b4c854ef3adc177950a8dfc81a86f5115d2abd545751a304c5bcf2c2c7283cfe"}, + {file = "cffi-2.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:2de9a304e27f7596cd03d16f1b7c72219bd944e99cc52b84d0145aefb07cbd3c"}, + {file = "cffi-2.0.0-cp311-cp311-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:baf5215e0ab74c16e2dd324e8ec067ef59e41125d3eade2b863d294fd5035c92"}, + {file = "cffi-2.0.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:730cacb21e1bdff3ce90babf007d0a0917cc3e6492f336c2f0134101e0944f93"}, + {file = "cffi-2.0.0-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:6824f87845e3396029f3820c206e459ccc91760e8fa24422f8b0c3d1731cbec5"}, + {file = "cffi-2.0.0-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:9de40a7b0323d889cf8d23d1ef214f565ab154443c42737dfe52ff82cf857664"}, + {file = "cffi-2.0.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:8941aaadaf67246224cee8c3803777eed332a19d909b47e29c9842ef1e79ac26"}, + {file = "cffi-2.0.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:a05d0c237b3349096d3981b727493e22147f934b20f6f125a3eba8f994bec4a9"}, + {file = "cffi-2.0.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:94698a9c5f91f9d138526b48fe26a199609544591f859c870d477351dc7b2414"}, + {file = "cffi-2.0.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:5fed36fccc0612a53f1d4d9a816b50a36702c28a2aa880cb8a122b3466638743"}, + {file = "cffi-2.0.0-cp311-cp311-win32.whl", hash = "sha256:c649e3a33450ec82378822b3dad03cc228b8f5963c0c12fc3b1e0ab940f768a5"}, + {file = "cffi-2.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:66f011380d0e49ed280c789fbd08ff0d40968ee7b665575489afa95c98196ab5"}, + {file = "cffi-2.0.0-cp311-cp311-win_arm64.whl", hash = "sha256:c6638687455baf640e37344fe26d37c404db8b80d037c3d29f58fe8d1c3b194d"}, + {file = "cffi-2.0.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = 
"sha256:6d02d6655b0e54f54c4ef0b94eb6be0607b70853c45ce98bd278dc7de718be5d"}, + {file = "cffi-2.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8eca2a813c1cb7ad4fb74d368c2ffbbb4789d377ee5bb8df98373c2cc0dee76c"}, + {file = "cffi-2.0.0-cp312-cp312-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:21d1152871b019407d8ac3985f6775c079416c282e431a4da6afe7aefd2bccbe"}, + {file = "cffi-2.0.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:b21e08af67b8a103c71a250401c78d5e0893beff75e28c53c98f4de42f774062"}, + {file = "cffi-2.0.0-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:1e3a615586f05fc4065a8b22b8152f0c1b00cdbc60596d187c2a74f9e3036e4e"}, + {file = "cffi-2.0.0-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:81afed14892743bbe14dacb9e36d9e0e504cd204e0b165062c488942b9718037"}, + {file = "cffi-2.0.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:3e17ed538242334bf70832644a32a7aae3d83b57567f9fd60a26257e992b79ba"}, + {file = "cffi-2.0.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3925dd22fa2b7699ed2617149842d2e6adde22b262fcbfada50e3d195e4b3a94"}, + {file = "cffi-2.0.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:2c8f814d84194c9ea681642fd164267891702542f028a15fc97d4674b6206187"}, + {file = "cffi-2.0.0-cp312-cp312-win32.whl", hash = "sha256:da902562c3e9c550df360bfa53c035b2f241fed6d9aef119048073680ace4a18"}, + {file = "cffi-2.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:da68248800ad6320861f129cd9c1bf96ca849a2771a59e0344e88681905916f5"}, + {file = "cffi-2.0.0-cp312-cp312-win_arm64.whl", hash = "sha256:4671d9dd5ec934cb9a73e7ee9676f9362aba54f7f34910956b84d727b0d73fb6"}, + {file = "cffi-2.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:00bdf7acc5f795150faa6957054fbbca2439db2f775ce831222b66f192f03beb"}, + {file = "cffi-2.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = 
"sha256:45d5e886156860dc35862657e1494b9bae8dfa63bf56796f2fb56e1679fc0bca"}, + {file = "cffi-2.0.0-cp313-cp313-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:07b271772c100085dd28b74fa0cd81c8fb1a3ba18b21e03d7c27f3436a10606b"}, + {file = "cffi-2.0.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:d48a880098c96020b02d5a1f7d9251308510ce8858940e6fa99ece33f610838b"}, + {file = "cffi-2.0.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:f93fd8e5c8c0a4aa1f424d6173f14a892044054871c771f8566e4008eaa359d2"}, + {file = "cffi-2.0.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:dd4f05f54a52fb558f1ba9f528228066954fee3ebe629fc1660d874d040ae5a3"}, + {file = "cffi-2.0.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c8d3b5532fc71b7a77c09192b4a5a200ea992702734a2e9279a37f2478236f26"}, + {file = "cffi-2.0.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:d9b29c1f0ae438d5ee9acb31cadee00a58c46cc9c0b2f9038c6b0b3470877a8c"}, + {file = "cffi-2.0.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6d50360be4546678fc1b79ffe7a66265e28667840010348dd69a314145807a1b"}, + {file = "cffi-2.0.0-cp313-cp313-win32.whl", hash = "sha256:74a03b9698e198d47562765773b4a8309919089150a0bb17d829ad7b44b60d27"}, + {file = "cffi-2.0.0-cp313-cp313-win_amd64.whl", hash = "sha256:19f705ada2530c1167abacb171925dd886168931e0a7b78f5bffcae5c6b5be75"}, + {file = "cffi-2.0.0-cp313-cp313-win_arm64.whl", hash = "sha256:256f80b80ca3853f90c21b23ee78cd008713787b1b1e93eae9f3d6a7134abd91"}, + {file = "cffi-2.0.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:fc33c5141b55ed366cfaad382df24fe7dcbc686de5be719b207bb248e3053dc5"}, + {file = "cffi-2.0.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c654de545946e0db659b3400168c9ad31b5d29593291482c43e3564effbcee13"}, + {file = "cffi-2.0.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = 
"sha256:24b6f81f1983e6df8db3adc38562c83f7d4a0c36162885ec7f7b77c7dcbec97b"}, + {file = "cffi-2.0.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:12873ca6cb9b0f0d3a0da705d6086fe911591737a59f28b7936bdfed27c0d47c"}, + {file = "cffi-2.0.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:d9b97165e8aed9272a6bb17c01e3cc5871a594a446ebedc996e2397a1c1ea8ef"}, + {file = "cffi-2.0.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:afb8db5439b81cf9c9d0c80404b60c3cc9c3add93e114dcae767f1477cb53775"}, + {file = "cffi-2.0.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:737fe7d37e1a1bffe70bd5754ea763a62a066dc5913ca57e957824b72a85e205"}, + {file = "cffi-2.0.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:38100abb9d1b1435bc4cc340bb4489635dc2f0da7456590877030c9b3d40b0c1"}, + {file = "cffi-2.0.0-cp314-cp314-win32.whl", hash = "sha256:087067fa8953339c723661eda6b54bc98c5625757ea62e95eb4898ad5e776e9f"}, + {file = "cffi-2.0.0-cp314-cp314-win_amd64.whl", hash = "sha256:203a48d1fb583fc7d78a4c6655692963b860a417c0528492a6bc21f1aaefab25"}, + {file = "cffi-2.0.0-cp314-cp314-win_arm64.whl", hash = "sha256:dbd5c7a25a7cb98f5ca55d258b103a2054f859a46ae11aaf23134f9cc0d356ad"}, + {file = "cffi-2.0.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:9a67fc9e8eb39039280526379fb3a70023d77caec1852002b4da7e8b270c4dd9"}, + {file = "cffi-2.0.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:7a66c7204d8869299919db4d5069a82f1561581af12b11b3c9f48c584eb8743d"}, + {file = "cffi-2.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7cc09976e8b56f8cebd752f7113ad07752461f48a58cbba644139015ac24954c"}, + {file = "cffi-2.0.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:92b68146a71df78564e4ef48af17551a5ddd142e5190cdf2c5624d0c3ff5b2e8"}, + {file = "cffi-2.0.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = 
"sha256:b1e74d11748e7e98e2f426ab176d4ed720a64412b6a15054378afdb71e0f37dc"}, + {file = "cffi-2.0.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:28a3a209b96630bca57cce802da70c266eb08c6e97e5afd61a75611ee6c64592"}, + {file = "cffi-2.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:7553fb2090d71822f02c629afe6042c299edf91ba1bf94951165613553984512"}, + {file = "cffi-2.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:6c6c373cfc5c83a975506110d17457138c8c63016b563cc9ed6e056a82f13ce4"}, + {file = "cffi-2.0.0-cp314-cp314t-win32.whl", hash = "sha256:1fc9ea04857caf665289b7a75923f2c6ed559b8298a1b8c49e59f7dd95c8481e"}, + {file = "cffi-2.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:d68b6cef7827e8641e8ef16f4494edda8b36104d79773a334beaa1e3521430f6"}, + {file = "cffi-2.0.0-cp314-cp314t-win_arm64.whl", hash = "sha256:0a1527a803f0a659de1af2e1fd700213caba79377e27e4693648c2923da066f9"}, + {file = "cffi-2.0.0-cp39-cp39-macosx_10_13_x86_64.whl", hash = "sha256:fe562eb1a64e67dd297ccc4f5addea2501664954f2692b69a76449ec7913ecbf"}, + {file = "cffi-2.0.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:de8dad4425a6ca6e4e5e297b27b5c824ecc7581910bf9aee86cb6835e6812aa7"}, + {file = "cffi-2.0.0-cp39-cp39-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:4647afc2f90d1ddd33441e5b0e85b16b12ddec4fca55f0d9671fef036ecca27c"}, + {file = "cffi-2.0.0-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:3f4d46d8b35698056ec29bca21546e1551a205058ae1a181d871e278b0b28165"}, + {file = "cffi-2.0.0-cp39-cp39-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:e6e73b9e02893c764e7e8d5bb5ce277f1a009cd5243f8228f75f842bf937c534"}, + {file = "cffi-2.0.0-cp39-cp39-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:cb527a79772e5ef98fb1d700678fe031e353e765d1ca2d409c92263c6d43e09f"}, + {file = "cffi-2.0.0-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = 
"sha256:61d028e90346df14fedc3d1e5441df818d095f3b87d286825dfcbd6459b7ef63"}, + {file = "cffi-2.0.0-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:0f6084a0ea23d05d20c3edcda20c3d006f9b6f3fefeac38f59262e10cef47ee2"}, + {file = "cffi-2.0.0-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:1cd13c99ce269b3ed80b417dcd591415d3372bcac067009b6e0f59c7d4015e65"}, + {file = "cffi-2.0.0-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:89472c9762729b5ae1ad974b777416bfda4ac5642423fa93bd57a09204712322"}, + {file = "cffi-2.0.0-cp39-cp39-win32.whl", hash = "sha256:2081580ebb843f759b9f617314a24ed5738c51d2aee65d31e02f6f7a2b97707a"}, + {file = "cffi-2.0.0-cp39-cp39-win_amd64.whl", hash = "sha256:b882b3df248017dba09d6b16defe9b5c407fe32fc7c65a9c69798e6175601be9"}, + {file = "cffi-2.0.0.tar.gz", hash = "sha256:44d1b5909021139fe36001ae048dbdde8214afa20200eda0f64c068cac5d5529"}, +] + +[package.dependencies] +pycparser = {version = "*", markers = "implementation_name != \"PyPy\""} + [[package]] name = "cfgv" version = "3.4.0" @@ -104,7 +279,7 @@ version = "3.4.1" description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet." 
optional = false python-versions = ">=3.7" -groups = ["main"] +groups = ["main", "docs"] files = [ {file = "charset_normalizer-3.4.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:91b36a978b5ae0ee86c394f5a54d6ef44db1de0815eb43de826d41d21e4af3de"}, {file = "charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7461baadb4dc00fd9e0acbe254e3d7d2112e7f92ced2adc96e54ef6501c5f176"}, @@ -206,7 +381,7 @@ version = "8.1.8" description = "Composable command line interface toolkit" optional = false python-versions = ">=3.7" -groups = ["dev"] +groups = ["dev", "docs"] files = [ {file = "click-8.1.8-py3-none-any.whl", hash = "sha256:63c132bbbed01578a06712a2d1f497bb62d9c1c0d329b7903a866228027263b2"}, {file = "click-8.1.8.tar.gz", hash = "sha256:ed53c9d8990d83c2a27deae68e4ee337473f6330c040a31d4225c9574d16096a"}, @@ -221,12 +396,12 @@ version = "0.4.6" description = "Cross-platform colored terminal text." optional = false python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,>=2.7" -groups = ["dev"] -markers = "sys_platform == \"win32\" or platform_system == \"Windows\"" +groups = ["dev", "docs"] files = [ {file = "colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6"}, {file = "colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44"}, ] +markers = {dev = "sys_platform == \"win32\" or platform_system == \"Windows\""} [[package]] name = "coverage" @@ -306,6 +481,38 @@ tomli = {version = "*", optional = true, markers = "python_full_version <= \"3.1 [package.extras] toml = ["tomli ; python_full_version <= \"3.11.0a6\""] +[[package]] +name = "cssselect2" +version = "0.8.0" +description = "CSS selectors for Python ElementTree" +optional = false +python-versions = ">=3.9" +groups = ["docs"] +files = [ + {file = "cssselect2-0.8.0-py3-none-any.whl", hash = 
"sha256:46fc70ebc41ced7a32cd42d58b1884d72ade23d21e5a4eaaf022401c13f0e76e"}, + {file = "cssselect2-0.8.0.tar.gz", hash = "sha256:7674ffb954a3b46162392aee2a3a0aedb2e14ecf99fcc28644900f4e6e3e9d3a"}, +] + +[package.dependencies] +tinycss2 = "*" +webencodings = "*" + +[package.extras] +doc = ["furo", "sphinx"] +test = ["pytest", "ruff"] + +[[package]] +name = "defusedxml" +version = "0.7.1" +description = "XML bomb protection for Python stdlib modules" +optional = false +python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*" +groups = ["docs"] +files = [ + {file = "defusedxml-0.7.1-py2.py3-none-any.whl", hash = "sha256:a352e7e428770286cc899e2542b6cdaedb2b4953ff269a210103ec58f6198a61"}, + {file = "defusedxml-0.7.1.tar.gz", hash = "sha256:1bb3032db185915b62d7c6209c5a8792be6a32ab2fedacc84e01b52c51aa3e69"}, +] + [[package]] name = "distlib" version = "0.3.9" @@ -351,6 +558,24 @@ docs = ["furo (>=2024.8.6)", "sphinx (>=8.1.3)", "sphinx-autodoc-typehints (>=3) testing = ["covdefaults (>=2.3)", "coverage (>=7.6.10)", "diff-cover (>=9.2.1)", "pytest (>=8.3.4)", "pytest-asyncio (>=0.25.2)", "pytest-cov (>=6)", "pytest-mock (>=3.14)", "pytest-timeout (>=2.3.1)", "virtualenv (>=20.28.1)"] typing = ["typing-extensions (>=4.12.2) ; python_version < \"3.11\""] +[[package]] +name = "ghp-import" +version = "2.1.0" +description = "Copy your docs directly to the gh-pages branch." 
+optional = false +python-versions = "*" +groups = ["docs"] +files = [ + {file = "ghp-import-2.1.0.tar.gz", hash = "sha256:9c535c4c61193c2df8871222567d7fd7e5014d835f97dc7b7439069e2413d343"}, + {file = "ghp_import-2.1.0-py3-none-any.whl", hash = "sha256:8337dd7b50877f163d4c0289bc1f1c7f127550241988d568c1db512c4324a619"}, +] + +[package.dependencies] +python-dateutil = ">=2.8.1" + +[package.extras] +dev = ["flake8", "markdown", "twine", "wheel"] + [[package]] name = "google-api-core" version = "2.25.1" @@ -422,6 +647,41 @@ protobuf = ">=3.20.2,<4.21.1 || >4.21.1,<4.21.2 || >4.21.2,<4.21.3 || >4.21.3,<4 [package.extras] grpc = ["grpcio (>=1.44.0,<2.0.0)"] +[[package]] +name = "griffe" +version = "1.14.0" +description = "Signatures for entire Python programs. Extract the structure, the frame, the skeleton of your project, to generate API documentation or find breaking changes in your API." +optional = false +python-versions = ">=3.9" +groups = ["docs"] +markers = "python_version < \"3.13\"" +files = [ + {file = "griffe-1.14.0-py3-none-any.whl", hash = "sha256:0e9d52832cccf0f7188cfe585ba962d2674b241c01916d780925df34873bceb0"}, + {file = "griffe-1.14.0.tar.gz", hash = "sha256:9d2a15c1eca966d68e00517de5d69dd1bc5c9f2335ef6c1775362ba5b8651a13"}, +] + +[package.dependencies] +colorama = ">=0.4" + +[[package]] +name = "griffe" +version = "1.15.0" +description = "Signatures for entire Python programs. Extract the structure, the frame, the skeleton of your project, to generate API documentation or find breaking changes in your API." 
+optional = false +python-versions = ">=3.10" +groups = ["docs"] +markers = "python_version >= \"3.13\"" +files = [ + {file = "griffe-1.15.0-py3-none-any.whl", hash = "sha256:6f6762661949411031f5fcda9593f586e6ce8340f0ba88921a0f2ef7a81eb9a3"}, + {file = "griffe-1.15.0.tar.gz", hash = "sha256:7726e3afd6f298fbc3696e67958803e7ac843c1cfe59734b6251a40cdbfb5eea"}, +] + +[package.dependencies] +colorama = ">=0.4" + +[package.extras] +pypi = ["pip (>=24.0)", "platformdirs (>=4.2)", "wheel (>=0.42)"] + [[package]] name = "grpc-stubs" version = "1.53.0.6" @@ -624,7 +884,7 @@ version = "3.10" description = "Internationalized Domain Names in Applications (IDNA)" optional = false python-versions = ">=3.6" -groups = ["main"] +groups = ["main", "docs"] files = [ {file = "idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3"}, {file = "idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9"}, @@ -633,6 +893,53 @@ files = [ [package.extras] all = ["flake8 (>=7.1.1)", "mypy (>=1.11.2)", "pytest (>=8.3.2)", "ruff (>=0.6.2)"] +[[package]] +name = "importlib-metadata" +version = "8.7.1" +description = "Read metadata from Python packages" +optional = false +python-versions = ">=3.9" +groups = ["docs"] +files = [ + {file = "importlib_metadata-8.7.1-py3-none-any.whl", hash = "sha256:5a1f80bf1daa489495071efbb095d75a634cf28a8bc299581244063b53176151"}, + {file = "importlib_metadata-8.7.1.tar.gz", hash = "sha256:49fef1ae6440c182052f407c8d34a68f72efc36db9ca90dc0113398f2fdde8bb"}, +] + +[package.dependencies] +zipp = ">=3.20" + +[package.extras] +check = ["pytest-checkdocs (>=2.4)", "pytest-ruff (>=0.2.1) ; sys_platform != \"cygwin\""] +cover = ["pytest-cov"] +doc = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-lint"] +enabler = ["pytest-enabler (>=3.4)"] +perf = ["ipython"] +test = ["flufl.flake8", "jaraco.test (>=5.4)", 
"packaging", "pyfakefs", "pytest (>=6,!=8.1.*)", "pytest-perf (>=0.9.2)"] +type = ["mypy (<1.19) ; platform_python_implementation == \"PyPy\"", "pytest-mypy (>=1.0.1)"] + +[[package]] +name = "importlib-resources" +version = "6.5.2" +description = "Read resources from Python packages" +optional = false +python-versions = ">=3.9" +groups = ["docs"] +files = [ + {file = "importlib_resources-6.5.2-py3-none-any.whl", hash = "sha256:789cfdc3ed28c78b67a06acb8126751ced69a3d5f79c095a98298cd8a760ccec"}, + {file = "importlib_resources-6.5.2.tar.gz", hash = "sha256:185f87adef5bcc288449d98fb4fba07cea78bc036455dd44c5fc4a2fe78fed2c"}, +] + +[package.dependencies] +zipp = {version = ">=3.1.0", markers = "python_version < \"3.10\""} + +[package.extras] +check = ["pytest-checkdocs (>=2.4)", "pytest-ruff (>=0.2.1) ; sys_platform != \"cygwin\""] +cover = ["pytest-cov"] +doc = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-lint"] +enabler = ["pytest-enabler (>=2.2)"] +test = ["jaraco.test (>=5.4)", "pytest (>=6,!=8.1.*)", "zipp (>=3.17)"] +type = ["pytest-mypy"] + [[package]] name = "iniconfig" version = "2.0.0" @@ -645,6 +952,370 @@ files = [ {file = "iniconfig-2.0.0.tar.gz", hash = "sha256:2d91e135bf72d31a410b17c16da610a82cb55f6b0477d1a902134b24a455b8b3"}, ] +[[package]] +name = "jinja2" +version = "3.1.6" +description = "A very fast and expressive template engine." +optional = false +python-versions = ">=3.7" +groups = ["docs"] +files = [ + {file = "jinja2-3.1.6-py3-none-any.whl", hash = "sha256:85ece4451f492d0c13c5dd7c13a64681a86afae63a5f347908daf103ce6d2f67"}, + {file = "jinja2-3.1.6.tar.gz", hash = "sha256:0137fb05990d35f1275a587e9aee6d56da821fc83491a0fb838183be43f66d6d"}, +] + +[package.dependencies] +MarkupSafe = ">=2.0" + +[package.extras] +i18n = ["Babel (>=2.7)"] + +[[package]] +name = "markdown" +version = "3.9" +description = "Python implementation of John Gruber's Markdown." 
+optional = false +python-versions = ">=3.9" +groups = ["docs"] +markers = "python_version < \"3.13\"" +files = [ + {file = "markdown-3.9-py3-none-any.whl", hash = "sha256:9f4d91ed810864ea88a6f32c07ba8bee1346c0cc1f6b1f9f6c822f2a9667d280"}, + {file = "markdown-3.9.tar.gz", hash = "sha256:d2900fe1782bd33bdbbd56859defef70c2e78fc46668f8eb9df3128138f2cb6a"}, +] + +[package.dependencies] +importlib-metadata = {version = ">=4.4", markers = "python_version < \"3.10\""} + +[package.extras] +docs = ["mdx_gh_links (>=0.2)", "mkdocs (>=1.6)", "mkdocs-gen-files", "mkdocs-literate-nav", "mkdocs-nature (>=0.6)", "mkdocs-section-index", "mkdocstrings[python]"] +testing = ["coverage", "pyyaml"] + +[[package]] +name = "markdown" +version = "3.10" +description = "Python implementation of John Gruber's Markdown." +optional = false +python-versions = ">=3.10" +groups = ["docs"] +markers = "python_version >= \"3.13\"" +files = [ + {file = "markdown-3.10-py3-none-any.whl", hash = "sha256:b5b99d6951e2e4948d939255596523444c0e677c669700b1d17aa4a8a464cb7c"}, + {file = "markdown-3.10.tar.gz", hash = "sha256:37062d4f2aa4b2b6b32aefb80faa300f82cc790cb949a35b8caede34f2b68c0e"}, +] + +[package.extras] +docs = ["mdx_gh_links (>=0.2)", "mkdocs (>=1.6)", "mkdocs-gen-files", "mkdocs-literate-nav", "mkdocs-nature (>=0.6)", "mkdocs-section-index", "mkdocstrings[python]"] +testing = ["coverage", "pyyaml"] + +[[package]] +name = "markupsafe" +version = "3.0.3" +description = "Safely add untrusted strings to HTML/XML markup." 
+optional = false +python-versions = ">=3.9" +groups = ["docs"] +files = [ + {file = "markupsafe-3.0.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:2f981d352f04553a7171b8e44369f2af4055f888dfb147d55e42d29e29e74559"}, + {file = "markupsafe-3.0.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:e1c1493fb6e50ab01d20a22826e57520f1284df32f2d8601fdd90b6304601419"}, + {file = "markupsafe-3.0.3-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1ba88449deb3de88bd40044603fafffb7bc2b055d626a330323a9ed736661695"}, + {file = "markupsafe-3.0.3-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f42d0984e947b8adf7dd6dde396e720934d12c506ce84eea8476409563607591"}, + {file = "markupsafe-3.0.3-cp310-cp310-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:c0c0b3ade1c0b13b936d7970b1d37a57acde9199dc2aecc4c336773e1d86049c"}, + {file = "markupsafe-3.0.3-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:0303439a41979d9e74d18ff5e2dd8c43ed6c6001fd40e5bf2e43f7bd9bbc523f"}, + {file = "markupsafe-3.0.3-cp310-cp310-musllinux_1_2_riscv64.whl", hash = "sha256:d2ee202e79d8ed691ceebae8e0486bd9a2cd4794cec4824e1c99b6f5009502f6"}, + {file = "markupsafe-3.0.3-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:177b5253b2834fe3678cb4a5f0059808258584c559193998be2601324fdeafb1"}, + {file = "markupsafe-3.0.3-cp310-cp310-win32.whl", hash = "sha256:2a15a08b17dd94c53a1da0438822d70ebcd13f8c3a95abe3a9ef9f11a94830aa"}, + {file = "markupsafe-3.0.3-cp310-cp310-win_amd64.whl", hash = "sha256:c4ffb7ebf07cfe8931028e3e4c85f0357459a3f9f9490886198848f4fa002ec8"}, + {file = "markupsafe-3.0.3-cp310-cp310-win_arm64.whl", hash = "sha256:e2103a929dfa2fcaf9bb4e7c091983a49c9ac3b19c9061b6d5427dd7d14d81a1"}, + {file = "markupsafe-3.0.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:1cc7ea17a6824959616c525620e387f6dd30fec8cb44f649e31712db02123dad"}, + {file = 
"markupsafe-3.0.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:4bd4cd07944443f5a265608cc6aab442e4f74dff8088b0dfc8238647b8f6ae9a"}, + {file = "markupsafe-3.0.3-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6b5420a1d9450023228968e7e6a9ce57f65d148ab56d2313fcd589eee96a7a50"}, + {file = "markupsafe-3.0.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0bf2a864d67e76e5c9a34dc26ec616a66b9888e25e7b9460e1c76d3293bd9dbf"}, + {file = "markupsafe-3.0.3-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:bc51efed119bc9cfdf792cdeaa4d67e8f6fcccab66ed4bfdd6bde3e59bfcbb2f"}, + {file = "markupsafe-3.0.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:068f375c472b3e7acbe2d5318dea141359e6900156b5b2ba06a30b169086b91a"}, + {file = "markupsafe-3.0.3-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:7be7b61bb172e1ed687f1754f8e7484f1c8019780f6f6b0786e76bb01c2ae115"}, + {file = "markupsafe-3.0.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:f9e130248f4462aaa8e2552d547f36ddadbeaa573879158d721bbd33dfe4743a"}, + {file = "markupsafe-3.0.3-cp311-cp311-win32.whl", hash = "sha256:0db14f5dafddbb6d9208827849fad01f1a2609380add406671a26386cdf15a19"}, + {file = "markupsafe-3.0.3-cp311-cp311-win_amd64.whl", hash = "sha256:de8a88e63464af587c950061a5e6a67d3632e36df62b986892331d4620a35c01"}, + {file = "markupsafe-3.0.3-cp311-cp311-win_arm64.whl", hash = "sha256:3b562dd9e9ea93f13d53989d23a7e775fdfd1066c33494ff43f5418bc8c58a5c"}, + {file = "markupsafe-3.0.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:d53197da72cc091b024dd97249dfc7794d6a56530370992a5e1a08983ad9230e"}, + {file = "markupsafe-3.0.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1872df69a4de6aead3491198eaf13810b565bdbeec3ae2dc8780f14458ec73ce"}, + {file = "markupsafe-3.0.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = 
"sha256:3a7e8ae81ae39e62a41ec302f972ba6ae23a5c5396c8e60113e9066ef893da0d"}, + {file = "markupsafe-3.0.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d6dd0be5b5b189d31db7cda48b91d7e0a9795f31430b7f271219ab30f1d3ac9d"}, + {file = "markupsafe-3.0.3-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:94c6f0bb423f739146aec64595853541634bde58b2135f27f61c1ffd1cd4d16a"}, + {file = "markupsafe-3.0.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:be8813b57049a7dc738189df53d69395eba14fb99345e0a5994914a3864c8a4b"}, + {file = "markupsafe-3.0.3-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:83891d0e9fb81a825d9a6d61e3f07550ca70a076484292a70fde82c4b807286f"}, + {file = "markupsafe-3.0.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:77f0643abe7495da77fb436f50f8dab76dbc6e5fd25d39589a0f1fe6548bfa2b"}, + {file = "markupsafe-3.0.3-cp312-cp312-win32.whl", hash = "sha256:d88b440e37a16e651bda4c7c2b930eb586fd15ca7406cb39e211fcff3bf3017d"}, + {file = "markupsafe-3.0.3-cp312-cp312-win_amd64.whl", hash = "sha256:26a5784ded40c9e318cfc2bdb30fe164bdb8665ded9cd64d500a34fb42067b1c"}, + {file = "markupsafe-3.0.3-cp312-cp312-win_arm64.whl", hash = "sha256:35add3b638a5d900e807944a078b51922212fb3dedb01633a8defc4b01a3c85f"}, + {file = "markupsafe-3.0.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:e1cf1972137e83c5d4c136c43ced9ac51d0e124706ee1c8aa8532c1287fa8795"}, + {file = "markupsafe-3.0.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:116bb52f642a37c115f517494ea5feb03889e04df47eeff5b130b1808ce7c219"}, + {file = "markupsafe-3.0.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:133a43e73a802c5562be9bbcd03d090aa5a1fe899db609c29e8c8d815c5f6de6"}, + {file = "markupsafe-3.0.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ccfcd093f13f0f0b7fdd0f198b90053bf7b2f02a3927a30e63f3ccc9df56b676"}, + 
{file = "markupsafe-3.0.3-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:509fa21c6deb7a7a273d629cf5ec029bc209d1a51178615ddf718f5918992ab9"}, + {file = "markupsafe-3.0.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a4afe79fb3de0b7097d81da19090f4df4f8d3a2b3adaa8764138aac2e44f3af1"}, + {file = "markupsafe-3.0.3-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:795e7751525cae078558e679d646ae45574b47ed6e7771863fcc079a6171a0fc"}, + {file = "markupsafe-3.0.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:8485f406a96febb5140bfeca44a73e3ce5116b2501ac54fe953e488fb1d03b12"}, + {file = "markupsafe-3.0.3-cp313-cp313-win32.whl", hash = "sha256:bdd37121970bfd8be76c5fb069c7751683bdf373db1ed6c010162b2a130248ed"}, + {file = "markupsafe-3.0.3-cp313-cp313-win_amd64.whl", hash = "sha256:9a1abfdc021a164803f4d485104931fb8f8c1efd55bc6b748d2f5774e78b62c5"}, + {file = "markupsafe-3.0.3-cp313-cp313-win_arm64.whl", hash = "sha256:7e68f88e5b8799aa49c85cd116c932a1ac15caaa3f5db09087854d218359e485"}, + {file = "markupsafe-3.0.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:218551f6df4868a8d527e3062d0fb968682fe92054e89978594c28e642c43a73"}, + {file = "markupsafe-3.0.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:3524b778fe5cfb3452a09d31e7b5adefeea8c5be1d43c4f810ba09f2ceb29d37"}, + {file = "markupsafe-3.0.3-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4e885a3d1efa2eadc93c894a21770e4bc67899e3543680313b09f139e149ab19"}, + {file = "markupsafe-3.0.3-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8709b08f4a89aa7586de0aadc8da56180242ee0ada3999749b183aa23df95025"}, + {file = "markupsafe-3.0.3-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:b8512a91625c9b3da6f127803b166b629725e68af71f8184ae7e7d54686a56d6"}, + {file = "markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = 
"sha256:9b79b7a16f7fedff2495d684f2b59b0457c3b493778c9eed31111be64d58279f"}, + {file = "markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:12c63dfb4a98206f045aa9563db46507995f7ef6d83b2f68eda65c307c6829eb"}, + {file = "markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:8f71bc33915be5186016f675cd83a1e08523649b0e33efdb898db577ef5bb009"}, + {file = "markupsafe-3.0.3-cp313-cp313t-win32.whl", hash = "sha256:69c0b73548bc525c8cb9a251cddf1931d1db4d2258e9599c28c07ef3580ef354"}, + {file = "markupsafe-3.0.3-cp313-cp313t-win_amd64.whl", hash = "sha256:1b4b79e8ebf6b55351f0d91fe80f893b4743f104bff22e90697db1590e47a218"}, + {file = "markupsafe-3.0.3-cp313-cp313t-win_arm64.whl", hash = "sha256:ad2cf8aa28b8c020ab2fc8287b0f823d0a7d8630784c31e9ee5edea20f406287"}, + {file = "markupsafe-3.0.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:eaa9599de571d72e2daf60164784109f19978b327a3910d3e9de8c97b5b70cfe"}, + {file = "markupsafe-3.0.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c47a551199eb8eb2121d4f0f15ae0f923d31350ab9280078d1e5f12b249e0026"}, + {file = "markupsafe-3.0.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f34c41761022dd093b4b6896d4810782ffbabe30f2d443ff5f083e0cbbb8c737"}, + {file = "markupsafe-3.0.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:457a69a9577064c05a97c41f4e65148652db078a3a509039e64d3467b9e7ef97"}, + {file = "markupsafe-3.0.3-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:e8afc3f2ccfa24215f8cb28dcf43f0113ac3c37c2f0f0806d8c70e4228c5cf4d"}, + {file = "markupsafe-3.0.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:ec15a59cf5af7be74194f7ab02d0f59a62bdcf1a537677ce67a2537c9b87fcda"}, + {file = "markupsafe-3.0.3-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:0eb9ff8191e8498cca014656ae6b8d61f39da5f95b488805da4bb029cccbfbaf"}, + {file = 
"markupsafe-3.0.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:2713baf880df847f2bece4230d4d094280f4e67b1e813eec43b4c0e144a34ffe"}, + {file = "markupsafe-3.0.3-cp314-cp314-win32.whl", hash = "sha256:729586769a26dbceff69f7a7dbbf59ab6572b99d94576a5592625d5b411576b9"}, + {file = "markupsafe-3.0.3-cp314-cp314-win_amd64.whl", hash = "sha256:bdc919ead48f234740ad807933cdf545180bfbe9342c2bb451556db2ed958581"}, + {file = "markupsafe-3.0.3-cp314-cp314-win_arm64.whl", hash = "sha256:5a7d5dc5140555cf21a6fefbdbf8723f06fcd2f63ef108f2854de715e4422cb4"}, + {file = "markupsafe-3.0.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:1353ef0c1b138e1907ae78e2f6c63ff67501122006b0f9abad68fda5f4ffc6ab"}, + {file = "markupsafe-3.0.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:1085e7fbddd3be5f89cc898938f42c0b3c711fdcb37d75221de2666af647c175"}, + {file = "markupsafe-3.0.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1b52b4fb9df4eb9ae465f8d0c228a00624de2334f216f178a995ccdcf82c4634"}, + {file = "markupsafe-3.0.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:fed51ac40f757d41b7c48425901843666a6677e3e8eb0abcff09e4ba6e664f50"}, + {file = "markupsafe-3.0.3-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:f190daf01f13c72eac4efd5c430a8de82489d9cff23c364c3ea822545032993e"}, + {file = "markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:e56b7d45a839a697b5eb268c82a71bd8c7f6c94d6fd50c3d577fa39a9f1409f5"}, + {file = "markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:f3e98bb3798ead92273dc0e5fd0f31ade220f59a266ffd8a4f6065e0a3ce0523"}, + {file = "markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:5678211cb9333a6468fb8d8be0305520aa073f50d17f089b5b4b477ea6e67fdc"}, + {file = "markupsafe-3.0.3-cp314-cp314t-win32.whl", hash = 
"sha256:915c04ba3851909ce68ccc2b8e2cd691618c4dc4c4232fb7982bca3f41fd8c3d"}, + {file = "markupsafe-3.0.3-cp314-cp314t-win_amd64.whl", hash = "sha256:4faffd047e07c38848ce017e8725090413cd80cbc23d86e55c587bf979e579c9"}, + {file = "markupsafe-3.0.3-cp314-cp314t-win_arm64.whl", hash = "sha256:32001d6a8fc98c8cb5c947787c5d08b0a50663d139f1305bac5885d98d9b40fa"}, + {file = "markupsafe-3.0.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:15d939a21d546304880945ca1ecb8a039db6b4dc49b2c5a400387cdae6a62e26"}, + {file = "markupsafe-3.0.3-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:f71a396b3bf33ecaa1626c255855702aca4d3d9fea5e051b41ac59a9c1c41edc"}, + {file = "markupsafe-3.0.3-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0f4b68347f8c5eab4a13419215bdfd7f8c9b19f2b25520968adfad23eb0ce60c"}, + {file = "markupsafe-3.0.3-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e8fc20152abba6b83724d7ff268c249fa196d8259ff481f3b1476383f8f24e42"}, + {file = "markupsafe-3.0.3-cp39-cp39-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:949b8d66bc381ee8b007cd945914c721d9aba8e27f71959d750a46f7c282b20b"}, + {file = "markupsafe-3.0.3-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:3537e01efc9d4dccdf77221fb1cb3b8e1a38d5428920e0657ce299b20324d758"}, + {file = "markupsafe-3.0.3-cp39-cp39-musllinux_1_2_riscv64.whl", hash = "sha256:591ae9f2a647529ca990bc681daebdd52c8791ff06c2bfa05b65163e28102ef2"}, + {file = "markupsafe-3.0.3-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:a320721ab5a1aba0a233739394eb907f8c8da5c98c9181d1161e77a0c8e36f2d"}, + {file = "markupsafe-3.0.3-cp39-cp39-win32.whl", hash = "sha256:df2449253ef108a379b8b5d6b43f4b1a8e81a061d6537becd5582fba5f9196d7"}, + {file = "markupsafe-3.0.3-cp39-cp39-win_amd64.whl", hash = "sha256:7c3fb7d25180895632e5d3148dbdc29ea38ccb7fd210aa27acbd1201a1902c6e"}, + {file = "markupsafe-3.0.3-cp39-cp39-win_arm64.whl", hash = 
"sha256:38664109c14ffc9e7437e86b4dceb442b0096dfe3541d7864d9cbe1da4cf36c8"}, + {file = "markupsafe-3.0.3.tar.gz", hash = "sha256:722695808f4b6457b320fdc131280796bdceb04ab50fe1795cd540799ebe1698"}, +] + +[[package]] +name = "mergedeep" +version = "1.3.4" +description = "A deep merge function for 🐍." +optional = false +python-versions = ">=3.6" +groups = ["docs"] +files = [ + {file = "mergedeep-1.3.4-py3-none-any.whl", hash = "sha256:70775750742b25c0d8f36c55aed03d24c3384d17c951b3175d898bd778ef0307"}, + {file = "mergedeep-1.3.4.tar.gz", hash = "sha256:0096d52e9dad9939c3d975a774666af186eda617e6ca84df4c94dec30004f2a8"}, +] + +[[package]] +name = "mike" +version = "2.1.3" +description = "Manage multiple versions of your MkDocs-powered documentation" +optional = false +python-versions = "*" +groups = ["docs"] +files = [ + {file = "mike-2.1.3-py3-none-any.whl", hash = "sha256:d90c64077e84f06272437b464735130d380703a76a5738b152932884c60c062a"}, + {file = "mike-2.1.3.tar.gz", hash = "sha256:abd79b8ea483fb0275b7972825d3082e5ae67a41820f8d8a0dc7a3f49944e810"}, +] + +[package.dependencies] +importlib-metadata = "*" +importlib-resources = "*" +jinja2 = ">=2.7" +mkdocs = ">=1.0" +pyparsing = ">=3.0" +pyyaml = ">=5.1" +pyyaml-env-tag = "*" +verspec = "*" + +[package.extras] +dev = ["coverage", "flake8 (>=3.0)", "flake8-quotes", "shtab"] +test = ["coverage", "flake8 (>=3.0)", "flake8-quotes", "shtab"] + +[[package]] +name = "mkdocs" +version = "1.6.1" +description = "Project documentation with Markdown." 
+optional = false +python-versions = ">=3.8" +groups = ["docs"] +files = [ + {file = "mkdocs-1.6.1-py3-none-any.whl", hash = "sha256:db91759624d1647f3f34aa0c3f327dd2601beae39a366d6e064c03468d35c20e"}, + {file = "mkdocs-1.6.1.tar.gz", hash = "sha256:7b432f01d928c084353ab39c57282f29f92136665bdd6abf7c1ec8d822ef86f2"}, +] + +[package.dependencies] +click = ">=7.0" +colorama = {version = ">=0.4", markers = "platform_system == \"Windows\""} +ghp-import = ">=1.0" +importlib-metadata = {version = ">=4.4", markers = "python_version < \"3.10\""} +jinja2 = ">=2.11.1" +markdown = ">=3.3.6" +markupsafe = ">=2.0.1" +mergedeep = ">=1.3.4" +mkdocs-get-deps = ">=0.2.0" +packaging = ">=20.5" +pathspec = ">=0.11.1" +pyyaml = ">=5.1" +pyyaml-env-tag = ">=0.1" +watchdog = ">=2.0" + +[package.extras] +i18n = ["babel (>=2.9.0)"] +min-versions = ["babel (==2.9.0)", "click (==7.0)", "colorama (==0.4) ; platform_system == \"Windows\"", "ghp-import (==1.0)", "importlib-metadata (==4.4) ; python_version < \"3.10\"", "jinja2 (==2.11.1)", "markdown (==3.3.6)", "markupsafe (==2.0.1)", "mergedeep (==1.3.4)", "mkdocs-get-deps (==0.2.0)", "packaging (==20.5)", "pathspec (==0.11.1)", "pyyaml (==5.1)", "pyyaml-env-tag (==0.1)", "watchdog (==2.0)"] + +[[package]] +name = "mkdocs-autorefs" +version = "1.4.3" +description = "Automatically link across pages in MkDocs." +optional = false +python-versions = ">=3.9" +groups = ["docs"] +files = [ + {file = "mkdocs_autorefs-1.4.3-py3-none-any.whl", hash = "sha256:469d85eb3114801d08e9cc55d102b3ba65917a869b893403b8987b601cf55dc9"}, + {file = "mkdocs_autorefs-1.4.3.tar.gz", hash = "sha256:beee715b254455c4aa93b6ef3c67579c399ca092259cc41b7d9342573ff1fc75"}, +] + +[package.dependencies] +Markdown = ">=3.3" +markupsafe = ">=2.0.1" +mkdocs = ">=1.1" + +[[package]] +name = "mkdocs-exclude" +version = "1.0.2" +description = "A mkdocs plugin that lets you exclude files or trees." 
+optional = false +python-versions = "*" +groups = ["docs"] +files = [ + {file = "mkdocs-exclude-1.0.2.tar.gz", hash = "sha256:ba6fab3c80ddbe3fd31d3e579861fd3124513708271180a5f81846da8c7e2a51"}, +] + +[package.dependencies] +mkdocs = "*" + +[[package]] +name = "mkdocs-get-deps" +version = "0.2.0" +description = "MkDocs extension that lists all dependencies according to a mkdocs.yml file" +optional = false +python-versions = ">=3.8" +groups = ["docs"] +files = [ + {file = "mkdocs_get_deps-0.2.0-py3-none-any.whl", hash = "sha256:2bf11d0b133e77a0dd036abeeb06dec8775e46efa526dc70667d8863eefc6134"}, + {file = "mkdocs_get_deps-0.2.0.tar.gz", hash = "sha256:162b3d129c7fad9b19abfdcb9c1458a651628e4b1dea628ac68790fb3061c60c"}, +] + +[package.dependencies] +importlib-metadata = {version = ">=4.3", markers = "python_version < \"3.10\""} +mergedeep = ">=1.3.4" +platformdirs = ">=2.2.0" +pyyaml = ">=5.1" + +[[package]] +name = "mkdocs-material" +version = "9.7.1" +description = "Documentation that simply works" +optional = false +python-versions = ">=3.8" +groups = ["docs"] +files = [ + {file = "mkdocs_material-9.7.1-py3-none-any.whl", hash = "sha256:3f6100937d7d731f87f1e3e3b021c97f7239666b9ba1151ab476cabb96c60d5c"}, + {file = "mkdocs_material-9.7.1.tar.gz", hash = "sha256:89601b8f2c3e6c6ee0a918cc3566cb201d40bf37c3cd3c2067e26fadb8cce2b8"}, +] + +[package.dependencies] +babel = ">=2.10" +backrefs = ">=5.7.post1" +cairosvg = {version = ">=2.6,<3.0", optional = true, markers = "extra == \"imaging\""} +colorama = ">=0.4" +jinja2 = ">=3.1" +markdown = ">=3.2" +mkdocs = ">=1.6" +mkdocs-material-extensions = ">=1.3" +paginate = ">=0.5" +pillow = {version = ">=10.2,<12.0", optional = true, markers = "extra == \"imaging\""} +pygments = ">=2.16" +pymdown-extensions = ">=10.2" +requests = ">=2.30" + +[package.extras] +git = ["mkdocs-git-committers-plugin-2 (>=1.1,<3)", "mkdocs-git-revision-date-localized-plugin (>=1.2.4,<2.0)"] +imaging = ["cairosvg (>=2.6,<3.0)", "pillow (>=10.2,<12.0)"] 
+recommended = ["mkdocs-minify-plugin (>=0.7,<1.0)", "mkdocs-redirects (>=1.2,<2.0)", "mkdocs-rss-plugin (>=1.6,<2.0)"] + +[[package]] +name = "mkdocs-material-extensions" +version = "1.3.1" +description = "Extension pack for Python Markdown and MkDocs Material." +optional = false +python-versions = ">=3.8" +groups = ["docs"] +files = [ + {file = "mkdocs_material_extensions-1.3.1-py3-none-any.whl", hash = "sha256:adff8b62700b25cb77b53358dad940f3ef973dd6db797907c49e3c2ef3ab4e31"}, + {file = "mkdocs_material_extensions-1.3.1.tar.gz", hash = "sha256:10c9511cea88f568257f960358a467d12b970e1f7b2c0e5fb2bb48cab1928443"}, +] + +[[package]] +name = "mkdocstrings" +version = "0.27.0" +description = "Automatic documentation from sources, for MkDocs." +optional = false +python-versions = ">=3.9" +groups = ["docs"] +files = [ + {file = "mkdocstrings-0.27.0-py3-none-any.whl", hash = "sha256:6ceaa7ea830770959b55a16203ac63da24badd71325b96af950e59fd37366332"}, + {file = "mkdocstrings-0.27.0.tar.gz", hash = "sha256:16adca6d6b0a1f9e0c07ff0b02ced8e16f228a9d65a37c063ec4c14d7b76a657"}, +] + +[package.dependencies] +click = ">=7.0" +importlib-metadata = {version = ">=4.6", markers = "python_version < \"3.10\""} +Jinja2 = ">=2.11.1" +Markdown = ">=3.6" +MarkupSafe = ">=1.1" +mkdocs = ">=1.4" +mkdocs-autorefs = ">=1.2" +mkdocstrings-python = {version = ">=0.5.2", optional = true, markers = "extra == \"python\""} +platformdirs = ">=2.2" +pymdown-extensions = ">=6.3" +typing-extensions = {version = ">=4.1", markers = "python_version < \"3.10\""} + +[package.extras] +crystal = ["mkdocstrings-crystal (>=0.3.4)"] +python = ["mkdocstrings-python (>=0.5.2)"] +python-legacy = ["mkdocstrings-python-legacy (>=0.2.1)"] + +[[package]] +name = "mkdocstrings-python" +version = "1.13.0" +description = "A Python handler for mkdocstrings." 
+optional = false +python-versions = ">=3.9" +groups = ["docs"] +files = [ + {file = "mkdocstrings_python-1.13.0-py3-none-any.whl", hash = "sha256:b88bbb207bab4086434743849f8e796788b373bd32e7bfefbf8560ac45d88f97"}, + {file = "mkdocstrings_python-1.13.0.tar.gz", hash = "sha256:2dbd5757e8375b9720e81db16f52f1856bf59905428fd7ef88005d1370e2f64c"}, +] + +[package.dependencies] +griffe = ">=0.49" +mkdocs-autorefs = ">=1.2" +mkdocstrings = ">=0.26" + [[package]] name = "mypy" version = "1.18.2" @@ -736,31 +1407,172 @@ version = "24.2" description = "Core utilities for Python packages" optional = false python-versions = ">=3.8" -groups = ["dev"] +groups = ["dev", "docs"] files = [ {file = "packaging-24.2-py3-none-any.whl", hash = "sha256:09abb1bccd265c01f4a3aa3f7a7db064b36514d2cba19a2f694fe6150451a759"}, {file = "packaging-24.2.tar.gz", hash = "sha256:c228a6dc5e932d346bc5739379109d49e8853dd8223571c7c5b55260edc0b97f"}, ] +[[package]] +name = "paginate" +version = "0.5.7" +description = "Divides large result sets into pages for easier browsing" +optional = false +python-versions = "*" +groups = ["docs"] +files = [ + {file = "paginate-0.5.7-py2.py3-none-any.whl", hash = "sha256:b885e2af73abcf01d9559fd5216b57ef722f8c42affbb63942377668e35c7591"}, + {file = "paginate-0.5.7.tar.gz", hash = "sha256:22bd083ab41e1a8b4f3690544afb2c60c25e5c9a63a30fa2f483f6c60c8e5945"}, +] + +[package.extras] +dev = ["pytest", "tox"] +lint = ["black"] + [[package]] name = "pathspec" version = "0.12.1" description = "Utility library for gitignore style pattern matching of file paths." 
optional = false python-versions = ">=3.8" -groups = ["dev"] +groups = ["dev", "docs"] files = [ {file = "pathspec-0.12.1-py3-none-any.whl", hash = "sha256:a0d503e138a4c123b27490a4f7beda6a01c6f288df0e4a8b79c7eb0dc7b4cc08"}, {file = "pathspec-0.12.1.tar.gz", hash = "sha256:a482d51503a1ab33b1c67a6c3813a26953dbdc71c31dacaef9a838c4e29f5712"}, ] +[[package]] +name = "pillow" +version = "11.3.0" +description = "Python Imaging Library (Fork)" +optional = false +python-versions = ">=3.9" +groups = ["docs"] +files = [ + {file = "pillow-11.3.0-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:1b9c17fd4ace828b3003dfd1e30bff24863e0eb59b535e8f80194d9cc7ecf860"}, + {file = "pillow-11.3.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:65dc69160114cdd0ca0f35cb434633c75e8e7fad4cf855177a05bf38678f73ad"}, + {file = "pillow-11.3.0-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7107195ddc914f656c7fc8e4a5e1c25f32e9236ea3ea860f257b0436011fddd0"}, + {file = "pillow-11.3.0-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:cc3e831b563b3114baac7ec2ee86819eb03caa1a2cef0b481a5675b59c4fe23b"}, + {file = "pillow-11.3.0-cp310-cp310-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f1f182ebd2303acf8c380a54f615ec883322593320a9b00438eb842c1f37ae50"}, + {file = "pillow-11.3.0-cp310-cp310-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4445fa62e15936a028672fd48c4c11a66d641d2c05726c7ec1f8ba6a572036ae"}, + {file = "pillow-11.3.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:71f511f6b3b91dd543282477be45a033e4845a40278fa8dcdbfdb07109bf18f9"}, + {file = "pillow-11.3.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:040a5b691b0713e1f6cbe222e0f4f74cd233421e105850ae3b3c0ceda520f42e"}, + {file = "pillow-11.3.0-cp310-cp310-win32.whl", hash = "sha256:89bd777bc6624fe4115e9fac3352c79ed60f3bb18651420635f26e643e3dd1f6"}, + {file = "pillow-11.3.0-cp310-cp310-win_amd64.whl", hash = 
"sha256:19d2ff547c75b8e3ff46f4d9ef969a06c30ab2d4263a9e287733aa8b2429ce8f"}, + {file = "pillow-11.3.0-cp310-cp310-win_arm64.whl", hash = "sha256:819931d25e57b513242859ce1876c58c59dc31587847bf74cfe06b2e0cb22d2f"}, + {file = "pillow-11.3.0-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:1cd110edf822773368b396281a2293aeb91c90a2db00d78ea43e7e861631b722"}, + {file = "pillow-11.3.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:9c412fddd1b77a75aa904615ebaa6001f169b26fd467b4be93aded278266b288"}, + {file = "pillow-11.3.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7d1aa4de119a0ecac0a34a9c8bde33f34022e2e8f99104e47a3ca392fd60e37d"}, + {file = "pillow-11.3.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:91da1d88226663594e3f6b4b8c3c8d85bd504117d043740a8e0ec449087cc494"}, + {file = "pillow-11.3.0-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:643f189248837533073c405ec2f0bb250ba54598cf80e8c1e043381a60632f58"}, + {file = "pillow-11.3.0-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:106064daa23a745510dabce1d84f29137a37224831d88eb4ce94bb187b1d7e5f"}, + {file = "pillow-11.3.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:cd8ff254faf15591e724dc7c4ddb6bf4793efcbe13802a4ae3e863cd300b493e"}, + {file = "pillow-11.3.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:932c754c2d51ad2b2271fd01c3d121daaa35e27efae2a616f77bf164bc0b3e94"}, + {file = "pillow-11.3.0-cp311-cp311-win32.whl", hash = "sha256:b4b8f3efc8d530a1544e5962bd6b403d5f7fe8b9e08227c6b255f98ad82b4ba0"}, + {file = "pillow-11.3.0-cp311-cp311-win_amd64.whl", hash = "sha256:1a992e86b0dd7aeb1f053cd506508c0999d710a8f07b4c791c63843fc6a807ac"}, + {file = "pillow-11.3.0-cp311-cp311-win_arm64.whl", hash = "sha256:30807c931ff7c095620fe04448e2c2fc673fcbb1ffe2a7da3fb39613489b1ddd"}, + {file = "pillow-11.3.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = 
"sha256:fdae223722da47b024b867c1ea0be64e0df702c5e0a60e27daad39bf960dd1e4"}, + {file = "pillow-11.3.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:921bd305b10e82b4d1f5e802b6850677f965d8394203d182f078873851dada69"}, + {file = "pillow-11.3.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:eb76541cba2f958032d79d143b98a3a6b3ea87f0959bbe256c0b5e416599fd5d"}, + {file = "pillow-11.3.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:67172f2944ebba3d4a7b54f2e95c786a3a50c21b88456329314caaa28cda70f6"}, + {file = "pillow-11.3.0-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:97f07ed9f56a3b9b5f49d3661dc9607484e85c67e27f3e8be2c7d28ca032fec7"}, + {file = "pillow-11.3.0-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:676b2815362456b5b3216b4fd5bd89d362100dc6f4945154ff172e206a22c024"}, + {file = "pillow-11.3.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3e184b2f26ff146363dd07bde8b711833d7b0202e27d13540bfe2e35a323a809"}, + {file = "pillow-11.3.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:6be31e3fc9a621e071bc17bb7de63b85cbe0bfae91bb0363c893cbe67247780d"}, + {file = "pillow-11.3.0-cp312-cp312-win32.whl", hash = "sha256:7b161756381f0918e05e7cb8a371fff367e807770f8fe92ecb20d905d0e1c149"}, + {file = "pillow-11.3.0-cp312-cp312-win_amd64.whl", hash = "sha256:a6444696fce635783440b7f7a9fc24b3ad10a9ea3f0ab66c5905be1c19ccf17d"}, + {file = "pillow-11.3.0-cp312-cp312-win_arm64.whl", hash = "sha256:2aceea54f957dd4448264f9bf40875da0415c83eb85f55069d89c0ed436e3542"}, + {file = "pillow-11.3.0-cp313-cp313-ios_13_0_arm64_iphoneos.whl", hash = "sha256:1c627742b539bba4309df89171356fcb3cc5a9178355b2727d1b74a6cf155fbd"}, + {file = "pillow-11.3.0-cp313-cp313-ios_13_0_arm64_iphonesimulator.whl", hash = "sha256:30b7c02f3899d10f13d7a48163c8969e4e653f8b43416d23d13d1bbfdc93b9f8"}, + {file = "pillow-11.3.0-cp313-cp313-ios_13_0_x86_64_iphonesimulator.whl", hash = 
"sha256:7859a4cc7c9295f5838015d8cc0a9c215b77e43d07a25e460f35cf516df8626f"}, + {file = "pillow-11.3.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:ec1ee50470b0d050984394423d96325b744d55c701a439d2bd66089bff963d3c"}, + {file = "pillow-11.3.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:7db51d222548ccfd274e4572fdbf3e810a5e66b00608862f947b163e613b67dd"}, + {file = "pillow-11.3.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:2d6fcc902a24ac74495df63faad1884282239265c6839a0a6416d33faedfae7e"}, + {file = "pillow-11.3.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:f0f5d8f4a08090c6d6d578351a2b91acf519a54986c055af27e7a93feae6d3f1"}, + {file = "pillow-11.3.0-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c37d8ba9411d6003bba9e518db0db0c58a680ab9fe5179f040b0463644bc9805"}, + {file = "pillow-11.3.0-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:13f87d581e71d9189ab21fe0efb5a23e9f28552d5be6979e84001d3b8505abe8"}, + {file = "pillow-11.3.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:023f6d2d11784a465f09fd09a34b150ea4672e85fb3d05931d89f373ab14abb2"}, + {file = "pillow-11.3.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:45dfc51ac5975b938e9809451c51734124e73b04d0f0ac621649821a63852e7b"}, + {file = "pillow-11.3.0-cp313-cp313-win32.whl", hash = "sha256:a4d336baed65d50d37b88ca5b60c0fa9d81e3a87d4a7930d3880d1624d5b31f3"}, + {file = "pillow-11.3.0-cp313-cp313-win_amd64.whl", hash = "sha256:0bce5c4fd0921f99d2e858dc4d4d64193407e1b99478bc5cacecba2311abde51"}, + {file = "pillow-11.3.0-cp313-cp313-win_arm64.whl", hash = "sha256:1904e1264881f682f02b7f8167935cce37bc97db457f8e7849dc3a6a52b99580"}, + {file = "pillow-11.3.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:4c834a3921375c48ee6b9624061076bc0a32a60b5532b322cc0ea64e639dd50e"}, + {file = "pillow-11.3.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = 
"sha256:5e05688ccef30ea69b9317a9ead994b93975104a677a36a8ed8106be9260aa6d"}, + {file = "pillow-11.3.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:1019b04af07fc0163e2810167918cb5add8d74674b6267616021ab558dc98ced"}, + {file = "pillow-11.3.0-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:f944255db153ebb2b19c51fe85dd99ef0ce494123f21b9db4877ffdfc5590c7c"}, + {file = "pillow-11.3.0-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1f85acb69adf2aaee8b7da124efebbdb959a104db34d3a2cb0f3793dbae422a8"}, + {file = "pillow-11.3.0-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:05f6ecbeff5005399bb48d198f098a9b4b6bdf27b8487c7f38ca16eeb070cd59"}, + {file = "pillow-11.3.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:a7bc6e6fd0395bc052f16b1a8670859964dbd7003bd0af2ff08342eb6e442cfe"}, + {file = "pillow-11.3.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:83e1b0161c9d148125083a35c1c5a89db5b7054834fd4387499e06552035236c"}, + {file = "pillow-11.3.0-cp313-cp313t-win32.whl", hash = "sha256:2a3117c06b8fb646639dce83694f2f9eac405472713fcb1ae887469c0d4f6788"}, + {file = "pillow-11.3.0-cp313-cp313t-win_amd64.whl", hash = "sha256:857844335c95bea93fb39e0fa2726b4d9d758850b34075a7e3ff4f4fa3aa3b31"}, + {file = "pillow-11.3.0-cp313-cp313t-win_arm64.whl", hash = "sha256:8797edc41f3e8536ae4b10897ee2f637235c94f27404cac7297f7b607dd0716e"}, + {file = "pillow-11.3.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:d9da3df5f9ea2a89b81bb6087177fb1f4d1c7146d583a3fe5c672c0d94e55e12"}, + {file = "pillow-11.3.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:0b275ff9b04df7b640c59ec5a3cb113eefd3795a8df80bac69646ef699c6981a"}, + {file = "pillow-11.3.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:0743841cabd3dba6a83f38a92672cccbd69af56e3e91777b0ee7f4dba4385632"}, + {file = 
"pillow-11.3.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:2465a69cf967b8b49ee1b96d76718cd98c4e925414ead59fdf75cf0fd07df673"}, + {file = "pillow-11.3.0-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:41742638139424703b4d01665b807c6468e23e699e8e90cffefe291c5832b027"}, + {file = "pillow-11.3.0-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:93efb0b4de7e340d99057415c749175e24c8864302369e05914682ba642e5d77"}, + {file = "pillow-11.3.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:7966e38dcd0fa11ca390aed7c6f20454443581d758242023cf36fcb319b1a874"}, + {file = "pillow-11.3.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:98a9afa7b9007c67ed84c57c9e0ad86a6000da96eaa638e4f8abe5b65ff83f0a"}, + {file = "pillow-11.3.0-cp314-cp314-win32.whl", hash = "sha256:02a723e6bf909e7cea0dac1b0e0310be9d7650cd66222a5f1c571455c0a45214"}, + {file = "pillow-11.3.0-cp314-cp314-win_amd64.whl", hash = "sha256:a418486160228f64dd9e9efcd132679b7a02a5f22c982c78b6fc7dab3fefb635"}, + {file = "pillow-11.3.0-cp314-cp314-win_arm64.whl", hash = "sha256:155658efb5e044669c08896c0c44231c5e9abcaadbc5cd3648df2f7c0b96b9a6"}, + {file = "pillow-11.3.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:59a03cdf019efbfeeed910bf79c7c93255c3d54bc45898ac2a4140071b02b4ae"}, + {file = "pillow-11.3.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:f8a5827f84d973d8636e9dc5764af4f0cf2318d26744b3d902931701b0d46653"}, + {file = "pillow-11.3.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:ee92f2fd10f4adc4b43d07ec5e779932b4eb3dbfbc34790ada5a6669bc095aa6"}, + {file = "pillow-11.3.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c96d333dcf42d01f47b37e0979b6bd73ec91eae18614864622d9b87bbd5bbf36"}, + {file = "pillow-11.3.0-cp314-cp314t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = 
"sha256:4c96f993ab8c98460cd0c001447bff6194403e8b1d7e149ade5f00594918128b"}, + {file = "pillow-11.3.0-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:41342b64afeba938edb034d122b2dda5db2139b9a4af999729ba8818e0056477"}, + {file = "pillow-11.3.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:068d9c39a2d1b358eb9f245ce7ab1b5c3246c7c8c7d9ba58cfa5b43146c06e50"}, + {file = "pillow-11.3.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:a1bc6ba083b145187f648b667e05a2534ecc4b9f2784c2cbe3089e44868f2b9b"}, + {file = "pillow-11.3.0-cp314-cp314t-win32.whl", hash = "sha256:118ca10c0d60b06d006be10a501fd6bbdfef559251ed31b794668ed569c87e12"}, + {file = "pillow-11.3.0-cp314-cp314t-win_amd64.whl", hash = "sha256:8924748b688aa210d79883357d102cd64690e56b923a186f35a82cbc10f997db"}, + {file = "pillow-11.3.0-cp314-cp314t-win_arm64.whl", hash = "sha256:79ea0d14d3ebad43ec77ad5272e6ff9bba5b679ef73375ea760261207fa8e0aa"}, + {file = "pillow-11.3.0-cp39-cp39-macosx_10_10_x86_64.whl", hash = "sha256:48d254f8a4c776de343051023eb61ffe818299eeac478da55227d96e241de53f"}, + {file = "pillow-11.3.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:7aee118e30a4cf54fdd873bd3a29de51e29105ab11f9aad8c32123f58c8f8081"}, + {file = "pillow-11.3.0-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:23cff760a9049c502721bdb743a7cb3e03365fafcdfc2ef9784610714166e5a4"}, + {file = "pillow-11.3.0-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:6359a3bc43f57d5b375d1ad54a0074318a0844d11b76abccf478c37c986d3cfc"}, + {file = "pillow-11.3.0-cp39-cp39-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:092c80c76635f5ecb10f3f83d76716165c96f5229addbd1ec2bdbbda7d496e06"}, + {file = "pillow-11.3.0-cp39-cp39-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:cadc9e0ea0a2431124cde7e1697106471fc4c1da01530e679b2391c37d3fbb3a"}, + {file = "pillow-11.3.0-cp39-cp39-musllinux_1_2_aarch64.whl", hash = 
"sha256:6a418691000f2a418c9135a7cf0d797c1bb7d9a485e61fe8e7722845b95ef978"}, + {file = "pillow-11.3.0-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:97afb3a00b65cc0804d1c7abddbf090a81eaac02768af58cbdcaaa0a931e0b6d"}, + {file = "pillow-11.3.0-cp39-cp39-win32.whl", hash = "sha256:ea944117a7974ae78059fcc1800e5d3295172bb97035c0c1d9345fca1419da71"}, + {file = "pillow-11.3.0-cp39-cp39-win_amd64.whl", hash = "sha256:e5c5858ad8ec655450a7c7df532e9842cf8df7cc349df7225c60d5d348c8aada"}, + {file = "pillow-11.3.0-cp39-cp39-win_arm64.whl", hash = "sha256:6abdbfd3aea42be05702a8dd98832329c167ee84400a1d1f61ab11437f1717eb"}, + {file = "pillow-11.3.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:3cee80663f29e3843b68199b9d6f4f54bd1d4a6b59bdd91bceefc51238bcb967"}, + {file = "pillow-11.3.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:b5f56c3f344f2ccaf0dd875d3e180f631dc60a51b314295a3e681fe8cf851fbe"}, + {file = "pillow-11.3.0-pp310-pypy310_pp73-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:e67d793d180c9df62f1f40aee3accca4829d3794c95098887edc18af4b8b780c"}, + {file = "pillow-11.3.0-pp310-pypy310_pp73-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:d000f46e2917c705e9fb93a3606ee4a819d1e3aa7a9b442f6444f07e77cf5e25"}, + {file = "pillow-11.3.0-pp310-pypy310_pp73-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:527b37216b6ac3a12d7838dc3bd75208ec57c1c6d11ef01902266a5a0c14fc27"}, + {file = "pillow-11.3.0-pp310-pypy310_pp73-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:be5463ac478b623b9dd3937afd7fb7ab3d79dd290a28e2b6df292dc75063eb8a"}, + {file = "pillow-11.3.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:8dc70ca24c110503e16918a658b869019126ecfe03109b754c402daff12b3d9f"}, + {file = "pillow-11.3.0-pp311-pypy311_pp73-macosx_10_15_x86_64.whl", hash = "sha256:7c8ec7a017ad1bd562f93dbd8505763e688d388cde6e4a010ae1486916e713e6"}, + {file = "pillow-11.3.0-pp311-pypy311_pp73-macosx_11_0_arm64.whl", 
hash = "sha256:9ab6ae226de48019caa8074894544af5b53a117ccb9d3b3dcb2871464c829438"}, + {file = "pillow-11.3.0-pp311-pypy311_pp73-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:fe27fb049cdcca11f11a7bfda64043c37b30e6b91f10cb5bab275806c32f6ab3"}, + {file = "pillow-11.3.0-pp311-pypy311_pp73-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:465b9e8844e3c3519a983d58b80be3f668e2a7a5db97f2784e7079fbc9f9822c"}, + {file = "pillow-11.3.0-pp311-pypy311_pp73-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5418b53c0d59b3824d05e029669efa023bbef0f3e92e75ec8428f3799487f361"}, + {file = "pillow-11.3.0-pp311-pypy311_pp73-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:504b6f59505f08ae014f724b6207ff6222662aab5cc9542577fb084ed0676ac7"}, + {file = "pillow-11.3.0-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:c84d689db21a1c397d001aa08241044aa2069e7587b398c8cc63020390b1c1b8"}, + {file = "pillow-11.3.0.tar.gz", hash = "sha256:3828ee7586cd0b2091b6209e5ad53e20d0649bbe87164a459d0676e035e8f523"}, +] + +[package.extras] +docs = ["furo", "olefile", "sphinx (>=8.2)", "sphinx-autobuild", "sphinx-copybutton", "sphinx-inline-tabs", "sphinxext-opengraph"] +fpx = ["olefile"] +mic = ["olefile"] +test-arrow = ["pyarrow"] +tests = ["check-manifest", "coverage (>=7.4.2)", "defusedxml", "markdown2", "olefile", "packaging", "pyroma", "pytest", "pytest-cov", "pytest-timeout", "pytest-xdist", "trove-classifiers (>=2024.10.12)"] +typing = ["typing-extensions ; python_version < \"3.10\""] +xmp = ["defusedxml"] + [[package]] name = "platformdirs" version = "4.3.6" description = "A small Python package for determining appropriate platform-specific dirs, e.g. a `user data dir`." 
optional = false python-versions = ">=3.8" -groups = ["dev"] +groups = ["dev", "docs"] files = [ {file = "platformdirs-4.3.6-py3-none-any.whl", hash = "sha256:73e575e1408ab8103900836b97580d5307456908a03e92031bab39e4554cc3fb"}, {file = "platformdirs-4.3.6.tar.gz", hash = "sha256:357fb2acbc885b0419afd3ce3ed34564c13c9b95c89360cd9563f73aa5e2b907"}, @@ -901,6 +1713,68 @@ files = [ [package.dependencies] pyasn1 = ">=0.4.6,<0.7.0" +[[package]] +name = "pycparser" +version = "2.23" +description = "C parser in Python" +optional = false +python-versions = ">=3.8" +groups = ["docs"] +markers = "implementation_name != \"PyPy\"" +files = [ + {file = "pycparser-2.23-py3-none-any.whl", hash = "sha256:e5c6e8d3fbad53479cab09ac03729e0a9faf2bee3db8208a550daf5af81a5934"}, + {file = "pycparser-2.23.tar.gz", hash = "sha256:78816d4f24add8f10a06d6f05b4d424ad9e96cfebf68a4ddc99c65c0720d00c2"}, +] + +[[package]] +name = "pygments" +version = "2.19.2" +description = "Pygments is a syntax highlighting package written in Python." +optional = false +python-versions = ">=3.8" +groups = ["docs"] +files = [ + {file = "pygments-2.19.2-py3-none-any.whl", hash = "sha256:86540386c03d588bb81d44bc3928634ff26449851e99741617ecb9037ee5ec0b"}, + {file = "pygments-2.19.2.tar.gz", hash = "sha256:636cb2477cec7f8952536970bc533bc43743542f70392ae026374600add5b887"}, +] + +[package.extras] +windows-terminal = ["colorama (>=0.4.6)"] + +[[package]] +name = "pymdown-extensions" +version = "10.20" +description = "Extension pack for Python Markdown." 
+optional = false +python-versions = ">=3.9" +groups = ["docs"] +files = [ + {file = "pymdown_extensions-10.20-py3-none-any.whl", hash = "sha256:ea9e62add865da80a271d00bfa1c0fa085b20d133fb3fc97afdc88e682f60b2f"}, + {file = "pymdown_extensions-10.20.tar.gz", hash = "sha256:5c73566ab0cf38c6ba084cb7c5ea64a119ae0500cce754ccb682761dfea13a52"}, +] + +[package.dependencies] +markdown = ">=3.6" +pyyaml = "*" + +[package.extras] +extra = ["pygments (>=2.19.1)"] + +[[package]] +name = "pyparsing" +version = "3.3.1" +description = "pyparsing - Classes and methods to define and execute parsing grammars" +optional = false +python-versions = ">=3.9" +groups = ["docs"] +files = [ + {file = "pyparsing-3.3.1-py3-none-any.whl", hash = "sha256:023b5e7e5520ad96642e2c6db4cb683d3970bd640cdf7115049a6e9c3682df82"}, + {file = "pyparsing-3.3.1.tar.gz", hash = "sha256:47fad0f17ac1e2cad3de3b458570fbc9b03560aa029ed5e16ee5554da9a2251c"}, +] + +[package.extras] +diagrams = ["jinja2", "railroad-diagrams"] + [[package]] name = "pytest" version = "7.4.4" @@ -943,13 +1817,28 @@ pytest = ">=4.6" [package.extras] testing = ["fields", "hunter", "process-tests", "pytest-xdist", "six", "virtualenv"] +[[package]] +name = "python-dateutil" +version = "2.9.0.post0" +description = "Extensions to the standard Python datetime module" +optional = false +python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7" +groups = ["docs"] +files = [ + {file = "python-dateutil-2.9.0.post0.tar.gz", hash = "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3"}, + {file = "python_dateutil-2.9.0.post0-py2.py3-none-any.whl", hash = "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427"}, +] + +[package.dependencies] +six = ">=1.5" + [[package]] name = "pyyaml" version = "6.0.2" description = "YAML parser and emitter for Python" optional = false python-versions = ">=3.8" -groups = ["dev"] +groups = ["dev", "docs"] files = [ {file = "PyYAML-6.0.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = 
"sha256:0a9a2848a5b7feac301353437eb7d5957887edbf81d56e903999a75a3d743086"}, {file = "PyYAML-6.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:29717114e51c84ddfba879543fb232a6ed60086602313ca38cce623c1d62cfbf"}, @@ -1006,13 +1895,28 @@ files = [ {file = "pyyaml-6.0.2.tar.gz", hash = "sha256:d584d9ec91ad65861cc08d42e834324ef890a082e591037abe114850ff7bbc3e"}, ] +[[package]] +name = "pyyaml-env-tag" +version = "1.1" +description = "A custom YAML tag for referencing environment variables in YAML files." +optional = false +python-versions = ">=3.9" +groups = ["docs"] +files = [ + {file = "pyyaml_env_tag-1.1-py3-none-any.whl", hash = "sha256:17109e1a528561e32f026364712fee1264bc2ea6715120891174ed1b980d2e04"}, + {file = "pyyaml_env_tag-1.1.tar.gz", hash = "sha256:2eb38b75a2d21ee0475d6d97ec19c63287a7e140231e4214969d0eac923cd7ff"}, +] + +[package.dependencies] +pyyaml = "*" + [[package]] name = "requests" version = "2.32.3" description = "Python HTTP for Humans." optional = false python-versions = ">=3.8" -groups = ["main"] +groups = ["main", "docs"] files = [ {file = "requests-2.32.3-py3-none-any.whl", hash = "sha256:70761cfe03c773ceb22aa2f671b4757976145175cdfca038c02654d061d6dcc6"}, {file = "requests-2.32.3.tar.gz", hash = "sha256:55365417734eb18255590a9ff9eb97e9e1da868d4ccd6402399eaf68af20a760"}, @@ -1091,6 +1995,58 @@ enabler = ["pytest-enabler (>=2.2)"] test = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "ini2toml[lite] (>=0.14)", "jaraco.develop (>=7.21) ; python_version >= \"3.9\" and sys_platform != \"cygwin\"", "jaraco.envs (>=2.2)", "jaraco.path (>=3.7.2)", "jaraco.test (>=5.5)", "packaging (>=24.2)", "pip (>=19.1)", "pyproject-hooks (!=1.1)", "pytest (>=6,!=8.1.*)", "pytest-home (>=0.5)", "pytest-perf ; sys_platform != \"cygwin\"", "pytest-subprocess", "pytest-timeout", "pytest-xdist (>=3)", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel (>=0.44.0)"] type = ["importlib_metadata (>=7.0.2) ; python_version < \"3.10\"", "jaraco.develop (>=7.21) ; 
sys_platform != \"cygwin\"", "mypy (==1.14.*)", "pytest-mypy"] +[[package]] +name = "six" +version = "1.17.0" +description = "Python 2 and 3 compatibility utilities" +optional = false +python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7" +groups = ["docs"] +files = [ + {file = "six-1.17.0-py2.py3-none-any.whl", hash = "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274"}, + {file = "six-1.17.0.tar.gz", hash = "sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81"}, +] + +[[package]] +name = "tinycss2" +version = "1.4.0" +description = "A tiny CSS parser" +optional = false +python-versions = ">=3.8" +groups = ["docs"] +markers = "python_version < \"3.13\"" +files = [ + {file = "tinycss2-1.4.0-py3-none-any.whl", hash = "sha256:3a49cf47b7675da0b15d0c6e1df8df4ebd96e9394bb905a5775adb0d884c5289"}, + {file = "tinycss2-1.4.0.tar.gz", hash = "sha256:10c0972f6fc0fbee87c3edb76549357415e94548c1ae10ebccdea16fb404a9b7"}, +] + +[package.dependencies] +webencodings = ">=0.4" + +[package.extras] +doc = ["sphinx", "sphinx_rtd_theme"] +test = ["pytest", "ruff"] + +[[package]] +name = "tinycss2" +version = "1.5.1" +description = "A tiny CSS parser" +optional = false +python-versions = ">=3.10" +groups = ["docs"] +markers = "python_version >= \"3.13\"" +files = [ + {file = "tinycss2-1.5.1-py3-none-any.whl", hash = "sha256:3415ba0f5839c062696996998176c4a3751d18b7edaaeeb658c9ce21ec150661"}, + {file = "tinycss2-1.5.1.tar.gz", hash = "sha256:d339d2b616ba90ccce58da8495a78f46e55d4d25f9fd71dfd526f07e7d53f957"}, +] + +[package.dependencies] +webencodings = ">=0.4" + +[package.extras] +doc = ["furo", "sphinx"] +test = ["pytest", "ruff"] + [[package]] name = "tomli" version = "2.2.1" @@ -1164,11 +2120,12 @@ version = "4.12.2" description = "Backported and Experimental Type Hints for Python 3.8+" optional = false python-versions = ">=3.8" -groups = ["main", "dev"] +groups = ["main", "dev", "docs"] files = [ {file = "typing_extensions-4.12.2-py3-none-any.whl", 
hash = "sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d"}, {file = "typing_extensions-4.12.2.tar.gz", hash = "sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8"}, ] +markers = {docs = "python_version == \"3.9\""} [[package]] name = "urllib3" @@ -1176,7 +2133,7 @@ version = "2.3.0" description = "HTTP library with thread-safe connection pooling, file post, and more." optional = false python-versions = ">=3.9" -groups = ["main"] +groups = ["main", "docs"] files = [ {file = "urllib3-2.3.0-py3-none-any.whl", hash = "sha256:1cee9ad369867bfdbbb48b7dd50374c0967a0bb7710050facf0dd6911440e3df"}, {file = "urllib3-2.3.0.tar.gz", hash = "sha256:f8c5449b3cf0861679ce7e0503c7b44b5ec981bec0d1d3795a07f1ba96f0204d"}, @@ -1240,6 +2197,21 @@ dev = ["Cython (>=3.0,<4.0)", "setuptools (>=60)"] docs = ["Sphinx (>=4.1.2,<4.2.0)", "sphinx-rtd-theme (>=0.5.2,<0.6.0)", "sphinxcontrib-asyncio (>=0.3.0,<0.4.0)"] test = ["aiohttp (>=3.10.5)", "flake8 (>=5.0,<6.0)", "mypy (>=0.800)", "psutil", "pyOpenSSL (>=23.0.0,<23.1.0)", "pycodestyle (>=2.9.0,<2.10.0)"] +[[package]] +name = "verspec" +version = "0.1.0" +description = "Flexible version handling" +optional = false +python-versions = "*" +groups = ["docs"] +files = [ + {file = "verspec-0.1.0-py3-none-any.whl", hash = "sha256:741877d5633cc9464c45a469ae2a31e801e6dbbaa85b9675d481cda100f11c31"}, + {file = "verspec-0.1.0.tar.gz", hash = "sha256:c4504ca697b2056cdb4bfa7121461f5a0e81809255b41c03dda4ba823637c01e"}, +] + +[package.extras] +test = ["coverage", "flake8 (>=3.7)", "mypy", "pretend", "pytest"] + [[package]] name = "virtualenv" version = "20.29.1" @@ -1261,7 +2233,82 @@ platformdirs = ">=3.9.1,<5" docs = ["furo (>=2023.7.26)", "proselint (>=0.13)", "sphinx (>=7.1.2,!=7.3)", "sphinx-argparse (>=0.4)", "sphinxcontrib-towncrier (>=0.2.1a0)", "towncrier (>=23.6)"] test = ["covdefaults (>=2.3)", "coverage (>=7.2.7)", "coverage-enable-subprocess (>=1)", "flaky (>=3.7)", "packaging (>=23.1)", "pytest 
(>=7.4)", "pytest-env (>=0.8.2)", "pytest-freezer (>=0.4.8) ; platform_python_implementation == \"PyPy\" or platform_python_implementation == \"CPython\" and sys_platform == \"win32\" and python_version >= \"3.13\"", "pytest-mock (>=3.11.1)", "pytest-randomly (>=3.12)", "pytest-timeout (>=2.1)", "setuptools (>=68)", "time-machine (>=2.10) ; platform_python_implementation == \"CPython\""] +[[package]] +name = "watchdog" +version = "6.0.0" +description = "Filesystem events monitoring" +optional = false +python-versions = ">=3.9" +groups = ["docs"] +files = [ + {file = "watchdog-6.0.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:d1cdb490583ebd691c012b3d6dae011000fe42edb7a82ece80965b42abd61f26"}, + {file = "watchdog-6.0.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:bc64ab3bdb6a04d69d4023b29422170b74681784ffb9463ed4870cf2f3e66112"}, + {file = "watchdog-6.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:c897ac1b55c5a1461e16dae288d22bb2e412ba9807df8397a635d88f671d36c3"}, + {file = "watchdog-6.0.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:6eb11feb5a0d452ee41f824e271ca311a09e250441c262ca2fd7ebcf2461a06c"}, + {file = "watchdog-6.0.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:ef810fbf7b781a5a593894e4f439773830bdecb885e6880d957d5b9382a960d2"}, + {file = "watchdog-6.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:afd0fe1b2270917c5e23c2a65ce50c2a4abb63daafb0d419fde368e272a76b7c"}, + {file = "watchdog-6.0.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:bdd4e6f14b8b18c334febb9c4425a878a2ac20efd1e0b231978e7b150f92a948"}, + {file = "watchdog-6.0.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:c7c15dda13c4eb00d6fb6fc508b3c0ed88b9d5d374056b239c4ad1611125c860"}, + {file = "watchdog-6.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:6f10cb2d5902447c7d0da897e2c6768bca89174d0c6e1e30abec5421af97a5b0"}, + {file = "watchdog-6.0.0-cp313-cp313-macosx_10_13_universal2.whl", hash = 
"sha256:490ab2ef84f11129844c23fb14ecf30ef3d8a6abafd3754a6f75ca1e6654136c"}, + {file = "watchdog-6.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:76aae96b00ae814b181bb25b1b98076d5fc84e8a53cd8885a318b42b6d3a5134"}, + {file = "watchdog-6.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a175f755fc2279e0b7312c0035d52e27211a5bc39719dd529625b1930917345b"}, + {file = "watchdog-6.0.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:e6f0e77c9417e7cd62af82529b10563db3423625c5fce018430b249bf977f9e8"}, + {file = "watchdog-6.0.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:90c8e78f3b94014f7aaae121e6b909674df5b46ec24d6bebc45c44c56729af2a"}, + {file = "watchdog-6.0.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:e7631a77ffb1f7d2eefa4445ebbee491c720a5661ddf6df3498ebecae5ed375c"}, + {file = "watchdog-6.0.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:c7ac31a19f4545dd92fc25d200694098f42c9a8e391bc00bdd362c5736dbf881"}, + {file = "watchdog-6.0.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:9513f27a1a582d9808cf21a07dae516f0fab1cf2d7683a742c498b93eedabb11"}, + {file = "watchdog-6.0.0-pp39-pypy39_pp73-macosx_10_15_x86_64.whl", hash = "sha256:7a0e56874cfbc4b9b05c60c8a1926fedf56324bb08cfbc188969777940aef3aa"}, + {file = "watchdog-6.0.0-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:e6439e374fc012255b4ec786ae3c4bc838cd7309a540e5fe0952d03687d8804e"}, + {file = "watchdog-6.0.0-py3-none-manylinux2014_aarch64.whl", hash = "sha256:7607498efa04a3542ae3e05e64da8202e58159aa1fa4acddf7678d34a35d4f13"}, + {file = "watchdog-6.0.0-py3-none-manylinux2014_armv7l.whl", hash = "sha256:9041567ee8953024c83343288ccc458fd0a2d811d6a0fd68c4c22609e3490379"}, + {file = "watchdog-6.0.0-py3-none-manylinux2014_i686.whl", hash = "sha256:82dc3e3143c7e38ec49d61af98d6558288c415eac98486a5c581726e0737c00e"}, + {file = "watchdog-6.0.0-py3-none-manylinux2014_ppc64.whl", hash = "sha256:212ac9b8bf1161dc91bd09c048048a95ca3a4c4f5e5d4a7d1b1a7d5752a7f96f"}, + {file = 
"watchdog-6.0.0-py3-none-manylinux2014_ppc64le.whl", hash = "sha256:e3df4cbb9a450c6d49318f6d14f4bbc80d763fa587ba46ec86f99f9e6876bb26"}, + {file = "watchdog-6.0.0-py3-none-manylinux2014_s390x.whl", hash = "sha256:2cce7cfc2008eb51feb6aab51251fd79b85d9894e98ba847408f662b3395ca3c"}, + {file = "watchdog-6.0.0-py3-none-manylinux2014_x86_64.whl", hash = "sha256:20ffe5b202af80ab4266dcd3e91aae72bf2da48c0d33bdb15c66658e685e94e2"}, + {file = "watchdog-6.0.0-py3-none-win32.whl", hash = "sha256:07df1fdd701c5d4c8e55ef6cf55b8f0120fe1aef7ef39a1c6fc6bc2e606d517a"}, + {file = "watchdog-6.0.0-py3-none-win_amd64.whl", hash = "sha256:cbafb470cf848d93b5d013e2ecb245d4aa1c8fd0504e863ccefa32445359d680"}, + {file = "watchdog-6.0.0-py3-none-win_ia64.whl", hash = "sha256:a1914259fa9e1454315171103c6a30961236f508b9b623eae470268bbcc6a22f"}, + {file = "watchdog-6.0.0.tar.gz", hash = "sha256:9ddf7c82fda3ae8e24decda1338ede66e1c99883db93711d8fb941eaa2d8c282"}, +] + +[package.extras] +watchmedo = ["PyYAML (>=3.10)"] + +[[package]] +name = "webencodings" +version = "0.5.1" +description = "Character encoding aliases for legacy web content" +optional = false +python-versions = "*" +groups = ["docs"] +files = [ + {file = "webencodings-0.5.1-py2.py3-none-any.whl", hash = "sha256:a0af1213f3c2226497a97e2b3aa01a7e4bee4f403f95be16fc9acd2947514a78"}, + {file = "webencodings-0.5.1.tar.gz", hash = "sha256:b36a1c245f2d304965eb4e0a82848379241dc04b865afcc4aab16748587e1923"}, +] + +[[package]] +name = "zipp" +version = "3.23.0" +description = "Backport of pathlib-compatible object wrapper for zip files" +optional = false +python-versions = ">=3.9" +groups = ["docs"] +files = [ + {file = "zipp-3.23.0-py3-none-any.whl", hash = "sha256:071652d6115ed432f5ce1d34c336c0adfd6a884660d1e9712a256d3d3bd4b14e"}, + {file = "zipp-3.23.0.tar.gz", hash = "sha256:a07157588a12518c9d4034df3fbbee09c814741a33ff63c05fa29d26a2404166"}, +] + +[package.extras] +check = ["pytest-checkdocs (>=2.4)", "pytest-ruff (>=0.2.1) ; sys_platform != 
\"cygwin\""] +cover = ["pytest-cov"] +doc = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-lint"] +enabler = ["pytest-enabler (>=2.2)"] +test = ["big-O", "jaraco.functools", "jaraco.itertools", "jaraco.test", "more_itertools", "pytest (>=6,!=8.1.*)", "pytest-ignore-flaky"] +type = ["pytest-mypy"] + [metadata] lock-version = "2.1" python-versions = ">=3.9, <3.15" -content-hash = "00b1a973942e94a9f9f568cd3e1e7053162cad9fcf7c73d33fd9fda9e2964fc9" +content-hash = "fd5a94a16508f922e5799b33c651472e0b329834d901612afcf40808a2c96cfd" diff --git a/packages/pynumaflow/pynumaflow/accumulator/_dtypes.py b/packages/pynumaflow/pynumaflow/accumulator/_dtypes.py index 52b36a13..f55b0d22 100644 --- a/packages/pynumaflow/pynumaflow/accumulator/_dtypes.py +++ b/packages/pynumaflow/pynumaflow/accumulator/_dtypes.py @@ -26,25 +26,26 @@ class WindowOperation(IntEnum): class Datum: """ Class to define the important information for the event. + Args: keys: the keys of the event. value: the payload of the event. event_time: the event time of the event. watermark: the watermark of the event. - >>> # Example usage - >>> from pynumaflow.accumulator import Datum - >>> from datetime import datetime, timezone - >>> payload = bytes("test_mock_message", encoding="utf-8") - >>> t1 = datetime.fromtimestamp(1662998400, timezone.utc) - >>> t2 = datetime.fromtimestamp(1662998460, timezone.utc) - >>> msg_headers = {"key1": "value1", "key2": "value2"} - >>> d = Datum( - ... keys=["test_key"], - ... value=payload, - ... event_time=t1, - ... watermark=t2, - ... headers=msg_headers - ... 
) + + Example usage + ```py + from pynumaflow.accumulator import Datum + from datetime import datetime, timezone + + d = Datum( + keys=["test_key"], + value=b"test_mock_message", + event_time=datetime.fromtimestamp(1662998400, timezone.utc), + watermark=datetime.fromtimestamp(1662998460, timezone.utc), + headers={"key1": "value1", "key2": "value2"}, + ) + ``` """ __slots__ = ("_keys", "_value", "_event_time", "_watermark", "_headers", "_id") diff --git a/packages/pynumaflow/pynumaflow/accumulator/async_server.py b/packages/pynumaflow/pynumaflow/accumulator/async_server.py index 042359ca..50e08468 100644 --- a/packages/pynumaflow/pynumaflow/accumulator/async_server.py +++ b/packages/pynumaflow/pynumaflow/accumulator/async_server.py @@ -61,6 +61,7 @@ class AccumulatorAsyncServer(NumaflowServer): Class for a new Accumulator Server instance. A new servicer instance is created and attached to the server. The server instance is returned. + Args: accumulator_instance: The accumulator instance to be used for Accumulator UDF @@ -72,92 +73,74 @@ class AccumulatorAsyncServer(NumaflowServer): max_threads: The max number of threads to be spawned; defaults to 4 and max capped at 16 server_info_file: The path to the server info file - Example invocation: - import os - from collections.abc import AsyncIterable - from datetime import datetime - - from pynumaflow.accumulator import Accumulator, AccumulatorAsyncServer - from pynumaflow.accumulator import ( - Message, - Datum, - ) - from pynumaflow.shared.asynciter import NonBlockingIterator - - class StreamSorter(Accumulator): - def __init__(self, counter): - self.latest_wm = datetime.fromtimestamp(-1) - self.sorted_buffer: list[Datum] = [] - - async def handler( - self, - datums: AsyncIterable[Datum], - output: NonBlockingIterator, - ): - async for _ in datums: - # Process the datums and send output - if datum.watermark and datum.watermark > self.latest_wm: - self.latest_wm = datum.watermark - await self.flush_buffer(output) - - 
self.insert_sorted(datum) - - def insert_sorted(self, datum: Datum): - # Binary insert to keep sorted buffer in order - left, right = 0, len(self.sorted_buffer) - while left < right: - mid = (left + right) // 2 - if self.sorted_buffer[mid].event_time > datum.event_time: - right = mid - else: - left = mid + 1 - self.sorted_buffer.insert(left, datum) - - async def flush_buffer(self, output: NonBlockingIterator): - i = 0 - for datum in self.sorted_buffer: - if datum.event_time > self.latest_wm: - break - await output.put(Message.from_datum(datum)) - i += 1 - # Remove flushed items - self.sorted_buffer = self.sorted_buffer[i:] - - - if __name__ == "__main__": - grpc_server = AccumulatorAsyncServer(StreamSorter) - grpc_server.start() + Example invocation: + ```py + import os + from collections.abc import AsyncIterable + from datetime import datetime + + from pynumaflow.accumulator import Accumulator, AccumulatorAsyncServer + from pynumaflow.accumulator import Message, Datum + from pynumaflow.shared.asynciter import NonBlockingIterator + + class StreamSorter(Accumulator): + def __init__(self): + self.latest_wm = datetime.fromtimestamp(-1) + self.sorted_buffer: list[Datum] = [] + + async def handler( + self, + datums: AsyncIterable[Datum], + output: NonBlockingIterator, + ): + async for datum in datums: + # Process the datums and send output + if datum.watermark and datum.watermark > self.latest_wm: + self.latest_wm = datum.watermark + await self.flush_buffer(output) + + self.insert_sorted(datum) + + def insert_sorted(self, datum: Datum): + # Binary insert to keep sorted buffer in order + left, right = 0, len(self.sorted_buffer) + while left < right: + mid = (left + right) // 2 + if self.sorted_buffer[mid].event_time > datum.event_time: + right = mid + else: + left = mid + 1 + self.sorted_buffer.insert(left, datum) + + async def flush_buffer(self, output: NonBlockingIterator): + i = 0 + for datum in self.sorted_buffer: + if datum.event_time > self.latest_wm: + break
+ await output.put(Message.from_datum(datum)) + i += 1 + # Remove flushed items + self.sorted_buffer = self.sorted_buffer[i:] + + + if __name__ == "__main__": + grpc_server = AccumulatorAsyncServer(StreamSorter) + grpc_server.start() + ``` """ def __init__( self, accumulator_instance: AccumulatorStreamCallable, init_args: tuple = (), - init_kwargs: dict = None, + init_kwargs: Optional[dict] = None, sock_path=ACCUMULATOR_SOCK_PATH, max_message_size=MAX_MESSAGE_SIZE, max_threads=NUM_THREADS_DEFAULT, server_info_file=ACCUMULATOR_SERVER_INFO_FILE_PATH, ): - """ - Create a new grpc Accumulator Server instance. - A new servicer instance is created and attached to the server. - The server instance is returned. - Args: - accumulator_instance: The Accumulator instance to be used for - Accumulator UDF - init_args: The arguments to be passed to the accumulator_handler - init_kwargs: The keyword arguments to be passed to the - accumulator_handler - sock_path: The UNIX socket path to be used for the server - max_message_size: The max message size in bytes the server can receive and send - max_threads: The max number of threads to be spawned; - defaults to 4 and max capped at 16 - server_info_file: The path to the server info file - """ - if init_kwargs is None: - init_kwargs = {} + init_kwargs = init_kwargs or {} self.accumulator_handler = get_handler(accumulator_instance, init_args, init_kwargs) self.sock_path = f"unix://{sock_path}" self.max_message_size = max_message_size diff --git a/packages/pynumaflow/pynumaflow/batchmapper/_dtypes.py b/packages/pynumaflow/pynumaflow/batchmapper/_dtypes.py index f8a9fb82..4e4dc270 100644 --- a/packages/pynumaflow/pynumaflow/batchmapper/_dtypes.py +++ b/packages/pynumaflow/pynumaflow/batchmapper/_dtypes.py @@ -19,8 +19,8 @@ class Message: Args: value: data in bytes - keys: []string keys for vertex (optional) - tags: []string tags for conditional forwarding (optional) + keys: list of keys for vertex (optional) + tags: list of tags for 
conditional forwarding (optional) """ __slots__ = ("_value", "_keys", "_tags") @@ -61,6 +61,7 @@ def tags(self) -> list[str]: class Datum: """ Class to define the important information for the event. + Args: keys: the keys of the event. value: the payload of the event. diff --git a/packages/pynumaflow/pynumaflow/batchmapper/async_server.py b/packages/pynumaflow/pynumaflow/batchmapper/async_server.py index 73914bdd..5f1f6c91 100644 --- a/packages/pynumaflow/pynumaflow/batchmapper/async_server.py +++ b/packages/pynumaflow/pynumaflow/batchmapper/async_server.py @@ -39,6 +39,7 @@ def __init__( Create a new grpc Async Batch Map Server instance. A new servicer instance is created and attached to the server. The server instance is returned. + Args: batch_mapper_instance: The batch map stream instance to be used for Batch Map UDF sock_path: The UNIX socket path to be used for the server @@ -47,30 +48,31 @@ def __init__( defaults to 4 and max capped at 16 Example invocation: - class Flatmap(BatchMapper): - async def handler( - self, - datums: AsyncIterable[Datum], - ) -> BatchResponses: - batch_responses = BatchResponses() - async for datum in datums: - val = datum.value - _ = datum.event_time - _ = datum.watermark - strs = val.decode("utf-8").split(",") - batch_response = BatchResponse.from_id(datum.id) - if len(strs) == 0: - batch_response.append(Message.to_drop()) - else: - for s in strs: - batch_response.append(Message(str.encode(s))) - batch_responses.append(batch_response) - - return batch_responses + ```py + class Flatmap(BatchMapper): + async def handler( + self, + datums: AsyncIterable[Datum], + ) -> BatchResponses: + batch_responses = BatchResponses() + async for datum in datums: + val = datum.value + _ = datum.event_time + _ = datum.watermark + strs = val.decode("utf-8").split(",") + batch_response = BatchResponse.from_id(datum.id) + if len(strs) == 0: + batch_response.append(Message.to_drop()) + else: + for s in strs: + 
batch_response.append(Message(str.encode(s))) + batch_responses.append(batch_response) + return batch_responses if __name__ == "__main__": grpc_server = BatchMapAsyncServer(Flatmap()) grpc_server.start() + ``` """ self.batch_mapper_instance: BatchMapCallable = batch_mapper_instance self.sock_path = f"unix://{sock_path}" diff --git a/packages/pynumaflow/pynumaflow/mapper/_dtypes.py b/packages/pynumaflow/pynumaflow/mapper/_dtypes.py index 155b71e7..c4a69aa5 100644 --- a/packages/pynumaflow/pynumaflow/mapper/_dtypes.py +++ b/packages/pynumaflow/pynumaflow/mapper/_dtypes.py @@ -19,8 +19,8 @@ class Message: Args: value: data in bytes - keys: []string keys for vertex (optional) - tags: []string tags for conditional forwarding (optional) + keys: list of keys for the vertex (optional) + tags: list of tags for conditional forwarding (optional) user_metadata: metadata for the message (optional) """ @@ -34,8 +34,8 @@ class Message: def __init__( self, value: bytes, - keys: list[str] = None, - tags: list[str] = None, + keys: Optional[list[str]] = None, + tags: Optional[list[str]] = None, user_metadata: Optional[UserMetadata] = None, ): """ @@ -115,6 +115,7 @@ def items(self) -> Sequence[M]: class Datum: """ Class to define the important information for the event. + Args: keys: the keys of the event. value: the payload of the event. @@ -122,20 +123,19 @@ class Datum: watermark: the watermark of the event. headers: the headers of the event. - >>> # Example usage - >>> from pynumaflow.mapper import Datum - >>> from datetime import datetime, timezone - >>> payload = bytes("test_mock_message", encoding="utf-8") - >>> t1 = datetime.fromtimestamp(1662998400, timezone.utc) - >>> t2 = datetime.fromtimestamp(1662998460, timezone.utc) - >>> msg_headers = {"key1": "value1", "key2": "value2"} - >>> d = Datum( - ... keys=["test_key"], - ... value=payload, - ... event_time=t1, - ... watermark=t2, - ... headers=msg_headers, - ... 
) + Example usage + ```py + from pynumaflow.mapper import Datum + from datetime import datetime, timezone + + d = Datum( + keys=["test_key"], + value=b"test_mock_message", + event_time=datetime.fromtimestamp(1662998400, timezone.utc), + watermark=datetime.fromtimestamp(1662998460, timezone.utc), + headers={"key1": "value1", "key2": "value2"}, + ) + ``` """ __slots__ = ( diff --git a/packages/pynumaflow/pynumaflow/mapper/async_server.py b/packages/pynumaflow/pynumaflow/mapper/async_server.py index 98b553d9..3ef1ed46 100644 --- a/packages/pynumaflow/pynumaflow/mapper/async_server.py +++ b/packages/pynumaflow/pynumaflow/mapper/async_server.py @@ -35,20 +35,22 @@ class MapAsyncServer(NumaflowServer): defaults to 4 and max capped at 16 Example invocation: - from pynumaflow.mapper import Messages, Message, Datum, MapAsyncServer - async def async_map_handler(keys: list[str], datum: Datum) -> Messages: - val = datum.value - msg = "payload:{} event_time:{} watermark:{}".format( - val.decode("utf-8"), - datum.event_time, - datum.watermark, - ) - val = bytes(msg, encoding="utf-8") - return Messages(Message(value=val, keys=keys)) - - if __name__ == "__main__": - grpc_server = MapAsyncServer(async_map_handler) - grpc_server.start() + ```py + from pynumaflow.mapper import Messages, Message, Datum, MapAsyncServer + + async def async_map_handler(keys: list[str], datum: Datum) -> Messages: + val = datum.value + msg = ( + f"payload:{val.decode('utf-8')} " + f"event_time:{datum.event_time} " + f"watermark:{datum.watermark}" + ) + return Messages(Message(value=msg.encode('utf-8'), keys=keys)) + + if __name__ == "__main__": + grpc_server = MapAsyncServer(async_map_handler) + grpc_server.start() + ``` """ def __init__( @@ -63,12 +65,13 @@ def __init__( Create a new grpc Asynchronous Map Server instance. A new servicer instance is created and attached to the server. The server instance is returned.
+ Args: - mapper_instance: The mapper instance to be used for Map UDF - sock_path: The UNIX socket path to be used for the server - max_message_size: The max message size in bytes the server can receive and send - max_threads: The max number of threads to be spawned; - defaults to 4 and max capped at 16 + mapper_instance: The mapper instance to be used for Map UDF + sock_path: The UNIX socket path to be used for the server + max_message_size: The max message size in bytes the server can receive and send + max_threads: The max number of threads to be spawned; + defaults to 4 and max capped at 16 """ self.sock_path = f"unix://{sock_path}" self.max_threads = min(max_threads, MAX_NUM_THREADS) diff --git a/packages/pynumaflow/pynumaflow/mapper/multiproc_server.py b/packages/pynumaflow/pynumaflow/mapper/multiproc_server.py index 3646d22e..5d68a96b 100644 --- a/packages/pynumaflow/pynumaflow/mapper/multiproc_server.py +++ b/packages/pynumaflow/pynumaflow/mapper/multiproc_server.py @@ -43,6 +43,7 @@ def __init__( Create a new grpc Multiproc Map Server instance. A new servicer instance is created and attached to the server. The server instance is returned. 
+ Args: mapper_instance: The mapper instance to be used for Map UDF server_count: The number of grpc server instances to be forked for multiproc @@ -52,35 +53,36 @@ def __init__( defaults to 4 and max capped at 16 Example invocation: - import math - import os - from pynumaflow.mapper import Messages, Message, Datum, Mapper, MapMultiprocServer - - def is_prime(n): - for i in range(2, int(math.ceil(math.sqrt(n)))): - if n % i == 0: - return False - else: - return True + ```py + import math + import os + from pynumaflow.mapper import Messages, Message, Datum, Mapper, MapMultiprocServer - class PrimeMap(Mapper): - def handler(self, keys: list[str], datum: Datum) -> Messages: - val = datum.value - _ = datum.event_time - _ = datum.watermark - messages = Messages() - for i in range(2, 100000): - is_prime(i) - messages.append(Message(val, keys=keys)) - return messages + def is_prime(n): + for i in range(2, int(math.ceil(math.sqrt(n)))): + if n % i == 0: + return False + else: + return True - if __name__ == "__main__": - server_count = 2 - prime_class = PrimeMap() - # Server count is the number of server processes to start - grpc_server = MapMultiprocServer(prime_class, server_count=server_count) - grpc_server.start() + class PrimeMap(Mapper): + def handler(self, keys: list[str], datum: Datum) -> Messages: + val = datum.value + _ = datum.event_time + _ = datum.watermark + messages = Messages() + for i in range(2, 100000): + is_prime(i) + messages.append(Message(val, keys=keys)) + return messages + if __name__ == "__main__": + server_count = 2 + prime_class = PrimeMap() + # Server count is the number of server processes to start + grpc_server = MapMultiprocServer(prime_class, server_count=server_count) + grpc_server.start() + ``` """ self.sock_path = f"unix://{sock_path}" self.max_threads = min(max_threads, MAX_NUM_THREADS) @@ -104,9 +106,8 @@ def handler(self, keys: list[str], datum: Datum) -> Messages: def start(self) -> None: """ - Starts the N grpc servers gRPC serves on 
the with - given max threads. - where N = The number of CPUs or the value of the parameter server_count + Starts N gRPC servers with the given max threads, + where N is the number of CPUs or the value of the parameter server_count defined by the user. The max value is capped to 2 * CPU count. """ diff --git a/packages/pynumaflow/pynumaflow/mapper/sync_server.py b/packages/pynumaflow/pynumaflow/mapper/sync_server.py index f1b6c90a..9c2431b6 100644 --- a/packages/pynumaflow/pynumaflow/mapper/sync_server.py +++ b/packages/pynumaflow/pynumaflow/mapper/sync_server.py @@ -27,6 +27,7 @@ class MapServer(NumaflowServer): """ Create a new grpc Map Server instance. + Args: mapper_instance: The mapper instance to be used for Map UDF sock_path: The UNIX socket path to be used for the server @@ -35,34 +36,35 @@ defaults to 4 and max capped at 16 Example Invocation: - from pynumaflow.mapper import Messages, Message, Datum, MapServer, Mapper - - class MessageForwarder(Mapper): - def handler(self, keys: list[str], datum: Datum) -> Messages: - val = datum.value - _ = datum.event_time - _ = datum.watermark - return Messages(Message(value=val, keys=keys)) + ```py + from pynumaflow.mapper import Messages, Message, Datum, MapServer, Mapper - def my_handler(keys: list[str], datum: Datum) -> Messages: + class MessageForwarder(Mapper): + def handler(self, keys: list[str], datum: Datum) -> Messages: val = datum.value _ = datum.event_time _ = datum.watermark return Messages(Message(value=val, keys=keys)) + def my_handler(keys: list[str], datum: Datum) -> Messages: + val = datum.value + _ = datum.event_time + _ = datum.watermark + return Messages(Message(value=val, keys=keys)) + - if __name__ == "__main__": - Use the class based approach or function based handler - based on the env variable - Both can be used and passed directly to the server class + if __name__ == "__main__": + # Use the class based approach or function based handler based on
the env variable + # Both can be used and passed directly to the server class - invoke = os.getenv("INVOKE", "func_handler") - if invoke == "class": - handler = MessageForwarder() - else: - handler = my_handler - grpc_server = MapServer(handler) - grpc_server.start() + invoke = os.getenv("INVOKE", "func_handler") + if invoke == "class": + handler = MessageForwarder() + else: + handler = my_handler + grpc_server = MapServer(handler) + grpc_server.start() + ``` """ def __init__( @@ -73,17 +75,6 @@ def __init__( max_threads=NUM_THREADS_DEFAULT, server_info_file=MAP_SERVER_INFO_FILE_PATH, ): - """ - Create a new grpc Synchronous Map Server instance. - A new servicer instance is created and attached to the server. - The server instance is returned. - Args: - mapper_instance: The mapper instance to be used for Map UDF - sock_path: The UNIX socket path to be used for the server - max_message_size: The max message size in bytes the server can receive and send - max_threads: The max number of threads to be spawned; - defaults to 4 and max capped at 16 - """ self.sock_path = f"unix://{sock_path}" self.max_threads = min(max_threads, MAX_NUM_THREADS) self.max_message_size = max_message_size diff --git a/packages/pynumaflow/pynumaflow/mapstreamer/_dtypes.py b/packages/pynumaflow/pynumaflow/mapstreamer/_dtypes.py index a278ab38..e76d1bc2 100644 --- a/packages/pynumaflow/pynumaflow/mapstreamer/_dtypes.py +++ b/packages/pynumaflow/pynumaflow/mapstreamer/_dtypes.py @@ -19,8 +19,8 @@ class Message: Args: value: data in bytes - keys: []string keys for vertex (optional) - tags: []string tags for conditional forwarding (optional) + keys: list of keys for vertex (optional) + tags: list of tags for conditional forwarding (optional) """ __slots__ = ("_value", "_keys", "_tags") @@ -29,7 +29,9 @@ class Message: _keys: list[str] _tags: list[str] - def __init__(self, value: bytes, keys: list[str] = None, tags: list[str] = None): + def __init__( + self, value: bytes, keys: Optional[list[str]] 
= None, tags: Optional[list[str]] = None + ): """ Creates a Message object to send value to a vertex. """ @@ -102,6 +104,7 @@ def items(self) -> list[Message]: class Datum: """ Class to define the important information for the event. + Args: keys: the keys of the event. value: the payload of the event. @@ -109,20 +112,19 @@ class Datum: watermark: the watermark of the event. headers: the headers of the event. - >>> # Example usage - >>> from pynumaflow.mapstreamer import Datum - >>> from datetime import datetime, timezone - >>> payload = bytes("test_mock_message", encoding="utf-8") - >>> t1 = datetime.fromtimestamp(1662998400, timezone.utc) - >>> t2 = datetime.fromtimestamp(1662998460, timezone.utc) - >>> msg_headers = {"key1": "value1", "key2": "value2"} - >>> d = Datum( - ... keys=["test_key"], - ... value=payload, - ... event_time=t1, - ... watermark=t2, - ... headers=msg_headers, - ... ) + Example: + ```py + from pynumaflow.mapstreamer import Datum + from datetime import datetime, timezone + + d = Datum( + keys=["test_key"], + value=b"test_mock_message", + event_time=datetime.fromtimestamp(1662998400, timezone.utc), + watermark=datetime.fromtimestamp(1662998460, timezone.utc), + headers={"key1": "value1", "key2": "value2"}, + ) + ``` """ __slots__ = ("_keys", "_value", "_event_time", "_watermark", "_headers") diff --git a/packages/pynumaflow/pynumaflow/mapstreamer/async_server.py b/packages/pynumaflow/pynumaflow/mapstreamer/async_server.py index 5d4bb80a..d718a6a5 100644 --- a/packages/pynumaflow/pynumaflow/mapstreamer/async_server.py +++ b/packages/pynumaflow/pynumaflow/mapstreamer/async_server.py @@ -42,6 +42,7 @@ def __init__( Create a new grpc Async Map Stream Server instance. A new servicer instance is created and attached to the server. The server instance is returned. 
+ Args: map_stream_instance: The map stream instance to be used for Map Stream UDF sock_path: The UNIX socket path to be used for the server @@ -51,25 +52,13 @@ def __init__( server_type: The type of server to be used Example invocation: - import os - from collections.abc import AsyncIterable - from pynumaflow.mapstreamer import Message, Datum, MapStreamAsyncServer, MapStreamer - - class FlatMapStream(MapStreamer): - async def handler(self, keys: list[str], datum: Datum) -> AsyncIterable[Message]: - val = datum.value - _ = datum.event_time - _ = datum.watermark - strs = val.decode("utf-8").split(",") - - if len(strs) == 0: - yield Message.to_drop() - return - for s in strs: - yield Message(str.encode(s)) - - async def map_stream_handler(_: list[str], datum: Datum) -> AsyncIterable[Message]: + ```py + import os + from collections.abc import AsyncIterable + from pynumaflow.mapstreamer import Message, Datum, MapStreamAsyncServer, MapStreamer + class FlatMapStream(MapStreamer): + async def handler(self, keys: list[str], datum: Datum) -> AsyncIterable[Message]: val = datum.value _ = datum.event_time _ = datum.watermark @@ -81,15 +70,28 @@ async def map_stream_handler(_: list[str], datum: Datum) -> AsyncIterable[Messag for s in strs: yield Message(str.encode(s)) - if __name__ == "__main__": - invoke = os.getenv("INVOKE", "func_handler") - if invoke == "class": - handler = FlatMapStream() - else: - handler = map_stream_handler - grpc_server = MapStreamAsyncServer(handler) - grpc_server.start() - + async def map_stream_handler(_: list[str], datum: Datum) -> AsyncIterable[Message]: + + val = datum.value + _ = datum.event_time + _ = datum.watermark + strs = val.decode("utf-8").split(",") + + if len(strs) == 0: + yield Message.to_drop() + return + for s in strs: + yield Message(str.encode(s)) + + if __name__ == "__main__": + invoke = os.getenv("INVOKE", "func_handler") + if invoke == "class": + handler = FlatMapStream() + else: + handler = map_stream_handler + grpc_server = 
MapStreamAsyncServer(handler) + grpc_server.start() + ``` """ self.map_stream_instance: MapStreamCallable = map_stream_instance self.sock_path = f"unix://{sock_path}" diff --git a/packages/pynumaflow/pynumaflow/reducer/_dtypes.py b/packages/pynumaflow/pynumaflow/reducer/_dtypes.py index 6a70edb5..71a18142 100644 --- a/packages/pynumaflow/pynumaflow/reducer/_dtypes.py +++ b/packages/pynumaflow/pynumaflow/reducer/_dtypes.py @@ -119,26 +119,27 @@ def items(self) -> list[Message]: class Datum: """ Class to define the important information for the event. + Args: keys: the keys of the event. value: the payload of the event. event_time: the event time of the event. watermark: the watermark of the event. headers: the headers of the event. - >>> # Example usage - >>> from pynumaflow.reducer import Datum - >>> from datetime import datetime, timezone - >>> payload = bytes("test_mock_message", encoding="utf-8") - >>> t1 = datetime.fromtimestamp(1662998400, timezone.utc) - >>> t2 = datetime.fromtimestamp(1662998460, timezone.utc) - >>> msg_headers = {"key1": "value1", "key2": "value2"} - >>> d = Datum( - ... keys=["test_key"], - ... value=payload, - ... event_time=t1, - ... watermark=t2, - ... headers=msg_headers - ... 
) + + Example usage + ```py + from pynumaflow.reducer import Datum + from datetime import datetime, timezone + + d = Datum( + keys=["test_key"], + value=b"test_mock_message", + event_time=datetime.fromtimestamp(1662998400, timezone.utc), + watermark=datetime.fromtimestamp(1662998460, timezone.utc), + headers={"key1": "value1", "key2": "value2"}, + ) + ``` """ __slots__ = ("_keys", "_value", "_event_time", "_watermark", "_headers") diff --git a/packages/pynumaflow/pynumaflow/reducer/async_server.py b/packages/pynumaflow/pynumaflow/reducer/async_server.py index 2c2de147..4103fe98 100644 --- a/packages/pynumaflow/pynumaflow/reducer/async_server.py +++ b/packages/pynumaflow/pynumaflow/reducer/async_server.py @@ -57,6 +57,7 @@ class ReduceAsyncServer(NumaflowServer): Class for a new Reduce Server instance. A new servicer instance is created and attached to the server. The server instance is returned. + Args: reducer_instance: The reducer instance to be used for Reduce UDF sock_path: The UNIX socket path to be used for the server @@ -64,79 +65,67 @@ class ReduceAsyncServer(NumaflowServer): max_threads: The max number of threads to be spawned; defaults to 4 and max capped at 16 Example invocation: - import os - from collections.abc import AsyncIterable - from pynumaflow.reducer import Messages, Message, Datum, Metadata, - ReduceAsyncServer, Reducer - - class ReduceCounter(Reducer): - def __init__(self, counter): - self.counter = counter - - async def handler( - self, keys: list[str], datums: AsyncIterable[Datum], md: Metadata - ) -> Messages: - interval_window = md.interval_window - self.counter = 0 - async for _ in datums: - self.counter += 1 - msg = ( - f"counter:{self.counter} interval_window_start:{interval_window.start} " - f"interval_window_end:{interval_window.end}" - ) - return Messages(Message(str.encode(msg), keys=keys)) - - async def reduce_handler(keys: list[str], - datums: AsyncIterable[Datum], - md: Metadata) -> Messages: + ```py + import os + from 
collections.abc import AsyncIterable
+    from pynumaflow.reducer import (Messages, Message, Datum, Metadata,
+        ReduceAsyncServer, Reducer)
+
+    class ReduceCounter(Reducer):
+        def __init__(self, counter):
+            self.counter = counter
+
+        async def handler(
+            self, keys: list[str], datums: AsyncIterable[Datum], md: Metadata
+        ) -> Messages:
             interval_window = md.interval_window
-            counter = 0
+            self.counter = 0
             async for _ in datums:
-                counter += 1
+                self.counter += 1
             msg = (
-                f"counter:{counter} interval_window_start:{interval_window.start} "
+                f"counter:{self.counter} interval_window_start:{interval_window.start} "
                 f"interval_window_end:{interval_window.end}"
             )
             return Messages(Message(str.encode(msg), keys=keys))
-    if __name__ == "__main__":
-        invoke = os.getenv("INVOKE", "func_handler")
-        if invoke == "class":
-            # Here we are using the class instance as the reducer_instance
-            # which will be used to invoke the handler function.
-            # We are passing the init_args for the class instance.
-            grpc_server = ReduceAsyncServer(ReduceCounter, init_args=(0,))
-        else:
-            # Here we are using the handler function directly as the reducer_instance.
-            grpc_server = ReduceAsyncServer(reduce_handler)
-        grpc_server.start()
-
+    async def reduce_handler(
+        keys: list[str], datums: AsyncIterable[Datum], md: Metadata
+    ) -> Messages:
+        interval_window = md.interval_window
+        counter = 0
+        async for _ in datums:
+            counter += 1
+        msg = (
+            f"counter:{counter} interval_window_start:{interval_window.start} "
+            f"interval_window_end:{interval_window.end}"
+        )
+        return Messages(Message(str.encode(msg), keys=keys))
+
+    if __name__ == "__main__":
+        invoke = os.getenv("INVOKE", "func_handler")
+        if invoke == "class":
+            # Here we are using the class instance as the reducer_instance
+            # which will be used to invoke the handler function.
+            # We are passing the init_args for the class instance.
+ grpc_server = ReduceAsyncServer(ReduceCounter, init_args=(0,)) + else: + # Here we are using the handler function directly as the reducer_instance. + grpc_server = ReduceAsyncServer(reduce_handler) + grpc_server.start() + ``` """ def __init__( self, reducer_instance: ReduceCallable, init_args: tuple = (), - init_kwargs: dict = None, + init_kwargs: Optional[dict] = None, sock_path=REDUCE_SOCK_PATH, max_message_size=MAX_MESSAGE_SIZE, max_threads=NUM_THREADS_DEFAULT, server_info_file=REDUCE_SERVER_INFO_FILE_PATH, ): - """ - Create a new grpc Reduce Server instance. - A new servicer instance is created and attached to the server. - The server instance is returned. - Args: - reducer_instance: The reducer instance to be used for Reduce UDF - sock_path: The UNIX socket path to be used for the server - max_message_size: The max message size in bytes the server can receive and send - max_threads: The max number of threads to be spawned; - defaults to 4 and max capped at 16 - server_type: The type of server to be used - """ - if init_kwargs is None: - init_kwargs = {} + init_kwargs = init_kwargs or {} self.reducer_handler = get_handler(reducer_instance, init_args, init_kwargs) self.sock_path = f"unix://{sock_path}" self.max_message_size = max_message_size diff --git a/packages/pynumaflow/pynumaflow/reducestreamer/_dtypes.py b/packages/pynumaflow/pynumaflow/reducestreamer/_dtypes.py index 8628a6c0..2e0b4f97 100644 --- a/packages/pynumaflow/pynumaflow/reducestreamer/_dtypes.py +++ b/packages/pynumaflow/pynumaflow/reducestreamer/_dtypes.py @@ -26,25 +26,27 @@ class WindowOperation(IntEnum): class Datum: """ Class to define the important information for the event. + Args: keys: the keys of the event. value: the payload of the event. event_time: the event time of the event. watermark: the watermark of the event. 
-    >>> # Example usage
-    >>> from pynumaflow.reducer import Datum
-    >>> from datetime import datetime, timezone
-    >>> payload = bytes("test_mock_message", encoding="utf-8")
-    >>> t1 = datetime.fromtimestamp(1662998400, timezone.utc)
-    >>> t2 = datetime.fromtimestamp(1662998460, timezone.utc)
-    >>> msg_headers = {"key1": "value1", "key2": "value2"}
-    >>> d = Datum(
-    ...    keys=["test_key"],
-    ...    value=payload,
-    ...    event_time=t1,
-    ...    watermark=t2,
-    ...    headers=msg_headers
-    ... )
+
+    Example usage
+
+    ```py
+    from pynumaflow.reducestreamer import Datum
+    from datetime import datetime, timezone
+
+    d = Datum(
+        keys=["test_key"],
+        value=b"test_mock_message",
+        event_time=datetime.fromtimestamp(1662998400, timezone.utc),
+        watermark=datetime.fromtimestamp(1662998460, timezone.utc),
+        headers={"key1": "value1", "key2": "value2"},
+    )
+    ```
    """

    __slots__ = ("_keys", "_value", "_event_time", "_watermark", "_headers")
diff --git a/packages/pynumaflow/pynumaflow/reducestreamer/async_server.py b/packages/pynumaflow/pynumaflow/reducestreamer/async_server.py
index f899e8e6..f974c1a0 100644
--- a/packages/pynumaflow/pynumaflow/reducestreamer/async_server.py
+++ b/packages/pynumaflow/pynumaflow/reducestreamer/async_server.py
@@ -59,6 +59,7 @@ class ReduceStreamAsyncServer(NumaflowServer):
    Class for a new Reduce Stream Server instance.
    A new servicer instance is created and attached to the server.
    The server instance is returned.
+ Args: reduce_stream_instance: The reducer instance to be used for Reduce Streaming UDF @@ -70,90 +71,75 @@ class ReduceStreamAsyncServer(NumaflowServer): max_threads: The max number of threads to be spawned; defaults to 4 and max capped at 16 server_info_file: The path to the server info file - Example invocation: - import os - from collections.abc import AsyncIterable - from pynumaflow.reducestreamer import Messages, Message, Datum, Metadata, - ReduceStreamAsyncServer, ReduceStreamer - - class ReduceCounter(ReduceStreamer): - def __init__(self, counter): - self.counter = counter - - async def handler( - self, - keys: list[str], - datums: AsyncIterable[Datum], - output: NonBlockingIterator, - md: Metadata, - ): - async for _ in datums: - self.counter += 1 - if self.counter > 20: - msg = f"counter:{self.counter}" - await output.put(Message(str.encode(msg), keys=keys)) - self.counter = 0 - msg = f"counter:{self.counter}" - await output.put(Message(str.encode(msg), keys=keys)) - async def reduce_handler( - keys: list[str], - datums: AsyncIterable[Datum], - output: NonBlockingIterator, - md: Metadata, - ): - counter = 0 + Example invocation: + ```py + import os + from collections.abc import AsyncIterable + from pynumaflow.reducestreamer import Messages, Message, Datum, Metadata, + ReduceStreamAsyncServer, ReduceStreamer + + class ReduceCounter(ReduceStreamer): + def __init__(self, counter): + self.counter = counter + + async def handler( + self, + keys: list[str], + datums: AsyncIterable[Datum], + output: NonBlockingIterator, + md: Metadata, + ): async for _ in datums: - counter += 1 - if counter > 20: - msg = f"counter:{counter}" + self.counter += 1 + if self.counter > 20: + msg = f"counter:{self.counter}" await output.put(Message(str.encode(msg), keys=keys)) - counter = 0 - msg = f"counter:{counter}" + self.counter = 0 + msg = f"counter:{self.counter}" await output.put(Message(str.encode(msg), keys=keys)) - if __name__ == "__main__": - invoke = os.getenv("INVOKE", 
"func_handler") - if invoke == "class": - # Here we are using the class instance as the reducer_instance - # which will be used to invoke the handler function. - # We are passing the init_args for the class instance. - grpc_server = ReduceStreamAsyncServer(ReduceCounter, init_args=(0,)) - else: - # Here we are using the handler function directly as the reducer_instance. - grpc_server = ReduceStreamAsyncServer(reduce_handler) - grpc_server.start() - + async def reduce_handler( + keys: list[str], + datums: AsyncIterable[Datum], + output: NonBlockingIterator, + md: Metadata, + ): + counter = 0 + async for _ in datums: + counter += 1 + if counter > 20: + msg = f"counter:{counter}" + await output.put(Message(str.encode(msg), keys=keys)) + counter = 0 + msg = f"counter:{counter}" + await output.put(Message(str.encode(msg), keys=keys)) + + if __name__ == "__main__": + invoke = os.getenv("INVOKE", "func_handler") + if invoke == "class": + # Here we are using the class instance as the reducer_instance + # which will be used to invoke the handler function. + # We are passing the init_args for the class instance. + grpc_server = ReduceStreamAsyncServer(ReduceCounter, init_args=(0,)) + else: + # Here we are using the handler function directly as the reducer_instance. + grpc_server = ReduceStreamAsyncServer(reduce_handler) + grpc_server.start() + ``` """ def __init__( self, reduce_stream_instance: ReduceStreamCallable, init_args: tuple = (), - init_kwargs: dict = None, + init_kwargs: Optional[dict] = None, sock_path=REDUCE_STREAM_SOCK_PATH, max_message_size=MAX_MESSAGE_SIZE, max_threads=NUM_THREADS_DEFAULT, server_info_file=REDUCE_STREAM_SERVER_INFO_FILE_PATH, ): - """ - Create a new grpc Reduce Streamer Server instance. - A new servicer instance is created and attached to the server. - The server instance is returned. 
- Args: - reduce_stream_instance: The reducer instance to be used for - Reduce Streaming UDF - init_args: The arguments to be passed to the reduce_stream_handler - init_kwargs: The keyword arguments to be passed to the - reduce_stream_handler - sock_path: The UNIX socket path to be used for the server - max_message_size: The max message size in bytes the server can receive and send - max_threads: The max number of threads to be spawned; - defaults to 4 and max capped at 16 - server_info_file: The path to the server info file - """ - if init_kwargs is None: - init_kwargs = {} + init_kwargs = init_kwargs or {} self.reduce_stream_handler = get_handler(reduce_stream_instance, init_args, init_kwargs) self.sock_path = f"unix://{sock_path}" self.max_message_size = max_message_size diff --git a/packages/pynumaflow/pynumaflow/sideinput/_dtypes.py b/packages/pynumaflow/pynumaflow/sideinput/_dtypes.py index 6a68f420..9ec3a67c 100644 --- a/packages/pynumaflow/pynumaflow/sideinput/_dtypes.py +++ b/packages/pynumaflow/pynumaflow/sideinput/_dtypes.py @@ -9,12 +9,18 @@ class Response: """ Class to define the important information for the event. + Args: value: the payload of the event. no_broadcast: the flag to indicate whether the event should be broadcasted. - >>> # Example usage - >>> Response.broadcast_message(b"hello") - >>> Response.no_broadcast_message() + + Example usage + ```py + from pynumaflow.sideinput import Response + + Response.broadcast_message(b"hello") + Response.no_broadcast_message() + ``` """ __slots__ = ("value", "no_broadcast") diff --git a/packages/pynumaflow/pynumaflow/sideinput/server.py b/packages/pynumaflow/pynumaflow/sideinput/server.py index 9e8a4d7c..7bb27b86 100644 --- a/packages/pynumaflow/pynumaflow/sideinput/server.py +++ b/packages/pynumaflow/pynumaflow/sideinput/server.py @@ -18,6 +18,7 @@ class SideInputServer(NumaflowServer): """ Class for a new Side Input Server instance. 
+ Args: side_input_instance: The side input instance to be used for Side Input UDF sock_path: The UNIX socket path to be used for the server @@ -25,29 +26,30 @@ class SideInputServer(NumaflowServer): max_threads: The max number of threads to be spawned; Example invocation: - import datetime - from pynumaflow.sideinput import Response, SideInputServer, SideInput - - class ExampleSideInput(SideInput): - def __init__(self): - self.counter = 0 + ```py + import datetime + from pynumaflow.sideinput import Response, SideInputServer, SideInput - def retrieve_handler(self) -> Response: - time_now = datetime.datetime.now() - # val is the value to be broadcasted - val = f"an example: {str(time_now)}" - self.counter += 1 - # broadcast every other time - if self.counter % 2 == 0: - # no_broadcast_message() is used to indicate that there is no broadcast - return Response.no_broadcast_message() - # broadcast_message() is used to indicate that there is a broadcast - return Response.broadcast_message(val.encode("utf-8")) + class ExampleSideInput(SideInput): + def __init__(self): + self.counter = 0 - if __name__ == "__main__": - grpc_server = SideInputServer(ExampleSideInput()) - grpc_server.start() + def retrieve_handler(self) -> Response: + time_now = datetime.datetime.now() + # val is the value to be broadcasted + val = f"an example: {str(time_now)}" + self.counter += 1 + # broadcast every other time + if self.counter % 2 == 0: + # no_broadcast_message() is used to indicate that there is no broadcast + return Response.no_broadcast_message() + # broadcast_message() is used to indicate that there is a broadcast + return Response.broadcast_message(val.encode("utf-8")) + if __name__ == "__main__": + grpc_server = SideInputServer(ExampleSideInput()) + grpc_server.start() + ``` """ def __init__( diff --git a/packages/pynumaflow/pynumaflow/sinker/_dtypes.py b/packages/pynumaflow/pynumaflow/sinker/_dtypes.py index c2582ca2..174cb9ee 100644 --- 
a/packages/pynumaflow/pynumaflow/sinker/_dtypes.py +++ b/packages/pynumaflow/pynumaflow/sinker/_dtypes.py @@ -18,9 +18,9 @@ class Message: Basic datatype for OnSuccess UDSink message. Args: - keys: the keys of the on_success message. - value: the payload of the on_success message. - user_metadata: the user metadata of the on_success message. + keys: list of keys for the on_success message. + value: payload of the on_success message. + user_metadata: user metadata of the on_success message. """ _keys: Optional[list[str]] @@ -174,29 +174,28 @@ def items(self) -> list[R]: class Datum: """ Class to define the important information for the event. + Args: keys: the keys of the event. value: the payload of the event. event_time: the event time of the event. watermark: the watermark of the event. headers: the headers of the event. - >>> # Example usage - >>> from pynumaflow.sinker import Datum - >>> from datetime import datetime, timezone - >>> payload = bytes("test_mock_message", encoding="utf-8") - >>> t1 = datetime.fromtimestamp(1662998400, timezone.utc) - >>> t2 = datetime.fromtimestamp(1662998460, timezone.utc) - >>> msg_headers = {"key1": "value1", "key2": "value2"} - >>> msg_id = "test_id" - >>> output_keys = ["test_key"] - >>> d = Datum( - ... keys=output_keys, - ... sink_msg_id=msg_id, - ... value=payload, - ... event_time=t1, - ... watermark=t2, - ... headers=msg_headers - ... 
)
+
+    Example usage
+    ```py
+    from pynumaflow.sinker import Datum
+    from datetime import datetime, timezone
+
+    d = Datum(
+        keys=["test_key"],
+        sink_msg_id="test_id",
+        value=b"test_mock_message",
+        event_time=datetime.fromtimestamp(1662998400, timezone.utc),
+        watermark=datetime.fromtimestamp(1662998460, timezone.utc),
+        headers={"key1": "value1", "key2": "value2"},
+    )
+    ```
    """

    __slots__ = (
diff --git a/packages/pynumaflow/pynumaflow/sinker/async_server.py b/packages/pynumaflow/pynumaflow/sinker/async_server.py
index a9331aca..40020ced 100644
--- a/packages/pynumaflow/pynumaflow/sinker/async_server.py
+++ b/packages/pynumaflow/pynumaflow/sinker/async_server.py
@@ -34,6 +34,7 @@ class SinkAsyncServer(NumaflowServer):
    Create a new grpc Async Sink Server instance.
    A new servicer instance is created and attached to the server.
    The server instance is returned.
+
    Args:
        sinker_instance: The sinker instance to be used for Sink UDF
        sock_path: The UNIX socket path to be used for the server
@@ -42,38 +43,42 @@
        defaults to 4 and max capped at 16
    Example invocation:
-        import os
-        from collections.abc import AsyncIterable
-        from pynumaflow.sinker import Datum, Responses, Response, Sinker
-        from pynumaflow.sinker import SinkAsyncServer
-        from pynumaflow._constants import _LOGGER
-
-
-        class UserDefinedSink(Sinker):
-            async def handler(self, datums: AsyncIterable[Datum]) -> Responses:
-                responses = Responses()
-                async for msg in datums:
-                    _LOGGER.info("User Defined Sink %s", msg.value.decode("utf-8"))
-                    responses.append(Response.as_success(msg.id))
-                return responses
-
-
-        async def udsink_handler(datums: AsyncIterable[Datum]) -> Responses:
+    ```py
+    import os
+    import logging
+    from collections.abc import AsyncIterable
+    from pynumaflow.sinker import Datum, Responses, Response, Sinker
+    from pynumaflow.sinker import SinkAsyncServer
+
+    logging.basicConfig(level=logging.INFO)
+
+    class
UserDefinedSink(Sinker): + async def handler(self, datums: AsyncIterable[Datum]) -> Responses: responses = Responses() async for msg in datums: - _LOGGER.info("User Defined Sink %s", msg.value.decode("utf-8")) + logging.info("User Defined Sink %s", msg.value.decode("utf-8")) responses.append(Response.as_success(msg.id)) return responses - if __name__ == "__main__": - invoke = os.getenv("INVOKE", "func_handler") - if invoke == "class": - sink_handler = UserDefinedSink() - else: - sink_handler = udsink_handler - grpc_server = SinkAsyncServer(sink_handler) - grpc_server.start() + async def udsink_handler(datums: AsyncIterable[Datum]) -> Responses: + responses = Responses() + async for msg in datums: + logging.info("User Defined Sink %s", msg.value.decode("utf-8")) + responses.append(Response.as_success(msg.id)) + return responses + + + if __name__ == "__main__": + invoke = os.getenv("INVOKE", "func_handler") + if invoke == "class": + sink_handler = UserDefinedSink() + else: + sink_handler = udsink_handler + grpc_server = SinkAsyncServer(sink_handler) + grpc_server.start() + ``` """ def __init__( diff --git a/packages/pynumaflow/pynumaflow/sinker/server.py b/packages/pynumaflow/pynumaflow/sinker/server.py index 842c1725..378dff12 100644 --- a/packages/pynumaflow/pynumaflow/sinker/server.py +++ b/packages/pynumaflow/pynumaflow/sinker/server.py @@ -41,6 +41,7 @@ def __init__( Create a new grpc Sink Server instance. A new servicer instance is created and attached to the server. The server instance is returned. 
+ Args: sinker_instance: The sinker instance to be used for Sink UDF sock_path: The UNIX socket path to be used for the server @@ -48,37 +49,41 @@ def __init__( max_threads: The max number of threads to be spawned; defaults to 4 and max capped at 16 Example invocation: - import os - from collections.abc import Iterator - - from pynumaflow.sinker import Datum, Responses, Response, SinkServer - from pynumaflow.sinker import Sinker - from pynumaflow._constants import _LOGGER - - class UserDefinedSink(Sinker): - def handler(self, datums: Iterator[Datum]) -> Responses: - responses = Responses() - for msg in datums: - _LOGGER.info("User Defined Sink %s", msg.value.decode("utf-8")) - responses.append(Response.as_success(msg.id)) - return responses - - def udsink_handler(datums: Iterator[Datum]) -> Responses: + ```py + import os + import logging + from collections.abc import Iterator + + from pynumaflow.sinker import Datum, Responses, Response, SinkServer + from pynumaflow.sinker import Sinker + from pynumaflow._constants import _LOGGER + + logging.basicConfig(level=logging.INFO) + + class UserDefinedSink(Sinker): + def handler(self, datums: Iterator[Datum]) -> Responses: responses = Responses() for msg in datums: - _LOGGER.info("User Defined Sink %s", msg.value.decode("utf-8")) + logging.info("User Defined Sink %s", msg.value.decode("utf-8")) responses.append(Response.as_success(msg.id)) return responses - if __name__ == "__main__": - invoke = os.getenv("INVOKE", "func_handler") - if invoke == "class": - sink_handler = UserDefinedSink() - else: - sink_handler = udsink_handler - grpc_server = SinkServer(sink_handler) - grpc_server.start() + def udsink_handler(datums: Iterator[Datum]) -> Responses: + responses = Responses() + for msg in datums: + logging.info("User Defined Sink %s", msg.value.decode("utf-8")) + responses.append(Response.as_success(msg.id)) + return responses + if __name__ == "__main__": + invoke = os.getenv("INVOKE", "func_handler") + if invoke == "class": 
+ sink_handler = UserDefinedSink() + else: + sink_handler = udsink_handler + grpc_server = SinkServer(sink_handler) + grpc_server.start() + ``` """ # If the container type is fallback sink, then use the fallback sink address and path. if os.getenv(ENV_UD_CONTAINER_TYPE, "") == UD_CONTAINER_FALLBACK_SINK: diff --git a/packages/pynumaflow/pynumaflow/sourcer/_dtypes.py b/packages/pynumaflow/pynumaflow/sourcer/_dtypes.py index faae8692..edf41e39 100644 --- a/packages/pynumaflow/pynumaflow/sourcer/_dtypes.py +++ b/packages/pynumaflow/pynumaflow/sourcer/_dtypes.py @@ -55,7 +55,7 @@ class Message: payload: data in bytes offset: the offset of the datum. event_time: event time of the message, usually extracted from the payload. - keys: []string keys for vertex (optional) + keys: list of string keys for the vertex (optional) headers: dict of headers for the message (optional) user_metadata: metadata for the message (optional) """ @@ -74,7 +74,7 @@ def __init__( payload: bytes, offset: Offset, event_time: datetime, - keys: list[str] = None, + keys: Optional[list[str]] = None, headers: Optional[dict[str, str]] = None, user_metadata: Optional[UserMetadata] = None, ): @@ -118,12 +118,16 @@ def user_metadata(self) -> UserMetadata: class ReadRequest: """ Class to define the request for reading datum stream from user defined source. + Args: num_records: the number of records to read. timeout_in_ms: the request timeout in milliseconds. - >>> # Example usage - >>> from pynumaflow.sourcer import ReadRequest - >>> read_request = ReadRequest(num_records=10, timeout_in_ms=1000) + + Example: + ```py + from pynumaflow.sourcer import ReadRequest + read_request = ReadRequest(num_records=10, timeout_in_ms=1000) + ``` """ __slots__ = ("_num_records", "_timeout_in_ms") @@ -159,12 +163,17 @@ class AckRequest: """ Class for defining the request for acknowledging datum. It takes a list of offsets that need to be acknowledged. + Args: offsets: the offsets to be acknowledged. 
- >>> # Example usage - >>> from pynumaflow.sourcer import AckRequest, Offset - >>> offset_val = Offset(offset=b"123", partition_id=0) - >>> ack_request = AckRequest(offsets=[offset_val, offset_val]) + + Example: + ```py + from pynumaflow.sourcer import AckRequest, Offset + + offset_val = Offset(offset=b"123", partition_id=0) + ack_request = AckRequest(offsets=[offset_val, offset_val]) + ``` """ __slots__ = ("_offsets",) @@ -184,12 +193,16 @@ class NackRequest: """ Class for defining the request for negatively acknowledging an offset. It takes a list of offsets that need to be negatively acknowledged on the source. + Args: offsets: the offsets to be negatively acknowledged. - >>> # Example usage - >>> from pynumaflow.sourcer import NackRequest, Offset - >>> offset_val = Offset(offset=b"123", partition_id=0) - >>> nack_request = NackRequest(offsets=[offset_val, offset_val]) + + Example: + ```py + from pynumaflow.sourcer import NackRequest, Offset + offset_val = Offset(offset=b"123", partition_id=0) + nack_request = NackRequest(offsets=[offset_val, offset_val]) + ``` """ __slots__ = ("_offsets",) @@ -210,6 +223,7 @@ class PendingResponse: PendingResponse is the response for the pending request. It indicates the number of pending records at the user defined source. A negative count indicates that the pending information is not available. + Args: count: the number of pending records. """ @@ -234,6 +248,7 @@ class PartitionsResponse: PartitionsResponse is the response for the partition request. It indicates the number of partitions at the user defined source. A negative count indicates that the partition information is not available. + Args: count: the number of partitions. """ @@ -256,9 +271,6 @@ class Sourcer(metaclass=ABCMeta): """ Provides an interface to write a Sourcer which will be exposed over an gRPC server. 
- - Args: - """ def __call__(self, *args, **kwargs): diff --git a/packages/pynumaflow/pynumaflow/sourcer/async_server.py b/packages/pynumaflow/pynumaflow/sourcer/async_server.py index 264558b9..7e312213 100644 --- a/packages/pynumaflow/pynumaflow/sourcer/async_server.py +++ b/packages/pynumaflow/pynumaflow/sourcer/async_server.py @@ -34,6 +34,7 @@ def __init__( Create a new grpc Async Source Server instance. A new servicer instance is created and attached to the server. The server instance is returned. + Args: sourcer_instance: The sourcer instance to be used for Source UDF sock_path: The UNIX socket path to be used for the server @@ -42,95 +43,96 @@ def __init__( defaults to 4 and max capped at 16 Example invocation: - from datetime import datetime - from pynumaflow.shared.asynciter import NonBlockingIterator - from pynumaflow.sourcer import ( - ReadRequest, - Message, - AckRequest, - PendingResponse, - Offset, - PartitionsResponse, - get_default_partitions, - Sourcer, - SourceAsyncServer, - NackRequest, - ) - - class AsyncSource(Sourcer): - # AsyncSource is a class for User Defined Source implementation. 
- - def __init__(self): - # The offset idx till where the messages have been read - self.read_idx: int = 0 - # Set to maintain a track of the offsets yet to be acknowledged - self.to_ack_set: set[int] = set() - # Set to maintain a track of the offsets that have been negatively acknowledged - self.nacked: set[int] = set() - - async def read_handler(self, datum: ReadRequest, output: NonBlockingIterator): - ''' - read_handler is used to read the data from the source and send the data forward - for each read request we process num_records and increment the read_idx to - indicate that the message has been read and the same is added to the ack set - ''' - if self.to_ack_set: - return - - for x in range(datum.num_records): - # If there are any nacked offsets, re-deliver them - if self.nacked: - idx = self.nacked.pop() - else: - idx = self.read_idx - self.read_idx += 1 - headers = {"x-txn-id": str(uuid.uuid4())} - await output.put( - Message( - payload=str(self.read_idx).encode(), - offset=Offset.offset_with_default_partition_id(str(idx).encode()), - event_time=datetime.now(), - headers=headers, - ) - ) - self.to_ack_set.add(idx) - - async def ack_handler(self, ack_request: AckRequest): - ''' - The ack handler is used acknowledge the offsets that have been read, and remove - them from the to_ack_set - ''' - for req in ack_request.offsets: - offset = int(req.offset) - self.to_ack_set.remove(offset) - - async def nack_handler(self, ack_request: NackRequest): - ''' - Add the offsets that have been negatively acknowledged to the nacked set - ''' - for req in ack_request.offsets: - offset = int(req.offset) - self.to_ack_set.remove(offset) - self.nacked.add(offset) - - async def pending_handler(self) -> PendingResponse: - ''' - The simple source always returns zero to indicate there is no pending record. - ''' - return PendingResponse(count=0) - - async def partitions_handler(self) -> PartitionsResponse: - ''' - The simple source always returns default partitions. 
-            '''
-            return PartitionsResponse(partitions=get_default_partitions())
-
-
-    if __name__ == "__main__":
-        ud_source = AsyncSource()
-        grpc_server = SourceAsyncServer(ud_source)
-        grpc_server.start()
+    ```py
+    import uuid
+    from datetime import datetime
+    from pynumaflow.shared.asynciter import NonBlockingIterator
+    from pynumaflow.sourcer import (
+        ReadRequest,
+        Message,
+        AckRequest,
+        PendingResponse,
+        Offset,
+        PartitionsResponse,
+        get_default_partitions,
+        Sourcer,
+        SourceAsyncServer,
+        NackRequest,
+    )
+
+    class AsyncSource(Sourcer):
+        # AsyncSource is a class for User Defined Source implementation.
+
+        def __init__(self):
+            # The offset idx till where the messages have been read
+            self.read_idx: int = 0
+            # Set to maintain a track of the offsets yet to be acknowledged
+            self.to_ack_set: set[int] = set()
+            # Set to maintain a track of the offsets that have been negatively acknowledged
+            self.nacked: set[int] = set()
+
+        async def read_handler(self, datum: ReadRequest, output: NonBlockingIterator):
+            '''
+            read_handler is used to read the data from the source and send the data forward
+            for each read request we process num_records and increment the read_idx to
+            indicate that the message has been read and the same is added to the ack set
+            '''
+            if self.to_ack_set:
+                return
+
+            for x in range(datum.num_records):
+                # If there are any nacked offsets, re-deliver them
+                if self.nacked:
+                    idx = self.nacked.pop()
+                else:
+                    idx = self.read_idx
+                    self.read_idx += 1
+                headers = {"x-txn-id": str(uuid.uuid4())}
+                await output.put(
+                    Message(
+                        payload=str(self.read_idx).encode(),
+                        offset=Offset.offset_with_default_partition_id(str(idx).encode()),
+                        event_time=datetime.now(),
+                        headers=headers,
+                    )
+                )
+                self.to_ack_set.add(idx)
+
+        async def ack_handler(self, ack_request: AckRequest):
+            '''
+            The ack handler is used to acknowledge the offsets that have been read, and remove
+            them from the to_ack_set
+            '''
+            for req in ack_request.offsets:
+                offset = int(req.offset)
+
self.to_ack_set.remove(offset) + + async def nack_handler(self, ack_request: NackRequest): + ''' + Add the offsets that have been negatively acknowledged to the nacked set + ''' + for req in ack_request.offsets: + offset = int(req.offset) + self.to_ack_set.remove(offset) + self.nacked.add(offset) + + async def pending_handler(self) -> PendingResponse: + ''' + The simple source always returns zero to indicate there is no pending record. + ''' + return PendingResponse(count=0) + + async def partitions_handler(self) -> PartitionsResponse: + ''' + The simple source always returns default partitions. + ''' + return PartitionsResponse(partitions=get_default_partitions()) + + + if __name__ == "__main__": + ud_source = AsyncSource() + grpc_server = SourceAsyncServer(ud_source) + grpc_server.start() + ``` """ self.sock_path = f"unix://{sock_path}" self.max_threads = min(max_threads, MAX_NUM_THREADS) diff --git a/packages/pynumaflow/pynumaflow/sourcetransformer/_dtypes.py b/packages/pynumaflow/pynumaflow/sourcetransformer/_dtypes.py index 4d3da5ad..28591000 100644 --- a/packages/pynumaflow/pynumaflow/sourcetransformer/_dtypes.py +++ b/packages/pynumaflow/pynumaflow/sourcetransformer/_dtypes.py @@ -112,26 +112,27 @@ def items(self) -> list[Message]: class Datum: """ Class to define the important information for the event. + Args: keys: the keys of the event. value: the payload of the event. event_time: the event time of the event. watermark: the watermark of the event. headers: the headers of the event. - >>> # Example usage - >>> from pynumaflow.sourcetransformer import Datum - >>> from datetime import datetime, timezone - >>> payload = bytes("test_mock_message", encoding="utf-8") - >>> t1 = datetime.fromtimestamp(1662998400, timezone.utc) - >>> msg_headers = {"key1": "value1", "key2": "value2"} - >>> t2 = datetime.fromtimestamp(1662998460, timezone.utc) - >>> d = Datum( - ... keys=["test_key"], - ... value=payload, - ... event_time=t1, - ... watermark=t2, - ... 
headers=msg_headers, - ... ) + + Example: + ```py + from pynumaflow.sourcetransformer import Datum + from datetime import datetime, timezone + + d = Datum( + keys=["test_key"], + value=b"test_mock_message", + event_time=datetime.fromtimestamp(1662998400, timezone.utc), + watermark=datetime.fromtimestamp(1662998460, timezone.utc), + headers={"key1": "value1", "key2": "value2"}, + ) + ``` """ __slots__ = ("_keys", "_value", "_event_time", "_watermark", "_headers") diff --git a/packages/pynumaflow/pynumaflow/sourcetransformer/async_server.py b/packages/pynumaflow/pynumaflow/sourcetransformer/async_server.py index 0dc8add9..520fe281 100644 --- a/packages/pynumaflow/pynumaflow/sourcetransformer/async_server.py +++ b/packages/pynumaflow/pynumaflow/sourcetransformer/async_server.py @@ -27,6 +27,7 @@ class SourceTransformAsyncServer(NumaflowServer): Create a new grpc Source Transformer Server instance. A new servicer instance is created and attached to the server. The server instance is returned. + Args: source_transform_instance: The source transformer instance to be used for Source Transformer UDF @@ -35,21 +36,20 @@ class SourceTransformAsyncServer(NumaflowServer): max_threads: The max number of threads to be spawned; defaults to 4 and max capped at 16 - Example Invocation: + Below is a simple User Defined Function example which receives a message, applies the + following data transformation, and returns the message. + + - If the message event time is before year 2022, drop the message with event time unchanged. + - If it's within year 2022, update the tag to `within_year_2022` and update the message + event time to Jan 1st 2022. + - Otherwise, (exclusively after year 2022), update the tag to `after_year_2022` and update + the message event time to Jan 1st 2023. 
+ + ```py import datetime import logging - from pynumaflow.sourcetransformer import Messages, Message, Datum, SourceTransformServer + from pynumaflow.sourcetransformer import Messages, Message, Datum, SourceTransformAsyncServer - # This is a simple User Defined Function example which receives a message, - # applies the following - # data transformation, and returns the message. - # If the message event time is before year 2022, drop the message with event time unchanged. - # If it's within year 2022, update the tag to "within_year_2022" and - # update the message event time to Jan 1st 2022. - # Otherwise, (exclusively after year 2022), update the tag to - # "after_year_2022" and update the - # message event time to Jan 1st 2023. january_first_2022 = datetime.datetime.fromtimestamp(1640995200) january_first_2023 = datetime.datetime.fromtimestamp(1672531200) @@ -86,6 +86,7 @@ async def my_handler(keys: list[str], datum: Datum) -> Messages: if __name__ == "__main__": grpc_server = SourceTransformAsyncServer(my_handler) grpc_server.start() + ``` """ def __init__( @@ -96,17 +97,6 @@ def __init__( self, source_transform_instance: SourceTransformAsyncCallable, sock_path=SOURCE_TRANSFORMER_SOCK_PATH, max_message_size=MAX_MESSAGE_SIZE, max_threads=NUM_THREADS_DEFAULT, server_info_file=SOURCE_TRANSFORMER_SERVER_INFO_FILE_PATH, ): - """ - Create a new grpc Asynchronous Map Server instance. - A new servicer instance is created and attached to the server. - The server instance is returned.
- Args: - mapper_instance: The mapper instance to be used for Map UDF - sock_path: The UNIX socket path to be used for the server - max_message_size: The max message size in bytes the server can receive and send - max_threads: The max number of threads to be spawned; - defaults to 4 and max capped at 16 - """ self.sock_path = f"unix://{sock_path}" self.max_threads = min(max_threads, MAX_NUM_THREADS) self.max_message_size = max_message_size diff --git a/packages/pynumaflow/pynumaflow/sourcetransformer/multiproc_server.py b/packages/pynumaflow/pynumaflow/sourcetransformer/multiproc_server.py index b4dd87bc..dbc8b7b5 100644 --- a/packages/pynumaflow/pynumaflow/sourcetransformer/multiproc_server.py +++ b/packages/pynumaflow/pynumaflow/sourcetransformer/multiproc_server.py @@ -36,6 +36,7 @@ def __init__( Create a new grpc Source Transformer Multiproc Server instance. A new servicer instance is created and attached to the server. The server instance is returned. + Args: source_transform_instance: The source transformer instance to be used for Source Transformer UDF @@ -46,57 +47,67 @@ def __init__( defaults to 4 and max capped at 16 Example invocation: - import datetime - import logging - - from pynumaflow.sourcetransformer import Messages, Message, Datum, SourceTransformServer - - # This is a simple User Defined Function example which receives a message, - # applies the following - # data transformation, and returns the message. - # If the message event time is before year 2022, drop the message - # with event time unchanged. - # If it's within year 2022, update the tag to "within_year_2022" and - # update the message event time to Jan 1st 2022. 
- # Otherwise, (exclusively after year 2022), update the tag to - # "after_year_2022" and update the - - - january_first_2022 = datetime.datetime.fromtimestamp(1640995200) - january_first_2023 = datetime.datetime.fromtimestamp(1672531200) - - - def my_handler(keys: list[str], datum: Datum) -> Messages: - val = datum.value - event_time = datum.event_time - messages = Messages() - - if event_time < january_first_2022: - logging.info("Got event time:%s, it is before 2022, so dropping", event_time) - messages.append(Message.to_drop(event_time)) - elif event_time < january_first_2023: - logging.info( - "Got event time:%s, it is within year 2022, so - forwarding to within_year_2022", - event_time, - ) - messages.append( - Message(value=val, event_time=january_first_2022, tags=["within_year_2022"]) - ) - else: - logging.info( - "Got event time:%s, it is after year 2022, so forwarding to - after_year_2022", event_time - ) - messages.append(Message(value=val, event_time=january_first_2023, - tags=["after_year_2022"])) - - return messages - - if __name__ == "__main__": - grpc_server = SourceTransformMultiProcServer(source_transform_instance=my_handler - ,server_count = 2) - grpc_server.start() + ```py + import datetime + import logging + + from pynumaflow.sourcetransformer import Messages, Message, Datum, SourceTransformMultiProcServer + + # This is a simple User Defined Function example which receives a message, + # applies the following data transformation, and returns the message. + # If the message event time is before year 2022, drop the message + # with event time unchanged. + # If it's within year 2022, update the tag to "within_year_2022" and + # update the message event time to Jan 1st 2022.
+ # Otherwise, (exclusively after year 2022), update the tag to + # "after_year_2022" and update the + # message event time to Jan 1st 2023. + + + january_first_2022 = datetime.datetime.fromtimestamp(1640995200) + january_first_2023 = datetime.datetime.fromtimestamp(1672531200) + + + def my_handler(keys: list[str], datum: Datum) -> Messages: + val = datum.value + event_time = datum.event_time + messages = Messages() + + if event_time < january_first_2022: + logging.info("Got event time:%s, it is before 2022, so dropping", event_time) + messages.append(Message.to_drop(event_time)) + elif event_time < january_first_2023: + logging.info( + "Got event time:%s, it is within year 2022, so forwarding to within_year_2022", + event_time, + ) + message = Message( + value=val, + event_time=january_first_2022, + tags=["within_year_2022"], + ) + messages.append(message) + else: + logging.info( + "Got event time:%s, it is after year 2022, so forwarding to after_year_2022", + event_time, + ) + message = Message( + value=val, + event_time=january_first_2023, + tags=["after_year_2022"], + ) + messages.append(message) + + return messages + + if __name__ == "__main__": + grpc_server = SourceTransformMultiProcServer( + source_transform_instance=my_handler, + server_count=2, + ) + grpc_server.start() + ``` """ self.sock_path = f"unix://{sock_path}" self.max_threads = min(max_threads, MAX_NUM_THREADS) @@ -120,9 +131,8 @@ def my_handler(keys: list[str], datum: Datum) -> Messages: def start(self): """ - Starts the N grpc servers gRPC serves on the with - given max threads. - where N = The number of CPUs or the value of the parameter server_count + Starts the N gRPC servers on the given socket path with given max threads. + Here N = The number of CPUs or the value of the parameter `server_count` defined by the user. The max value is capped to 2 * CPU count.
""" diff --git a/packages/pynumaflow/pynumaflow/sourcetransformer/server.py b/packages/pynumaflow/pynumaflow/sourcetransformer/server.py index 58735a27..7069e2b6 100644 --- a/packages/pynumaflow/pynumaflow/sourcetransformer/server.py +++ b/packages/pynumaflow/pynumaflow/sourcetransformer/server.py @@ -31,6 +31,7 @@ def __init__( Create a new grpc Source Transformer Server instance. A new servicer instance is created and attached to the server. The server instance is returned. + Args: source_transform_instance: The source transformer instance to be used for Source Transformer UDF @@ -39,21 +40,21 @@ def __init__( max_threads: The max number of threads to be spawned; defaults to 4 and max capped at 16 - Example Invocation: + Below is a simple User Defined Function example which receives a message, applies the + following data transformation, and returns the message. + + - If the message event time is before year 2022, drop the message with event time unchanged. + - If it's within year 2022, update the tag to `within_year_2022` and update the message + event time to Jan 1st 2022. + - Otherwise, (exclusively after year 2022), update the tag to `after_year_2022` and update + the message event time to Jan 1st 2023. + ```py import datetime import logging from pynumaflow.sourcetransformer import Messages, Message, Datum, SourceTransformServer - # This is a simple User Defined Function example which receives a message, - # applies the following - # data transformation, and returns the message. - # If the message event time is before year 2022, drop the message with event time unchanged. - # If it's within year 2022, update the tag to "within_year_2022" and - # update the message event time to Jan 1st 2022. - # Otherwise, (exclusively after year 2022), update the tag to - # "after_year_2022" and update the - # message event time to Jan 1st 2023. 
+ january_first_2022 = datetime.datetime.fromtimestamp(1640995200) january_first_2023 = datetime.datetime.fromtimestamp(1672531200) @@ -90,6 +91,7 @@ def my_handler(keys: list[str], datum: Datum) -> Messages: if __name__ == "__main__": grpc_server = SourceTransformServer(my_handler) grpc_server.start() + ``` """ self.sock_path = f"unix://{sock_path}" self.max_threads = min(max_threads, MAX_NUM_THREADS) diff --git a/packages/pynumaflow/pyproject.toml b/packages/pynumaflow/pyproject.toml index 0adffa2d..90a63bd7 100644 --- a/packages/pynumaflow/pyproject.toml +++ b/packages/pynumaflow/pyproject.toml @@ -48,6 +48,16 @@ mypy = "^1.18.2" grpc-stubs = "^1.53.0.6" types-psutil = "^7.0.0.20251001" +[tool.poetry.group.docs] +optional = true + +[tool.poetry.group.docs.dependencies] +mkdocs = "^1.6" +mkdocs-material = {version = "^9.5", extras = ["imaging"]} +mkdocs-exclude = "^1.0" +mkdocstrings = {version = "^0.27", extras = ["python"]} +mike = "^2.1" + [build-system] requires = ["poetry-core>=1.0.0"] build-backend = "poetry.core.masonry.api"
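Note for reviewers: the new `docs-*` Makefile targets and the optional `docs` dependency group assume an `mkdocs.yml` at the package root, which is not included in this diff. A minimal sketch of a config compatible with these targets might look like the following — the site name, docstring style, and plugin options are assumptions for illustration, not taken from this change:

```yaml
# Hypothetical mkdocs.yml sketch -- not part of this diff.
site_name: pynumaflow
theme:
  name: material  # provided by mkdocs-material

plugins:
  - search
  - mkdocstrings:  # renders the Python docstrings edited in this PR
      handlers:
        python:
          options:
            docstring_style: google  # assumption; matches the Args:/Example: layout used here

extra:
  version:
    provider: mike  # enables the version selector that `make docs-deploy-version` relies on
```

With a config along these lines, `make docs-serve` would render the docstring examples above as API docs, and the `mike deploy -b docs-site …` targets would publish each version to the `docs-site` branch.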