Andystaples/add functions support #75
base: main
Conversation
Pull request overview
This draft PR introduces Azure Functions support for the durabletask-python library by creating a new durabletask-azurefunctions package. This allows developers to use Durable Task patterns within Azure Functions using Python decorators and bindings that integrate with the Azure Functions worker.
Key Changes:
- Added a new `durabletask-azurefunctions` package with decorators, client, and worker implementations for Azure Functions integration
- Modified core `durabletask/worker.py` to support a new `ProtoTaskHubSidecarServiceStub` type alongside the existing gRPC stub
- Introduced a base `ProtoTaskHubSidecarServiceStub` class that can be extended for different communication patterns
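
To make the intended developer experience concrete, here is a hypothetical usage sketch. The module path, `DFApp` class, and decorator names are assumptions inferred from the file listing below and from the existing Durable Functions Python SDK; the actual surface in this PR may differ.

```python
# Hypothetical usage sketch -- names are assumptions, not taken verbatim from the PR.
from durabletask.azurefunctions.decorators.durable_app import DFApp

app = DFApp()


@app.orchestration_trigger(context_name="context")
def hello_orchestrator(context):
    # Orchestrator: schedule a single activity and return its result.
    result = yield context.call_activity("say_hello", input="world")
    return result


@app.activity_trigger(input_name="name")
def say_hello(name: str) -> str:
    # Activity: a plain Python function invoked by the orchestrator.
    return f"Hello, {name}!"
```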
Reviewed changes
Copilot reviewed 12 out of 15 changed files in this pull request and generated 22 comments.
| File | Description |
|---|---|
| `durabletask/worker.py` | Added import for new stub type, updated type hints to accept Union of stub types, added handling for orchestratorCompleted event |
| `durabletask/internal/ProtoTaskHubSidecarServiceStub.py` | New base stub class defining the protocol interface with callable attributes for all Task Hub operations |
| `durabletask-azurefunctions/pyproject.toml` | Package configuration for the new Azure Functions integration package with dependencies |
| `durabletask-azurefunctions/durabletask/azurefunctions/worker.py` | Worker implementation that extends TaskHubGrpcWorker without async worker loop for Functions execution model |
| `durabletask-azurefunctions/durabletask/azurefunctions/client.py` | Client implementation for Azure Functions that parses connection info from JSON and uses custom interceptors |
| `durabletask-azurefunctions/durabletask/azurefunctions/internal/azurefunctions_null_stub.py` | Null stub implementation that provides no-op lambdas for all stub operations |
| `durabletask-azurefunctions/durabletask/azurefunctions/internal/azurefunctions_grpc_interceptor.py` | Custom gRPC interceptor that adds Azure Functions-specific headers |
| `durabletask-azurefunctions/durabletask/azurefunctions/decorators/metadata.py` | Trigger and binding metadata classes for orchestration, activity, entity, and client bindings |
| `durabletask-azurefunctions/durabletask/azurefunctions/decorators/durable_app.py` | Blueprint and DFApp classes providing decorators for registering Functions with Durable Task patterns |
| `durabletask-azurefunctions/durabletask/azurefunctions/decorators/__init__.py` | Package exports for decorator module |
| `durabletask-azurefunctions/durabletask/azurefunctions/constants.py` | Constants for trigger and binding type names |
| `durabletask-azurefunctions/CHANGELOG.md` | Initial changelog for the new package |
creationUrls: dict[str, str]
managementUrls: dict[str, str]
Copilot (AI) · Nov 21, 2025
[nitpick] Potential compatibility issue with type hint syntax. The use of dict[str, str] (PEP 585 style) requires Python 3.9+. While pyproject.toml specifies requires-python = ">=3.9", consider whether this is the intended minimum version or if Dict[str, str] from typing should be used for broader compatibility.
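
For reference, a minimal illustration of the two annotation spellings being compared (the variable names here are placeholders, not the PR's):

```python
from typing import Dict

# PEP 585 built-in generics -- only valid on Python 3.9+
creation_urls: dict[str, str] = {}

# typing.Dict spelling -- also accepted by older interpreters
management_urls: Dict[str, str] = {}
```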
if response is None:
    raise Exception("Orchestrator execution did not produce a response.")
# The Python worker returns the input as type "json", so double-encoding is necessary
return '"' + base64.b64encode(response.SerializeToString()).decode('utf-8') + '"'
Victoria - Currently, the return value from here is passed on to the host as type "json" so the host attempts to Newtonsoft deserialize it back into an object before handing back to the Durable middleware for final decoding. This breaks, unless I double-encode with quotes as above. Is there a way to communicate to the worker that this is a plain string instead?
Investigating this - will need a little more time to test on my end
This is actually coming from the OrchestrationTriggerConverter done here. The type is hard-coded to json. Changing the type to string works in this scenario, but I'm unsure if there are other cases where json was intended
    @classmethod
    def encode(cls, obj: typing.Any, *, expected_type: typing.Optional[type]) -> meta.Datum:
        # Durable function context should be a json
        return meta.Datum(type='json', value=obj)
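
A minimal sketch of the alternative discussed here, reusing the `encode` hook quoted above (and the `azure.functions.meta` module it relies on) but emitting a plain string; whether `'string'` is safe for every caller of this converter is exactly the open question.

```python
import typing

from azure.functions import meta


class StringOrchestrationTriggerConverterSketch:
    # Illustrative only: mirrors the quoted encode(), but returns the payload
    # as a plain string so the host forwards it without attempting to
    # Newtonsoft-deserialize it first.
    @classmethod
    def encode(cls, obj: typing.Any, *, expected_type: typing.Optional[type]) -> meta.Datum:
        return meta.Datum(type='string', value=obj)
```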
# Obtain user-code and force type annotation on the client-binding parameter to be `str`.
# This ensures a passing type-check of that specific parameter,
# circumventing a limitation of the worker in type-checking rich DF Client objects.
# TODO: Once rich-binding type checking is possible, remove the annotation change.
user_code = fb._function._func
user_code.__annotations__[parameter_name] = str
Victoria - this is the same approach taken by the existing Durable Python SDK for the DurableClient binding - we force the annotation to be "str" so the worker takes a path that does not attempt to use the DurableClientConverter input parameter converter, which would throw NotImplementedError
Do you think it is worth moving the client_constructor logic in this PR into the DurableClientConverter in the -library, so that we don't have to do this type-hacking stuff?
We'd have to figure out how to detect which underlying provider for the durable_client_input binding is being used to know when to simply return the string for the old SDK vs parse it in the new
The main issue would be that we'd have something different to return based on the durable library.
Are the types going to be the same? (eg DurableClient for both packages) We could look at creating two separate converters - right now it's using the Generic converter, but it would be better to have our own
requires-python = ">=3.9"
license = {file = "LICENSE"}
readme = "README.md"
dependencies = [
    "durabletask>=0.5.0",
    "azure-identity>=1.19.0",
    "azure-functions>=1.11.0"
]
TODO: Update python min version and rev durabletask dependency to 1.0.1/1.1.0
Also rev durabletask versions to the same based on size of changes needed
[project]
name = "durabletask.azurefunctions"
version = "0.1.0"
Victoria - what versioning strategy would you propose if the first version that goes to PyPI would be used for internal testing only?
dev versions work well - eg 1.0.0dev0 or 0.0.1dev0
# TODO: Is there a better way to support retrieving the unwrapped user code?
df_client_middleware.client_function = fb._function._func  # type: ignore
Victoria - not sure if you remember this context from a while back, but this is also carryover from the previous SDK - I added this line to make retrieving the "unwrapped" user code possible for the unit testing scenario - see
https://learn.microsoft.com/en-us/azure/azure-functions/durable/durable-functions-unit-testing-python#unit-testing-trigger-functions
If possible, I'd like to see a "better" solution for the new SDK. Hate to re-open a can of worms here, though
I vaguely remember context, but we can sync again over specific requirements.
stub = AzureFunctionsNullStub()
worker = DurableFunctionsWorker()
response: Optional[OrchestratorResponse] = None

def stub_complete(stub_response):
    nonlocal response
    response = stub_response
stub.CompleteOrchestratorTask = stub_complete
All of this is probably optimizable - do we really need to create a new stub and worker for each call? Can they be saved? Will look into this more at some point
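
A minimal sketch of the reuse idea raised above, assuming the `AzureFunctionsNullStub` and `DurableFunctionsWorker` types shown in this PR (the import paths are inferred from the PR's file layout and may differ); whether a shared instance is safe for concurrent invocations, given the per-call `CompleteOrchestratorTask` swap, would still need review.

```python
# Hypothetical module-level cache; import paths are inferred from the PR's file layout.
from durabletask.azurefunctions.internal.azurefunctions_null_stub import AzureFunctionsNullStub
from durabletask.azurefunctions.worker import DurableFunctionsWorker

_cached_stub = None
_cached_worker = None


def _get_stub_and_worker():
    # Create the stub and worker once and reuse them on later invocations
    # instead of constructing fresh instances for every call.
    global _cached_stub, _cached_worker
    if _cached_worker is None:
        _cached_stub = AzureFunctionsNullStub()
        _cached_worker = DurableFunctionsWorker()
    return _cached_stub, _cached_worker
```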
Can you add an overview / more detail to the PR description?
- Still needs eventSent and eventReceived implementations
- Add new_uuid method to OrchestrationContext for deterministic, replay-safe UUIDs (a sketch of one possible approach follows this list)
- Fix entity locking behavior for Functions
- Align _RuntimeOrchestrationContext param names with OrchestrationContext
- Remap __init__.py files for new module
- Update version to 0.0.1dev0
- Add docstrings to missing methods
- Move code for executing orchestrators/entities to DurableFunctionsWorker
- Add function metadata to triggers for detection by extension
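
A minimal sketch of one way a deterministic, replay-safe `new_uuid` could behave; the class and method shape below are hypothetical and not taken from this PR's diff. The idea is to derive name-based UUIDs from the orchestration instance ID plus an incrementing counter, so replays regenerate the same sequence.

```python
import uuid


class _DeterministicUuidSource:
    """Illustrative only: regenerates the same UUID sequence on every replay."""

    def __init__(self, instance_id: str):
        self._instance_id = instance_id
        self._counter = 0

    def new_uuid(self) -> uuid.UUID:
        # uuid5 is a pure function of its inputs, so the same instance ID and
        # counter always yield the same UUID during orchestration replay.
        name = f"{self._instance_id}:{self._counter}"
        self._counter += 1
        return uuid.uuid5(uuid.NAMESPACE_URL, name)
```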
runs-on: ubuntu-latest
steps:
  - uses: actions/checkout@v4
  - name: Set up Python 3.14
    uses: actions/setup-python@v5
    with:
      python-version: 3.14
  - name: Install dependencies
    working-directory: durabletask-azurefunctions
    run: |
      python -m pip install --upgrade pip
      pip install setuptools wheel tox
      pip install flake8
  - name: Run flake8 Linter
    working-directory: durabletask-azurefunctions
    run: flake8 .
  - name: Run flake8 Linter
    working-directory: tests/durabletask-azurefunctions
    run: flake8 .

run-docker-tests:
Check warning — Code scanning / CodeQL: "Workflow does not contain permissions" (Medium)
Copilot Autofix (AI) · about 23 hours ago
To address the problem, add an explicit permissions: block to the workflow to restrict the default permissions of the GITHUB_TOKEN.
The best and most secure method without altering existing workflow functionality is to add a top-level permissions: block with contents: read (sufficient for code checkout and most workflows), unless specific jobs require more, in which case you can override at the job level.
The changes should be made near the top of .github/workflows/durabletask-azurefunctions.yml, after the name: declaration and before on: (or immediately after, before the jobs section). No additional methods, external dependencies, or imports are needed.
@@ -1,4 +1,6 @@
 name: Durable Task Scheduler SDK (durabletask-azurefunctions)
+permissions:
+  contents: read

 on:
   push:
strategy:
  fail-fast: false
  matrix:
    python-version: ["3.10", "3.11", "3.12", "3.13", "3.14"]
env:
  EMULATOR_VERSION: "latest"
needs: lint
runs-on: ubuntu-latest
steps:
  - name: Checkout repository
    uses: actions/checkout@v4

  - name: Pull Docker image
    run: docker pull mcr.microsoft.com/dts/dts-emulator:$EMULATOR_VERSION

  - name: Run Docker container
    run: |
      docker run --name dtsemulator -d -p 8080:8080 mcr.microsoft.com/dts/dts-emulator:$EMULATOR_VERSION

  - name: Wait for container to be ready
    run: sleep 10  # Adjust if your service needs more time to start

  - name: Set environment variables
    run: |
      echo "TASKHUB=default" >> $GITHUB_ENV
      echo "ENDPOINT=http://localhost:8080" >> $GITHUB_ENV

  - name: Install durabletask dependencies
    run: |
      python -m pip install --upgrade pip
      pip install flake8 pytest
      pip install -r requirements.txt

  - name: Install durabletask-azurefunctions dependencies
    working-directory: examples
    run: |
      python -m pip install --upgrade pip
      pip install -r requirements.txt

  - name: Install durabletask-azurefunctions locally
    working-directory: durabletask-azurefunctions
    run: |
      pip install . --no-deps --force-reinstall

  - name: Install durabletask locally
    run: |
      pip install . --no-deps --force-reinstall

  - name: Run the tests
    working-directory: tests/durabletask-azurefunctions
    run: |
      pytest -m "dts" --verbose

publish:
Check warning — Code scanning / CodeQL: "Workflow does not contain permissions" (Medium)
Copilot Autofix (AI) · about 23 hours ago
To fix the problem, add an explicit permissions: block at the root of the workflow file (.github/workflows/durabletask-azurefunctions.yml). This will ensure all jobs use only the permissions required, and the default GITHUB_TOKEN permissions are restricted to the least privilege needed. For most workflows, setting contents: read is sufficient unless a job requires additional permissions. In this workflow, the publish job handles package publishing to PyPI using a secret and does not require contents: write unless it also pushes tags or modifies repository content (which it does not, based on the steps shown). Thus, the minimal safe change is to add a root-level permissions: contents: read entry near the top of the YAML file (typically right after name: and before on:).
@@ -1,4 +1,6 @@
 name: Durable Task Scheduler SDK (durabletask-azurefunctions)
+permissions:
+  contents: read

 on:
   push:
if: startsWith(github.ref, 'refs/tags/azurefunctions-v')  # Only run if a matching tag is pushed
needs: run-docker-tests
runs-on: ubuntu-latest
steps:
  - name: Checkout code
    uses: actions/checkout@v4

  - name: Extract version from tag
    run: echo "VERSION=${GITHUB_REF#refs/tags/azurefunctions-v}" >> $GITHUB_ENV  # Extract version from the tag

  - name: Set up Python
    uses: actions/setup-python@v5
    with:
      python-version: "3.14"  # Adjust Python version as needed

  - name: Install dependencies
    run: |
      python -m pip install --upgrade pip
      pip install build twine

  - name: Build package from directory durabletask-azurefunctions
    working-directory: durabletask-azurefunctions
    run: |
      python -m build

  - name: Check package
    working-directory: durabletask-azurefunctions
    run: |
      twine check dist/*

  - name: Publish package to PyPI
    env:
      TWINE_USERNAME: __token__
      TWINE_PASSWORD: ${{ secrets.PYPI_API_TOKEN_AZUREFUNCTIONS }}  # Store your PyPI API token in GitHub Secrets
    working-directory: durabletask-azurefunctions
    run: |
      twine upload dist/*
Check warning — Code scanning / CodeQL: "Workflow does not contain permissions" (Medium)
Copilot Autofix (AI) · about 23 hours ago
To fix this issue, you should explicitly define a minimal permissions block in your workflow. Ideally, the workflow should declare permissions at the root level so all jobs default to least privilege unless they override this setting. The recommended setting for most workflows, particularly for CI and publishing to package registries, is contents: read, which allows jobs to read repository contents (e.g., for checkout), but not write or modify anything in the repository. Therefore, you should add:
permissions:
  contents: read

to the root of .github/workflows/durabletask-azurefunctions.yml, e.g., directly below the name: and before the on: block, or after the on: block but before jobs:. This will apply to all jobs in the workflow. No additional imports, definitions, methods, or changes elsewhere are needed.
@@ -1,5 +1,8 @@
 name: Durable Task Scheduler SDK (durabletask-azurefunctions)

+permissions:
+  contents: read
+
 on:
   push:
     branches:
DRAFT PR - for collaboration.
Requires, at minimum, the changes in Azure/azure-functions-durable-extension#3260 to the durable WebJobs extension, allowing the gRPC protocol for Python.