The Python Workers platform leverages FFI to access bindings to Cloudflare resources. Refer to the [bindings](/workers/languages/python/ffi/#using-bindings-from-python-workers) documentation for more information.

From the configuration perspective, enabling Python Workflows requires adding the `python_workflows` compatibility flag to your `wrangler.toml` file.

<WranglerConfig>
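```toml
# Sketch: the exact snippet is not shown in this diff. Python Workflows
# requires the python_workflows flag, alongside python_workers (see the
# beta note in index.mdx below).
compatibility_flags = ["python_workers", "python_workflows"]
```

</WranglerConfig>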

This is how you use the payload passed to a workflow instance in your workflow:

```python
from pyodide.ffi import to_js
from workers import WorkflowEntrypoint

class DemoWorkflowClass(WorkflowEntrypoint):
    async def run(self, event, step):
        @step.do('step-name')
        async def first_step():
            payload = event["payload"]
            return payload

        # the decorator only defines the step; invoke it to run it
        await first_step()
```

## Workflow

The `Workflow` binding gives you access to the [Workflow](/workflows/build/workers-api/#workflow) class. All its methods are available on the binding.

Under the hood, the `Workflow` binding is a Javascript object that is exposed to the Python script via [JsProxy](https://pyodide.org/en/stable/usage/api/python-api/ffi.html#pyodide.ffi.JsProxy). This means that the values returned by its methods are also `JsProxy` objects, and need to be converted back into Python objects using `python_from_rpc`.

### `create`

Create (trigger) a new instance of a given Workflow.

* <code>create(options=None)</code>
  * `options` - an **optional** dictionary of options to pass to the workflow instance. Should contain the same keys as the [WorkflowInstanceCreateOptions](/workflows/build/workers-api/#workflowinstancecreateoptions) type.

```python
from pyodide.ffi import to_js

async def on_fetch(request, env, ctx):
    event = {"foo": "bar"}
    # to_js here is required because the binding goes through FFI;
    # it is not something we can wrap or override on the runtime
    instance = await env.MY_WORKFLOW.create(to_js({"params": event}))
```

:::note

Values returned from steps need to be converted into Javascript objects using `to_js`.

:::

The `create` method returns a [`WorkflowInstance`](/workflows/build/workers-api/#workflowinstance) object, which can be used to query the status of the workflow instance. Note that this is a Javascript object, and not a Python object.
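
Not from the source, but as a brief sketch: you can hold on to the returned instance and query it with the `WorkflowInstance` methods linked above, such as `status()`:

```python
instance = await env.MY_WORKFLOW.create()
# the returned instance is a JsProxy wrapping a WorkflowInstance;
# convert results back into Python objects if you need them natively
status = await instance.status()
```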

### `create_batch`

Create (trigger) a batch of new workflow instances, up to 100 instances at a time. This is useful if you need to create multiple instances at once within the [instance creation limit](/workflows/reference/limits/).

* <code>create_batch(batch)</code>
  * `batch` - a list of `WorkflowInstanceCreateOptions` to pass when creating each instance, including a user-provided ID and payload parameters.

Each element of the `batch` list is expected to include both `id` and `params` properties:

```python
from pyodide.ffi import to_js

# Create a new batch of 3 Workflow instances, each with its own ID and pass params to the Workflow instances
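# (The rest of this example is elided in the source; the list below is a
# sketch that follows the id/params shape described above.)
instances = [
    to_js({"id": "id-abc123", "params": {"hello": "world-0"}}),
    to_js({"id": "id-def456", "params": {"hello": "world-1"}}),
    to_js({"id": "id-ghi789", "params": {"hello": "world-2"}}),
]
await env.MY_WORKFLOW.create_batch(instances)
```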

**src/content/docs/workflows/python/dag.mdx** (+5 −3)

The Python Workflows SDK supports DAG workflows in a declarative way, using the `step.do` decorator with the `depends` parameter to define dependencies (other steps that must complete before this step can run).

```python
from workers import WorkflowEntrypoint

class MyWorkflow(WorkflowEntrypoint):
    async def run(self, event, step):
        @step.do("dependency a")
        async def step_a():
            # do some work
            return 10

        # The middle of this example is elided in the source; step_b and
        # my_final_step below are a sketch based on the prose that follows.
        @step.do("dependency b")
        async def step_b():
            # do some work
            return 20

        @step.do("my final step", depends=[step_a, step_b], concurrent=True)
        async def my_final_step(result_a, result_b):
            # runs only after step_a and step_b have completed
            return result_a + result_b

        await my_final_step()
```

In this example, `step_a` and `step_b` are run concurrently before execution of `my_final_step`, which depends on both of them.

Having `concurrent=True` allows the dependencies to be resolved concurrently. If one of the callables passed to `depends` has already completed, it will be skipped and its return value will be reused.

This pattern is useful for diamond-shaped workflows, where a step depends on two or more other steps that can run concurrently.

**src/content/docs/workflows/python/index.mdx** (+4 −4)

---
title: Python Workflows SDK
pcx_content_type: navigation
sidebar:
  order: 5
---

Refer to [Python Workers](/workers/languages/python) for more information about the underlying platform.

:::caution[Python Workflows are in beta, as well as the underlying platform.]

You must add both the `python_workflows` and `python_workers` compatibility flags to your `wrangler.toml` file.

Join the #python-workflows channel in the [Cloudflare Developers Discord](https://discord.cloudflare.com/) and let us know what you'd like to see next.

:::

## Get Started

The main entrypoint for a Python workflow is the [`WorkflowEntrypoint`](/workflows/build/workers-api/#workflowentrypoint) class. Your workflow logic should exist inside the [`run`](/workflows/build/workers-api/#run) handler.

This guide covers the Python Workflows SDK, with instructions on how to build and create workflows using Python.

## WorkflowEntrypoint

The `WorkflowEntrypoint` class is the main entrypoint for a Python workflow. Your workflow is a class that extends `WorkflowEntrypoint` and implements the `run` method:

```python
from workers import WorkflowEntrypoint

class MyWorkflow(WorkflowEntrypoint):
    async def run(self, event, step):
        # steps here
        ...
```

## WorkflowStep

* <code>step.do(name, depends=[], concurrent=False, config=None)</code> is a decorator that allows you to define a step in a workflow.
  * `name` - the name of the step.
  * `depends` - an optional list of steps that must complete before this step can run. See [DAG Workflows](/workflows/python/dag).
  * `concurrent` - an optional boolean that indicates whether this step can run concurrently with other steps.
  * `config` - an optional [`WorkflowStepConfig`](/workflows/build/workers-api/#workflowstepconfig) for configuring [step-specific retry behaviour](/workflows/build/sleeping-and-retrying/). This is passed as a Python dictionary and then type-translated into a `WorkflowStepConfig` object.

```python
from workers import WorkflowEntrypoint

class MyWorkflow(WorkflowEntrypoint):
    async def run(self, event, step):
        @step.do("my first step")
        async def my_first_step():
            # do some work
            return "Hello World!"

        await my_first_step()
```

Note that the decorator doesn't make the call to the step; it just returns a callable that can be used to invoke the step. You have to call the callable to make the step run.

When returning state from a step, you must make sure that the returned value is serializable. Since steps run through an FFI layer, the returned value gets type-translated via [FFI](https://pyodide.org/en/stable/usage/api/python-api/ffi.html#pyodide.ffi.to_js). Refer to [Pyodide's documentation](https://pyodide.org/en/stable/usage/type-conversions.html#type-translations-pyproxy-to-js) regarding type conversions for more information.

* <code>step.sleep(name, duration)</code>
  * `name` - the name of the step.
  * `duration` - the duration to sleep until, in either seconds or as a `WorkflowDuration`-compatible string.

```python
async def run(self, event, step):
    await step.sleep("my-sleep-step", "10 seconds")
```

* <code>step.sleep_until(name, timestamp)</code>
  * `name` - the name of the step.
  * `timestamp` - a `datetime.datetime` object or seconds from the Unix epoch to sleep the workflow instance until.
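
For illustration, a sketch that mirrors the `step.sleep` example above; the specific timestamp used here is an assumption:

```python
from datetime import datetime, timedelta

async def run(self, event, step):
    # sleep this instance until one hour from now
    await step.sleep_until("my-sleep-until-step", datetime.now() + timedelta(hours=1))
```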

The `event` parameter is a dictionary that contains the payload passed to the workflow instance, along with other metadata:

* <code>payload</code> - the payload passed to the workflow instance.
* <code>timestamp</code> - the timestamp that the workflow was triggered.
* <code>instanceId</code> - the ID of the current workflow instance.
* <code>workflowName</code> - the name of the workflow.
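
A short sketch of reading these fields inside `run`, using the key names listed above:

```python
from workers import WorkflowEntrypoint

class MyWorkflow(WorkflowEntrypoint):
    async def run(self, event, step):
        payload = event["payload"]          # user-provided parameters
        instance_id = event["instanceId"]   # ID of this workflow instance
```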

## Error Handling

Workflows semantics allow users to catch exceptions that get thrown to the top level.

Catching specific exceptions within an `except` block may not work, as some Python errors will not be re-instantiated into the same type of error when they are passed through the RPC layer.

:::note

Some built-in Python errors (e.g. `ValueError`, `TypeError`) will work correctly. User-defined exceptions, as well as other built-in Python errors, will not, and should be caught with a generic `except Exception` clause instead.

:::
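
A sketch of what this behaviour means in practice; the step body and error types here are illustrative assumptions:

```python
from workers import WorkflowEntrypoint

class MyWorkflow(WorkflowEntrypoint):
    async def run(self, event, step):
        @step.do("flaky step")
        async def flaky_step():
            raise ValueError("bad input")  # built-in errors like this round-trip correctly

        try:
            await flaky_step()
        except ValueError:
            # reached for errors that are re-instantiated with their original type
            pass
        except Exception:
            # user-defined errors may arrive as a generic exception instead
            pass
```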
The Python Workflows SDK provides a `NonRetryableError` class that can be used to signal that a step should not be retried.
```python
from workers.workflows import NonRetryableError

raise NonRetryableError(message)
```

## Configure a workflow instance

You can bind a step to a specific retry policy by passing a `WorkflowStepConfig` object to the `config` parameter of the `step.do` decorator. With Python Workflows, you need to make sure that your `dict` respects the [`WorkflowStepConfig`](/workflows/build/workers-api/#workflowstepconfig) type.
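
A sketch of a step-level retry policy; the key names are taken from the `WorkflowStepConfig` type linked above:

```python
@step.do(
    "my configured step",
    config={
        "retries": {"limit": 3, "delay": "10 seconds", "backoff": "exponential"},
        "timeout": "5 minutes",
    },
)
async def my_configured_step():
    # retried at most 3 times, with exponential backoff between attempts
    ...
```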

Note that `env` is a Javascript object exposed to the Python script via [JsProxy](https://pyodide.org/en/stable/usage/api/python-api/ffi.html#pyodide.ffi.JsProxy). You can access the binding like you would on a Javascript worker. Refer to the [Workflow binding documentation](/workflows/build/workers-api/#workflow) to learn more about the methods available.

Let's consider the previous binding called `MY_WORKFLOW`. Here's how you would create a new instance:
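
A minimal sketch, assuming a fetch handler and the `create`/`to_js` pattern shown earlier (the `instance.id` property is an assumption from the Workers API docs):

```python
from pyodide.ffi import to_js
from workers import Response

async def on_fetch(request, env, ctx):
    # MY_WORKFLOW is the Workflow binding configured earlier
    instance = await env.MY_WORKFLOW.create(to_js({"params": {"foo": "bar"}}))
    return Response.json({"status": "success", "id": instance.id})
```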