`dsl-reference.md`: 18 additions & 14 deletions
@@ -1589,17 +1589,17 @@ Represents the definition of the parameters that control the randomness or varia

### Input

- Documents the structure - and optionally configures the filtering of - workflow/task input data.
+ Documents the structure - and optionally configures the transformation of - workflow/task input data.

It's crucial for authors to document the schema of input data whenever feasible. This documentation empowers consuming applications to provide contextual auto-suggestions when handling runtime expressions.

- When set, runtimes must validate input data against the defined schema, unless defined otherwise.
+ When set, runtimes must validate raw input data against the defined schema before applying transformations, unless defined otherwise.

#### Properties

| Property | Type | Required | Description |
|----------|:----:|:--------:|-------------|
- | schema | [`schema`](#schema) | `no` | The [`schema`](#schema) used to describe and validate input data.<br>*Even though the schema is not required, it is strongly encouraged to document it, whenever feasible.* |
+ | schema | [`schema`](#schema) | `no` | The [`schema`](#schema) used to describe and validate raw input data.<br>*Even though the schema is not required, it is strongly encouraged to document it, whenever feasible.* |
| from | `string`<br>`object` | `no` | A [runtime expression](dsl.md#runtime-expressions), if any, used to filter and/or mutate the workflow/task input. |

#### Examples
@@ -1610,9 +1610,16 @@ schema:

  document:
    type: object
    properties:
-      petId:
-        type: string
-    required: [ petId ]
+      order:
+        type: object
+        required: [ pet ]
+        properties:
+          pet:
+            type: object
+            required: [ id ]
+            properties:
+              id:
+                type: string
from: .order.pet
```
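To make the effect of the new example concrete, here is a minimal sketch of what `from: .order.pet` does, assuming jq-style expression semantics and a made-up payload:

```yaml
# Hypothetical raw input, valid against the schema above:
order:
  pet:
    id: "po-123456"

# Result of applying `from: .order.pet` (the transformed input handed on for execution):
# id: "po-123456"
```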
@@ -1622,7 +1629,7 @@ Documents the structure - and optionally configures the transformations of - wor

It's crucial for authors to document the schema of output data whenever feasible. This documentation empowers consuming applications to provide contextual auto-suggestions when handling runtime expressions.

- When set, runtimes must validate output data against the defined schema, unless defined otherwise.
+ When set, runtimes must validate output data against the defined schema after applying transformations, unless defined otherwise.

#### Properties
@@ -1645,16 +1652,13 @@ output:

      required: [ petId ]
  as:
    petId: '${ .pet.id }'
- export:
-   as:
-     '.petList += [ $task.output ]'
```

### Export

- Certain tasks need to set the workflow context to save the task output for later usage. Users set the content of the context through a runtime expression. The result of the expression is the new value of the context. The expression is evaluated against the existing context.
+ Certain tasks need to set the workflow context to save the task output for later usage. Users set the content of the context through a runtime expression. The result of the expression is the new value of the context. The expression is evaluated against the transformed task output.

- Optionally, the context might have an associated schema.
+ Optionally, the context might have an associated schema, which is validated against the result of the expression.

#### Properties

@@ -1668,13 +1672,13 @@ Optionally, the context might have an associated schema.
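For reference, a minimal sketch of an `export` definition under the revised semantics described above. The field names are illustrative only: `as` is evaluated against the transformed task output, and the optional `schema` is validated against the resulting context.

```yaml
export:
  # Evaluated against the transformed task output; the result becomes the new $context.
  # `.id` is a hypothetical field of that output.
  as: '${ { petId: .id } }'
  # Optional schema the exported (new) context is validated against.
  schema:
    document:
      type: object
      required: [ petId ]
      properties:
        petId:
          type: string
```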
`dsl.md`: 57 additions & 21 deletions
@@ -181,98 +181,134 @@ Once the task has been executed, different things can happen:

### Data Flow

In Serverless Workflow DSL, data flow management is crucial to ensure that the right data is passed between tasks and to the workflow itself.

Here's how data flows through a workflow based on various transformation stages:

- 1. **Transform Workflow Input**
+ 1. **Validate Workflow Input**
+    Before the workflow starts, the input data provided to the workflow can be validated against the `input.schema` property to ensure it conforms to the expected structure.
+    The execution only proceeds if the input is valid. Otherwise, it will fault with a [ValidationError (https://serverlessworkflow.io/spec/1.0.0/errors/validation)](dsl-reference.md#error).
+
+ 2. **Transform Workflow Input**
     Before the workflow starts, the input data provided to the workflow can be transformed to ensure only relevant data in the expected format is passed into the workflow context. This can be done using the top level `input.from` expression. It evaluates the raw workflow input and defaults to the identity expression which leaves the input unchanged. This step allows the workflow to start with a clean and focused dataset, reducing potential overhead and complexity in subsequent tasks. The result of this expression will be set as the initial value for the `$context` runtime expression argument and be passed to the first task.

     *Example: If the workflow receives a JSON object as input, a transformation can be applied to remove unnecessary fields and retain only those that are required for the workflow's execution.*

- 2. **Transform First Task Input**
-    The input data for the first task can be transformed to match the specific requirements of that task. This ensures that the first task receives only the data required to perform its operations. This can be done using the task's `input.from` expression. It evaluates the transformed workflow input and defaults to the identity expression, which leaves the input unchanged. The result of this expression will be set as the `$input` runtime expression argument and be passed to the task. This transformed input will be evaluated against any runtime expressions used within the task definition.
+    After workflow input validation and transformation, the transformed input is passed as the raw input to the first task.
+
+ 3. **Validate Task Input**
+    Before a task executes, its raw input can be validated against the `input.schema` property to ensure it conforms to the expected structure.
+    The execution only proceeds if the input is valid. Otherwise, it will fault with a [ValidationError (https://serverlessworkflow.io/spec/1.0.0/errors/validation)](dsl-reference.md#error).

-    *Example: If the first task is a function call that only needs a subset of the workflow input, a transformation can be applied to provide only those fields needed for the function to execute.*
+ 4. **Transform Task Input**
+    The input data for the task can be transformed to match the specific requirements of that task. This ensures that the task receives only the data required to perform its operations. This can be done using the task's `input.from` expression. It evaluates the raw task input (i.e., the transformed workflow input for the first task or the transformed output of the previous task) and defaults to the identity expression, which leaves the input unchanged. The result of this expression will be set as the `$input` runtime expression argument and be passed to the task. This transformed input will be evaluated against any runtime expressions used within the task definition.

- 3. **Transform First Task Output**
-    After completing the first task, its output can be transformed before passing it to the next task or storing it in the workflow context. Transformations are applied using the `output.as` runtime expression. It evaluates the raw task output and defaults to the identity expression, which leaves the output unchanged. Its result will be input for the next task. To update the context, one uses the `export.as` runtime expression. It evaluates the raw output and defaults to the expression that returns the existing context. The result of this runtime expression replaces the workflow's current context and the content of the `$context` runtime expression argument. This helps manage the data flow and keep the context clean by removing any unnecessary data produced by the task.
+    *Example: If the task is a function call that only needs a subset of the workflow input, a transformation can be applied to provide only those fields needed for the function to execute.*

-    *Example: If the first task returns a large dataset, a transformation can be applied to retain only the relevant results needed for subsequent tasks.*
+ 5. **Transform Task Output**
+    After completing the task, its output can be transformed before passing it to the next task or storing it in the workflow context. Transformations are applied using the `output.as` runtime expression. It evaluates the raw task output and defaults to the identity expression, which leaves the output unchanged. Its result will be input for the next task.

- 4. **Transform Last Task Input**
-    Before the last task in the workflow executes, its input data can be transformed to ensure it receives only the necessary information. This can be done using the task's `input.from` expression. It evaluates the transformed workflow input and defaults to the identity expression, which leaves the input unchanged. The result of this expression will be set as the `$input` runtime expression argument and be passed to the task. This transformed input will be evaluated against any runtime expressions used within the task definition. This step is crucial for ensuring the final task has all the required data to complete the workflow successfully.
+    *Example: If the task returns a large dataset, a transformation can be applied to retain only the relevant results needed for subsequent tasks.*

-    *Example: If the last task involves generating a report, the input transformation can ensure that only the data required for the report generation is passed to the task.*
+ 6. **Validate Task Output**
+    After `output.as` is evaluated, the transformed task output is validated against the `output.schema` property to ensure it conforms to the expected structure. The execution only proceeds if the output is valid. Otherwise, it will fault with a [ValidationError (https://serverlessworkflow.io/spec/1.0.0/errors/validation)](dsl-reference.md#error).

- 5. **Transform Last Task Output**
-    After the last task completes, its output can be transformed before it is considered the workflow output. Transformations are applied using the `output.as` runtime expression. It evaluates the raw task output and defaults to the identity expression, which leaves the output unchanged. Its result will be passed to the workflow `output.as` runtime expression. This ensures that the workflow produces a clean and relevant output, free from any extraneous data that might have been generated during the task execution.
+ 7. **Update Workflow Context**
+    To update the context, one uses the `export.as` runtime expression. It evaluates the transformed task output and defaults to the expression that returns the existing context. The result of this runtime expression replaces the workflow's current context and the content of the `$context` runtime expression argument. This helps manage the data flow and keep the context clean by removing any unnecessary data produced by the task.

-    *Example: If the last task outputs various statistics, a transformation can be applied to retain only the key metrics that are relevant to the stakeholders.*
+ 8. **Validate Exported Context**
+    After the context is updated, the exported context is validated against the `export.schema` property to ensure it conforms to the expected structure. The execution only proceeds if the exported context is valid. Otherwise, it will fault with a [ValidationError (https://serverlessworkflow.io/spec/1.0.0/errors/validation)](dsl-reference.md#error).

- 6. **Transform Workflow Output**
-    Finally, the overall workflow output can be transformed before it is returned to the caller or stored. Transformations are applied using the `output.as` runtime expression. It evaluates the last task's output and defaults to the identity expression, which leaves the output unchanged. This step ensures that the final output of the workflow is concise and relevant, containing only the necessary information that needs to be communicated or recorded.
+ 9. **Continue Workflow**
+    After the context is updated, the workflow continues to the next task in the sequence. The transformed output of the previous task is passed as the raw input to the next task, and the data flow cycle repeats.
+    If no more tasks are defined, the transformed output is passed to the workflow output transformation step.
+
+ 10. **Transform Workflow Output**
+     Finally, the overall workflow output can be transformed before it is returned to the caller or stored. Transformations are applied using the `output.as` runtime expression. It evaluates the last task's transformed output and defaults to the identity expression, which leaves the output unchanged. This step ensures that the final output of the workflow is concise and relevant, containing only the necessary information that needs to be communicated or recorded.

     *Example: If the workflow's final output is a summary report, a transformation can ensure that the report contains only the most important summaries and conclusions, excluding any intermediate data.*

+ 11. **Validate Workflow Output**
+     After `output.as` is evaluated, the transformed workflow output is validated against the `output.schema` property to ensure it conforms to the expected structure. The execution only proceeds if the output is valid. Otherwise, it will fault with a [ValidationError (https://serverlessworkflow.io/spec/1.0.0/errors/validation)](dsl-reference.md#error).
By applying transformations at these strategic points, Serverless Workflow DSL ensures that data flows through the workflow in a controlled and efficient manner, maintaining clarity and relevance at each execution stage. This approach helps manage complex workflows and ensures that each task operates with the precise data required, leading to more predictable and reliable workflow outcomes.
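To tie these stages together, here is a small end-to-end sketch of how the properties discussed above fit into a workflow definition. It is illustrative only: the task name, endpoint, and field names are hypothetical, the schemas for most steps are omitted for brevity, and the expressions assume jq semantics.

```yaml
document:
  dsl: '1.0.0'
  namespace: examples
  name: data-flow-sketch
  version: '0.1.0'
input:
  # Steps 1-2: validate the raw workflow input, then keep only the order object.
  schema:
    document:
      type: object
      required: [ order ]
  from: .order
do:
  - getPet:
      call: http
      with:
        method: get
        endpoint: https://petstore.example.com/pet   # hypothetical service
      input:
        # Steps 3-4: validate (schema omitted) and transform the raw task input into $input.
        from:
          petId: '${ .pet.id }'
      output:
        # Steps 5-6: transform the raw task output, then validate it against output.schema (omitted).
        as:
          pet: '${ . }'
      export:
        # Steps 7-8: evaluated against the transformed task output; the result replaces $context.
        as: '${ $context + { lastPetId: .pet.id } }'
output:
  # Steps 10-11: transform the last task's transformed output, then validate the workflow output.
  as: .pet
```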