
Commit 2d9949b

Merge pull request #186552 from jonburchel/2022-01-27-adds-ui-to-concepts-pipeline-execution-triggers
Adds UI to concepts-pipeline-execution-triggers.md
2 parents e9cf5af + c7e46b5 commit 2d9949b

12 files changed: +43 -26 lines

articles/data-factory/author-management-hub.md

Lines changed: 1 addition & 1 deletion
@@ -51,7 +51,7 @@ To override the generated Resource Manager template parameters when publishing f
 
 ### Triggers
 
-Triggers determine when a pipeline run should be kicked off. Currently triggers can be on a wall clock schedule, operate on a periodic interval, or depend on an event. For more information, learn about [trigger execution](concepts-pipeline-execution-triggers.md#trigger-execution). In the management hub, you can create, edit, delete, or view the current state of a trigger.
+Triggers determine when a pipeline run should be kicked off. Currently triggers can be on a wall clock schedule, operate on a periodic interval, or depend on an event. For more information, learn about [trigger execution](concepts-pipeline-execution-triggers.md#trigger-execution-with-json). In the management hub, you can create, edit, delete, or view the current state of a trigger.
 
 :::image type="content" source="media/author-management-hub/management-hub-triggers.png" alt-text="Screenshot that shows where to create, edit, delete, or view the current state of a trigger.":::
 

articles/data-factory/compare-versions.md

Lines changed: 2 additions & 2 deletions
@@ -19,7 +19,7 @@ The following table compares the features of Data Factory with the features of D
 
 | Feature | Version 1 | Current version |
 | ------- | --------- | --------- |
-| Datasets | A named view of data that references the data that you want to use in your activities as inputs and outputs. Datasets identify data within different data stores, such as tables, files, folders, and documents. For example, an Azure Blob dataset specifies the blob container and folder in Azure Blob storage from which the activity should read the data.<br/><br/>**Availability** defines the processing window slicing model for the dataset (for example, hourly, daily, and so on). | Datasets are the same in the current version. However, you do not need to define **availability** schedules for datasets. You can define a trigger resource that can schedule pipelines from a clock scheduler paradigm. For more information, see [Triggers](concepts-pipeline-execution-triggers.md#trigger-execution) and [Datasets](concepts-datasets-linked-services.md). |
+| Datasets | A named view of data that references the data that you want to use in your activities as inputs and outputs. Datasets identify data within different data stores, such as tables, files, folders, and documents. For example, an Azure Blob dataset specifies the blob container and folder in Azure Blob storage from which the activity should read the data.<br/><br/>**Availability** defines the processing window slicing model for the dataset (for example, hourly, daily, and so on). | Datasets are the same in the current version. However, you do not need to define **availability** schedules for datasets. You can define a trigger resource that can schedule pipelines from a clock scheduler paradigm. For more information, see [Triggers](concepts-pipeline-execution-triggers.md#trigger-execution-with-json) and [Datasets](concepts-datasets-linked-services.md). |
 | Linked services | Linked services are much like connection strings, which define the connection information that's necessary for Data Factory to connect to external resources. | Linked services are the same as in Data Factory V1, but with a new **connectVia** property to utilize the Integration Runtime compute environment of the current version of Data Factory. For more information, see [Integration runtime in Azure Data Factory](concepts-integration-runtime.md) and [Linked service properties for Azure Blob storage](connector-azure-blob-storage.md#linked-service-properties). |
 | Pipelines | A data factory can have one or more pipelines. A pipeline is a logical grouping of activities that together perform a task. You use startTime, endTime, and isPaused to schedule and run pipelines. | Pipelines are groups of activities that are performed on data. However, the scheduling of activities in the pipeline has been separated into new trigger resources. You can think of pipelines in the current version of Data Factory more as "workflow units" that you schedule separately via triggers. <br/><br/>Pipelines do not have "windows" of time execution in the current version of Data Factory. The Data Factory V1 concepts of startTime, endTime, and isPaused are no longer present in the current version of Data Factory. For more information, see [Pipeline execution and triggers](concepts-pipeline-execution-triggers.md) and [Pipelines and activities](concepts-pipelines-activities.md). |
 | Activities | Activities define actions to perform on your data within a pipeline. Data movement (copy activity) and data transformation activities (such as Hive, Pig, and MapReduce) are supported. | In the current version of Data Factory, activities still are defined actions within a pipeline. The current version of Data Factory introduces new [control flow activities](concepts-pipelines-activities.md#control-flow-activities). You use these activities in a control flow (looping and branching). Data movement and data transformation activities that were supported in V1 are supported in the current version. You can define transformation activities without using datasets in the current version. |
@@ -139,4 +139,4 @@ in the current version, you can also monitor data factories by using [Azure Moni
 
 
 ## Next steps
-Learn how to create a data factory by following step-by-step instructions in the following quickstarts: [PowerShell](quickstart-create-data-factory-powershell.md), [.NET](quickstart-create-data-factory-dot-net.md), [Python](quickstart-create-data-factory-python.md), [REST API](quickstart-create-data-factory-rest-api.md).
+Learn how to create a data factory by following step-by-step instructions in the following quickstarts: [PowerShell](quickstart-create-data-factory-powershell.md), [.NET](quickstart-create-data-factory-dot-net.md), [Python](quickstart-create-data-factory-python.md), [REST API](quickstart-create-data-factory-rest-api.md).

articles/data-factory/concepts-pipeline-execution-triggers.md

Lines changed: 24 additions & 7 deletions
@@ -8,7 +8,7 @@ ms.reviewer: jburchel
 ms.service: data-factory
 ms.subservice: orchestration
 ms.topic: conceptual
-ms.date: 09/09/2021
+ms.date: 01/27/2022
 ms.custom: devx-track-azurepowershell, synapse
 ---

@@ -23,7 +23,24 @@ A _pipeline run_ in Azure Data Factory and Azure Synapse defines an instance of
 
 Pipeline runs are typically instantiated by passing arguments to parameters that you define in the pipeline. You can execute a pipeline either manually or by using a _trigger_. This article provides details about both ways of executing a pipeline.
 
-## Manual execution (on-demand)
+## Create triggers with UI
+
+To manually trigger a pipeline or configure a new scheduled, tumbling window, storage event, or custom event trigger, select **Add trigger** at the top of the pipeline editor.
+
+:::image type="content" source="media/concepts-pipeline-execution-triggers/manual-trigger.png" alt-text="Shows how to add a new trigger with the UI from the pipeline editor.":::
+
+If you choose to manually trigger the pipeline, it executes immediately. Otherwise, if you choose **New/Edit**, you're prompted with the add triggers window, where you can either choose an existing trigger to edit or create a new trigger.
+
+:::image type="content" source="media/concepts-pipeline-execution-triggers/new-trigger.png" alt-text="Shows the add triggers window highlighting where to create a new trigger.":::
+
+You'll then see the trigger configuration window, where you can choose the trigger type.
+
+:::image type="content" source="media/concepts-pipeline-execution-triggers/new-trigger-configuration.png" alt-text="Shows the new trigger configuration window with the type dropdown showing the various types of triggers you can create.":::
+
+Read more about [scheduled](#schedule-trigger-with-json), [tumbling window](#tumbling-window-trigger), [storage event](#event-based-trigger), and [custom event](#event-based-trigger) triggers below.
+
+
+## Manual execution (on-demand) with JSON
 
 The manual execution of a pipeline is also referred to as _on-demand_ execution.

@@ -132,7 +149,7 @@ For a complete sample, see [Quickstart: Create a data factory by using the .NET
 > [!NOTE]
 > You can use the .NET SDK to invoke pipelines from Azure Functions, from your web services, and so on.
 
-## Trigger execution
+## Trigger execution with JSON
 
 Triggers are another way that you can execute a pipeline run. Triggers represent a unit of processing that determines when a pipeline execution needs to be kicked off. Currently, the service supports three types of triggers:
 
@@ -142,7 +159,7 @@ Triggers are another way that you can execute a pipeline run. Triggers represent
 
 - Event-based trigger: A trigger that responds to an event.
 
-Pipelines and triggers have a many-to-many relationship (except for the tumbling window trigger).Multiple triggers can kick off a single pipeline, or a single trigger can kick off multiple pipelines. In the following trigger definition, the **pipelines** property refers to a list of pipelines that are triggered by the particular trigger. The property definition includes values for the pipeline parameters.
+Pipelines and triggers have a many-to-many relationship (except for the tumbling window trigger). Multiple triggers can kick off a single pipeline, or a single trigger can kick off multiple pipelines. In the following trigger definition, the **pipelines** property refers to a list of pipelines that are triggered by the particular trigger. The property definition includes values for the pipeline parameters.
 ### Basic trigger definition
 
 ```json
@@ -170,7 +187,7 @@ Pipelines and triggers have a many-to-many relationship (except for the tumbling
 }
 ```
 
-## Schedule trigger
+## Schedule trigger with JSON
 A schedule trigger runs pipelines on a wall-clock schedule. This trigger supports periodic and advanced calendar options. For example, the trigger supports intervals like "weekly" or "Monday at 5:00 PM and Thursday at 9:00 PM." The schedule trigger is flexible because the dataset pattern is agnostic, and the trigger doesn't discern between time-series and non-time-series data.
 
 For more information about schedule triggers and for examples, see [Create a trigger that runs a pipeline on a schedule](how-to-create-schedule-trigger.md).
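
For reference, a minimal sketch of a schedule trigger definition of the kind described above, with the **pipelines** property mapping the trigger onto one pipeline and its parameters. The trigger, pipeline, and parameter names here are illustrative placeholders, not part of this change:

```json
{
    "name": "MyScheduleTrigger",
    "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Week",
                "interval": 1,
                "startTime": "2022-02-01T09:00:00Z",
                "timeZone": "UTC",
                "schedule": {
                    "weekDays": [ "Monday", "Thursday" ],
                    "hours": [ 17 ],
                    "minutes": [ 0 ]
                }
            }
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "MyPipeline",
                    "type": "PipelineReference"
                },
                "parameters": {
                    "inputPath": "adftutorial/input"
                }
            }
        ]
    }
}
```

This weekly recurrence fires Mondays and Thursdays at 5:00 PM UTC, in the "advanced calendar" style the paragraph above mentions.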
@@ -376,10 +393,10 @@ The following table provides a comparison of the tumbling window trigger and sch
 
 ## Event-based trigger
 
-An event-based trigger runs pipelines in response to an event. There are two flavors of event based triggers.
+An event-based trigger runs pipelines in response to an event. There are two flavors of event-based triggers.
 
 * _Storage event trigger_ runs a pipeline against events happening in a Storage account, such as the arrival of a file, or the deletion of a file in Azure Blob Storage account.
-* _Custom event trigger_ processes and handles [custom topics](../event-grid/custom-topics.md) in Event Grid
+* _Custom event trigger_ processes and handles [custom articles](../event-grid/custom-topics.md) in Event Grid
 
 For more information about event-based triggers, see [Storage Event Trigger](how-to-create-event-trigger.md) and [Custom Event Trigger](how-to-create-custom-event-trigger.md).
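
For comparison with the schedule trigger sketch above, a hedged sketch of a storage event trigger definition; the `BlobEventsTrigger` type and the `events`/`blobPathBeginsWith` properties follow the service's event-trigger schema as we understand it, while the subscription, resource group, account, and container names are placeholders:

```json
{
    "name": "MyStorageEventTrigger",
    "properties": {
        "type": "BlobEventsTrigger",
        "typeProperties": {
            "scope": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>",
            "events": [ "Microsoft.Storage.BlobCreated" ],
            "blobPathBeginsWith": "/mycontainer/blobs/",
            "blobPathEndsWith": ".csv"
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "MyPipeline",
                    "type": "PipelineReference"
                }
            }
        ]
    }
}
```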

articles/data-factory/control-flow-system-variables.md

Lines changed: 2 additions & 2 deletions
@@ -40,7 +40,7 @@ These system variables can be referenced anywhere in the pipeline JSON.
 
 ## Schedule trigger scope
 
-These system variables can be referenced anywhere in the trigger JSON for triggers of type [ScheduleTrigger](concepts-pipeline-execution-triggers.md#schedule-trigger).
+These system variables can be referenced anywhere in the trigger JSON for triggers of type [ScheduleTrigger](concepts-pipeline-execution-triggers.md#schedule-trigger-with-json).
 
 | Variable Name | Description |
 | --- | --- |
@@ -80,7 +80,7 @@ These system variables can be referenced anywhere in the trigger JSON for trigge
 
 | Variable Name | Description |
 | --- | --- |
-| @triggerBody().event.eventType | Type of events that triggered the Custom Event Trigger run. Event type is customer defined field and take on any values of string type. |
+| @triggerBody().event.eventType | Type of events that triggered the Custom Event Trigger run. Event type is a customer-defined field and can take on any value of string type. |
 | @triggerBody().event.subject | Subject of the custom event that caused the trigger to fire. |
 | @triggerBody().event.data._keyName_ | Data field in the custom event is a free-form JSON blob, which the customer can use to send messages and data. Please use data._keyName_ to reference each field. For example, @triggerBody().event.data.callback returns the value for the _callback_ field stored under _data_. |
 | @trigger().startTime | Time at which the trigger fired to invoke the pipeline run. |
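
These trigger-scope variables are typically consumed by mapping them onto pipeline parameters in the trigger's **pipelines** property. A minimal sketch of that mapping inside a custom event trigger definition (the pipeline and parameter names are hypothetical):

```json
"pipelines": [
    {
        "pipelineReference": {
            "referenceName": "MyEventHandlerPipeline",
            "type": "PipelineReference"
        },
        "parameters": {
            "eventType": "@triggerBody().event.eventType",
            "subject": "@triggerBody().event.subject",
            "callback": "@triggerBody().event.data.callback"
        }
    }
]
```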

articles/data-factory/how-to-create-custom-event-trigger.md

Lines changed: 5 additions & 5 deletions
@@ -110,9 +110,9 @@ As of today custom event trigger supports a __subset__ of [advanced filtering op
 * StringIn
 * StringNotIn
 
-Click **+New** to add new filter conditions.
+Select **+New** to add new filter conditions.
 
-Additionally, custom event triggers obey the [same limitations as event grid](../event-grid/event-filtering.md#limitations), including:
+Additionally, custom event triggers obey the [same limitations as Event Grid](../event-grid/event-filtering.md#limitations), including:
 
 * 5 advanced filters and 25 filter values across all the filters per custom event trigger
 * 512 characters per string value
@@ -128,7 +128,7 @@ The following table provides an overview of the schema elements that are related
 
 | JSON element | Description | Type | Allowed values | Required |
 |---|----------------------------|---|---|---|
-| `scope` | The Azure Resource Manager resource ID of the event grid topic. | String | Azure Resource Manager ID | Yes |
+| `scope` | The Azure Resource Manager resource ID of the Event Grid topic. | String | Azure Resource Manager ID | Yes |
 | `events` | The type of events that cause this trigger to fire. | Array of strings | | Yes, at least one value is expected. |
 | `subjectBeginsWith` | The `subject` field must begin with the provided pattern for the trigger to fire. For example, _factories_ only fire the trigger for event subjects that start with *factories*. | String | | No |
 | `subjectEndsWith` | The `subject` field must end with the provided pattern for the trigger to fire. | String | | No |
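
Putting the schema elements in this table together, a sketch of a custom event trigger's `typeProperties`; the `CustomEventsTrigger` type name follows the service's trigger schema, and the scope, topic, and event names are placeholders:

```json
{
    "name": "MyCustomEventTrigger",
    "properties": {
        "type": "CustomEventsTrigger",
        "typeProperties": {
            "scope": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.EventGrid/topics/<topic-name>",
            "events": [ "MyCustomEventType" ],
            "subjectBeginsWith": "factories",
            "subjectEndsWith": ".csv"
        }
    }
}
```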
@@ -143,11 +143,11 @@ Azure Data Factory uses Azure role-based access control (RBAC) to prohibit unaut
 
 To successfully create or update a custom event trigger, you need to sign in to Data Factory with an Azure account that has appropriate access. Otherwise, the operation will fail with an _Access Denied_ error.
 
-Data Factory doesn't require special permission to your Event Grid. You also do *not* need to assign special Azure RBAC permission to the Data Factory service principal for the operation.
+Data Factory doesn't require special permission to your Event Grid. You also do *not* need to assign special Azure RBAC role permission to the Data Factory service principal for the operation.
 
 Specifically, you need `Microsoft.EventGrid/EventSubscriptions/Write` permission on `/subscriptions/####/resourceGroups//####/providers/Microsoft.EventGrid/topics/someTopics`.
 
 ## Next steps
 
-* Get detailed information about [trigger execution](concepts-pipeline-execution-triggers.md#trigger-execution).
+* Get detailed information about [trigger execution](concepts-pipeline-execution-triggers.md#trigger-execution-with-json).
 * Learn how to [reference trigger metadata in pipeline runs](how-to-use-trigger-parameterization.md).

articles/data-factory/how-to-create-event-trigger.md

Lines changed: 1 addition & 1 deletion
@@ -157,5 +157,5 @@ There are three noticeable call outs in the workflow related to Event triggering
 
 ## Next steps
 
-* For detailed information about triggers, see [Pipeline execution and triggers](concepts-pipeline-execution-triggers.md#trigger-execution).
+* For detailed information about triggers, see [Pipeline execution and triggers](concepts-pipeline-execution-triggers.md#trigger-execution-with-json).
 * To learn how to reference trigger metadata in pipelines, see [Reference Trigger Metadata in Pipeline Runs](how-to-use-trigger-parameterization.md)

articles/data-factory/how-to-create-schedule-trigger.md

Lines changed: 4 additions & 4 deletions
@@ -206,7 +206,7 @@ This section shows you how to use Azure CLI to create, start, and monitor a sche
 
 ### Sample Code
 
-1. In your working direactory, create a JSON file named **MyTrigger.json** with the trigger's properties. For this example use the following content:
+1. In your working directory, create a JSON file named **MyTrigger.json** with the trigger's properties. For this example, use the following content:
 
    > [!IMPORTANT]
    > Before you save the JSON file, set the value of the **startTime** element to the current UTC time. Set the value of the **endTime** element to one hour past the current UTC time.
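
As a sketch of what **MyTrigger.json** might contain under the constraints in the note, here a 15-minute recurrence runs between a **startTime** of "now" and an **endTime** one hour later; the timestamps, pipeline name, and parameter values are placeholders to replace with your own:

```json
{
    "name": "MyTrigger",
    "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Minute",
                "interval": 15,
                "startTime": "2022-01-27T17:00:00Z",
                "endTime": "2022-01-27T18:00:00Z",
                "timeZone": "UTC"
            }
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "MyPipeline",
                    "type": "PipelineReference"
                },
                "parameters": {
                    "inputPath": "adftutorial/input",
                    "outputPath": "adftutorial/output"
                }
            }
        ]
    }
}
```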
@@ -472,7 +472,7 @@ The following table provides a high-level overview of the major schema elements
 |:--- |:--- |
 | **startTime** | A Date-Time value. For simple schedules, the value of the **startTime** property applies to the first occurrence. For complex schedules, the trigger starts no sooner than the specified **startTime** value. <br> For the UTC time zone, the format is `'yyyy-MM-ddTHH:mm:ssZ'`; for other time zones, the format is `'yyyy-MM-ddTHH:mm:ss'`. |
 | **endTime** | The end date and time for the trigger. The trigger doesn't execute after the specified end date and time. The value for the property can't be in the past. This property is optional. <br> For the UTC time zone, the format is `'yyyy-MM-ddTHH:mm:ssZ'`; for other time zones, the format is `'yyyy-MM-ddTHH:mm:ss'`. |
-| **timeZone** | The time zone the trigger is created in. This setting impact **startTime**, **endTime**, and **schedule**. See [list of supported time zone](#time-zone-option) |
+| **timeZone** | The time zone the trigger is created in. This setting affects **startTime**, **endTime**, and **schedule**. See the [list of supported time zones](#time-zone-option). |
 | **recurrence** | A recurrence object that specifies the recurrence rules for the trigger. The recurrence object supports the **frequency**, **interval**, **endTime**, **count**, and **schedule** elements. When a recurrence object is defined, the **frequency** element is required. The other elements of the recurrence object are optional. |
 | **frequency** | The unit of frequency at which the trigger recurs. The supported values include "minute," "hour," "day," "week," and "month." |
 | **interval** | A positive integer that denotes the interval for the **frequency** value, which determines how often the trigger runs. For example, if the **interval** is 3 and the **frequency** is "week," the trigger recurs every 3 weeks. |
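
To illustrate how the elements above combine, an illustrative sketch of a recurrence object only (the values are arbitrary): with **frequency** "week" and **interval** 3, the trigger recurs every 3 weeks on the days and times in **schedule**, no sooner than **startTime** and never after **endTime**:

```json
"recurrence": {
    "frequency": "Week",
    "interval": 3,
    "startTime": "2022-02-01T09:00:00Z",
    "endTime": "2022-12-31T00:00:00Z",
    "timeZone": "UTC",
    "schedule": {
        "weekDays": [ "Monday" ],
        "hours": [ 9 ],
        "minutes": [ 0 ]
    }
}
```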
@@ -582,5 +582,5 @@ The examples assume that the **interval** value is 1, and that the **frequency**
 
 ## Next steps
 
-- For detailed information about triggers, see [Pipeline execution and triggers](concepts-pipeline-execution-triggers.md#trigger-execution).
-- Learn how to reference trigger metadata in pipeline, see [Reference Trigger Metadata in Pipeline Runs](how-to-use-trigger-parameterization.md)
+- For detailed information about triggers, see [Pipeline execution and triggers](concepts-pipeline-execution-triggers.md#trigger-execution-with-json).
+- To learn how to reference trigger metadata in pipelines, see [Reference Trigger Metadata in Pipeline Runs](how-to-use-trigger-parameterization.md)

articles/data-factory/how-to-create-tumbling-window-trigger.md

Lines changed: 1 addition & 1 deletion
@@ -363,6 +363,6 @@ To monitor trigger runs and pipeline runs in the Azure portal, see [Monitor pipe
 
 ## Next steps
 
-* For detailed information about triggers, see [Pipeline execution and triggers](concepts-pipeline-execution-triggers.md#trigger-execution).
+* For detailed information about triggers, see [Pipeline execution and triggers](concepts-pipeline-execution-triggers.md#trigger-execution-with-json).
 * [Create a tumbling window trigger dependency](tumbling-window-trigger-dependency.md).
 * To learn how to reference trigger metadata in pipelines, see [Reference Trigger Metadata in Pipeline Runs](how-to-use-trigger-parameterization.md)
