articles/data-factory/author-management-hub.md (1 addition, 1 deletion)
@@ -51,7 +51,7 @@ To override the generated Resource Manager template parameters when publishing f
### Triggers
-Triggers determine when a pipeline run should be kicked off. Currently triggers can be on a wall clock schedule, operate on a periodic interval, or depend on an event. For more information, learn about [trigger execution](concepts-pipeline-execution-triggers.md#trigger-execution). In the management hub, you can create, edit, delete, or view the current state of a trigger.
+Triggers determine when a pipeline run should be kicked off. Currently, triggers can run on a wall-clock schedule, operate on a periodic interval, or depend on an event. For more information, see [trigger execution](concepts-pipeline-execution-triggers.md#trigger-execution-with-json). In the management hub, you can create, edit, delete, or view the current state of a trigger.
:::image type="content" source="media/author-management-hub/management-hub-triggers.png" alt-text="Screenshot that shows where to create, edit, delete, or view the current state of a trigger.":::
articles/data-factory/compare-versions.md (2 additions, 2 deletions)
@@ -19,7 +19,7 @@ The following table compares the features of Data Factory with the features of D
| Feature | Version 1 | Current version |
| ------- | --------- | --------- |
-| Datasets | A named view of data that references the data that you want to use in your activities as inputs and outputs. Datasets identify data within different data stores, such as tables, files, folders, and documents. For example, an Azure Blob dataset specifies the blob container and folder in Azure Blob storage from which the activity should read the data.<br/><br/>**Availability** defines the processing window slicing model for the dataset (for example, hourly, daily, and so on). | Datasets are the same in the current version. However, you do not need to define **availability** schedules for datasets. You can define a trigger resource that can schedule pipelines from a clock scheduler paradigm. For more information, see [Triggers](concepts-pipeline-execution-triggers.md#trigger-execution) and [Datasets](concepts-datasets-linked-services.md). |
+| Datasets | A named view of data that references the data that you want to use in your activities as inputs and outputs. Datasets identify data within different data stores, such as tables, files, folders, and documents. For example, an Azure Blob dataset specifies the blob container and folder in Azure Blob storage from which the activity should read the data.<br/><br/>**Availability** defines the processing window slicing model for the dataset (for example, hourly, daily, and so on). | Datasets are the same in the current version. However, you do not need to define **availability** schedules for datasets. You can define a trigger resource that can schedule pipelines from a clock scheduler paradigm. For more information, see [Triggers](concepts-pipeline-execution-triggers.md#trigger-execution-with-json) and [Datasets](concepts-datasets-linked-services.md). |
| Linked services | Linked services are much like connection strings, which define the connection information that's necessary for Data Factory to connect to external resources. | Linked services are the same as in Data Factory V1, but with a new **connectVia** property to utilize the Integration Runtime compute environment of the current version of Data Factory. For more information, see [Integration runtime in Azure Data Factory](concepts-integration-runtime.md) and [Linked service properties for Azure Blob storage](connector-azure-blob-storage.md#linked-service-properties). |
| Pipelines | A data factory can have one or more pipelines. A pipeline is a logical grouping of activities that together perform a task. You use startTime, endTime, and isPaused to schedule and run pipelines. | Pipelines are groups of activities that are performed on data. However, the scheduling of activities in the pipeline has been separated into new trigger resources. You can think of pipelines in the current version of Data Factory more as "workflow units" that you schedule separately via triggers. <br/><br/>Pipelines do not have "windows" of time execution in the current version of Data Factory. The Data Factory V1 concepts of startTime, endTime, and isPaused are no longer present in the current version of Data Factory. For more information, see [Pipeline execution and triggers](concepts-pipeline-execution-triggers.md) and [Pipelines and activities](concepts-pipelines-activities.md). |
| Activities | Activities define actions to perform on your data within a pipeline. Data movement (copy activity) and data transformation activities (such as Hive, Pig, and MapReduce) are supported. | In the current version of Data Factory, activities still are defined actions within a pipeline. The current version of Data Factory introduces new [control flow activities](concepts-pipelines-activities.md#control-flow-activities). You use these activities in a control flow (looping and branching). Data movement and data transformation activities that were supported in V1 are supported in the current version. You can define transformation activities without using datasets in the current version. |
@@ -139,4 +139,4 @@ in the current version, you can also monitor data factories by using [Azure Moni
## Next steps
-Learn how to create a data factory by following step-by-step instructions in the following quickstarts: [PowerShell](quickstart-create-data-factory-powershell.md), [.NET](quickstart-create-data-factory-dot-net.md), [Python](quickstart-create-data-factory-python.md), [REST API](quickstart-create-data-factory-rest-api.md).
+Learn how to create a data factory by following the step-by-step instructions in these quickstarts: [PowerShell](quickstart-create-data-factory-powershell.md), [.NET](quickstart-create-data-factory-dot-net.md), [Python](quickstart-create-data-factory-python.md), [REST API](quickstart-create-data-factory-rest-api.md).
articles/data-factory/concepts-pipeline-execution-triggers.md (24 additions, 7 deletions)
@@ -8,7 +8,7 @@ ms.reviewer: jburchel
ms.service: data-factory
ms.subservice: orchestration
ms.topic: conceptual
-ms.date: 09/09/2021
+ms.date: 01/27/2022
ms.custom: devx-track-azurepowershell, synapse
---
@@ -23,7 +23,24 @@ A _pipeline run_ in Azure Data Factory and Azure Synapse defines an instance of
Pipeline runs are typically instantiated by passing arguments to parameters that you define in the pipeline. You can execute a pipeline either manually or by using a _trigger_. This article provides details about both ways of executing a pipeline.
-## Manual execution (on-demand)
+## Create triggers with UI
+
+To manually trigger a pipeline, or to configure a new scheduled, tumbling window, storage event, or custom event trigger, select **Add trigger** at the top of the pipeline editor.
+
+:::image type="content" source="media/concepts-pipeline-execution-triggers/manual-trigger.png" alt-text="Shows how to add a new trigger with the UI from the pipeline editor.":::
+
+If you choose to manually trigger the pipeline, it executes immediately. If you select **New/Edit** instead, the **Add triggers** window prompts you to choose an existing trigger to edit or to create a new trigger.
+
+:::image type="content" source="media/concepts-pipeline-execution-triggers/new-trigger.png" alt-text="Shows the Add triggers window, highlighting where to create a new trigger.":::
+
+The trigger configuration window then appears, where you can choose the trigger type.
+
+:::image type="content" source="media/concepts-pipeline-execution-triggers/new-trigger-configuration.png" alt-text="Shows the new trigger configuration window with the type dropdown showing the various types of triggers you can create.":::
+
+Read more about [scheduled](#schedule-trigger-with-json), [tumbling window](#tumbling-window-trigger), [storage event](#event-based-trigger), and [custom event](#event-based-trigger) triggers below.
+
+## Manual execution (on-demand) with JSON
The manual execution of a pipeline is also referred to as _on-demand_ execution.
@@ -132,7 +149,7 @@ For a complete sample, see [Quickstart: Create a data factory by using the .NET
> [!NOTE]
> You can use the .NET SDK to invoke pipelines from Azure Functions, from your web services, and so on.
-## Trigger execution
+## Trigger execution with JSON
Triggers are another way that you can execute a pipeline run. Triggers represent a unit of processing that determines when a pipeline execution needs to be kicked off. Currently, the service supports three types of triggers:
138
155
@@ -142,7 +159,7 @@ Triggers are another way that you can execute a pipeline run. Triggers represent
- Event-based trigger: A trigger that responds to an event.
-Pipelines and triggers have a many-to-many relationship (except for the tumbling window trigger).Multiple triggers can kick off a single pipeline, or a single trigger can kick off multiple pipelines. In the following trigger definition, the **pipelines** property refers to a list of pipelines that are triggered by the particular trigger. The property definition includes values for the pipeline parameters.
+Pipelines and triggers have a many-to-many relationship (except for the tumbling window trigger). Multiple triggers can kick off a single pipeline, or a single trigger can kick off multiple pipelines. In the following trigger definition, the **pipelines** property refers to a list of pipelines that are triggered by the particular trigger. The property definition includes values for the pipeline parameters.
### Basic trigger definition
```json
@@ -170,7 +187,7 @@ Pipelines and triggers have a many-to-many relationship (except for the tumbling
}
```
-## Schedule trigger
+## Schedule trigger with JSON
A schedule trigger runs pipelines on a wall-clock schedule. This trigger supports periodic and advanced calendar options. For example, the trigger supports intervals like "weekly" or "Monday at 5:00 PM and Thursday at 9:00 PM." The schedule trigger is flexible because the dataset pattern is agnostic, and the trigger doesn't discern between time-series and non-time-series data.
For more information about schedule triggers and, for examples, see [Create a trigger that runs a pipeline on a schedule](how-to-create-schedule-trigger.md).
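To make the schema concrete, here is a minimal sketch of a schedule trigger definition that fires a pipeline once a day. The trigger name, pipeline name, start time, and parameter values are illustrative placeholders, not values from this article:

```json
{
    "name": "DailyTrigger",
    "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Day",
                "interval": 1,
                "startTime": "2022-01-27T00:00:00Z",
                "timeZone": "UTC"
            }
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "type": "PipelineReference",
                    "referenceName": "MyCopyPipeline"
                },
                "parameters": {
                    "inputPath": "adftutorial/input"
                }
            }
        ]
    }
}
```

The **pipelines** array is what gives the trigger its many-to-many relationship with pipelines: additional entries can reference other pipelines, each with its own parameter values.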
@@ -376,10 +393,10 @@ The following table provides a comparison of the tumbling window trigger and sch
## Event-based trigger
-An event-based trigger runs pipelines in response to an event. There are two flavors of eventbased triggers.
+An event-based trigger runs pipelines in response to an event. There are two flavors of event-based triggers.
* _Storage event trigger_ runs a pipeline against events happening in a storage account, such as the arrival or deletion of a file in an Azure Blob Storage account.
-*_Custom event trigger_ processes and handles [custom topics](../event-grid/custom-topics.md) in Event Grid
+* _Custom event trigger_ processes and handles [custom topics](../event-grid/custom-topics.md) in Event Grid.
For more information about event-based triggers, see [Storage Event Trigger](how-to-create-event-trigger.md) and [Custom Event Trigger](how-to-create-custom-event-trigger.md).
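As a sketch of the storage event flavor, a trigger definition might look like the following. The subscription and resource group IDs, storage account name, container path, and pipeline name are hypothetical placeholders:

```json
{
    "name": "BlobArrivalTrigger",
    "properties": {
        "type": "BlobEventsTrigger",
        "typeProperties": {
            "scope": "/subscriptions/####/resourceGroups/####/providers/Microsoft.Storage/storageAccounts/mystorageaccount",
            "events": [ "Microsoft.Storage.BlobCreated" ],
            "blobPathBeginsWith": "/input-container/blobs/",
            "blobPathEndsWith": ".csv"
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "type": "PipelineReference",
                    "referenceName": "MyIngestPipeline"
                }
            }
        ]
    }
}
```

Here the trigger fires only for blobs created under the named container whose names end in `.csv`, which is a common way to scope ingestion to one file type.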
articles/data-factory/control-flow-system-variables.md (2 additions, 2 deletions)
@@ -40,7 +40,7 @@ These system variables can be referenced anywhere in the pipeline JSON.
## Schedule trigger scope
-These system variables can be referenced anywhere in the trigger JSON for triggers of type [ScheduleTrigger](concepts-pipeline-execution-triggers.md#schedule-trigger).
+These system variables can be referenced anywhere in the trigger JSON for triggers of type [ScheduleTrigger](concepts-pipeline-execution-triggers.md#schedule-trigger-with-json).
| Variable Name | Description |
| --- | --- |
@@ -80,7 +80,7 @@ These system variables can be referenced anywhere in the trigger JSON for trigge
| Variable Name | Description |
| --- | --- |
-|@triggerBody().event.eventType | Type of events that triggered the Custom Event Trigger run. Event type is customerdefined field and take on any values of string type. |
+|@triggerBody().event.eventType | Type of event that triggered the Custom Event Trigger run. The event type is a customer-defined field and can take on any string value. |
|@triggerBody().event.subject | Subject of the custom event that caused the trigger to fire. |
|@triggerBody().event.data._keyName_| The data field in the custom event is a free-form JSON blob that customers can use to send messages and data. Use data._keyName_ to reference each field. For example, @triggerBody().event.data.callback returns the value for the _callback_ field stored under _data_. |
|@trigger().startTime | Time at which the trigger fired to invoke the pipeline run. |
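As an illustration of how these variables are typically consumed, a custom event trigger can map them to pipeline parameters in its **pipelines** property. The pipeline name and parameter names below are hypothetical:

```json
"pipelines": [
    {
        "pipelineReference": {
            "type": "PipelineReference",
            "referenceName": "MyEventHandlerPipeline"
        },
        "parameters": {
            "eventType": "@triggerBody().event.eventType",
            "eventSubject": "@triggerBody().event.subject",
            "payload": "@triggerBody().event.data"
        }
    }
]
```

Inside the pipeline, activities then read these values through the corresponding pipeline parameters rather than referencing the trigger directly.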
articles/data-factory/how-to-create-custom-event-trigger.md (5 additions, 5 deletions)
@@ -110,9 +110,9 @@ As of today custom event trigger supports a __subset__ of [advanced filtering op
* StringIn
* StringNotIn
-Click**+New** to add new filter conditions.
+Select **+New** to add new filter conditions.
-Additionally, custom event triggers obey the [same limitations as event grid](../event-grid/event-filtering.md#limitations), including:
+Additionally, custom event triggers obey the [same limitations as Event Grid](../event-grid/event-filtering.md#limitations), including:
* 5 advanced filters and 25 filter values across all the filters per custom event trigger
* 512 characters per string value
@@ -128,7 +128,7 @@ The following table provides an overview of the schema elements that are related
| JSON element | Description | Type | Allowed values | Required |
|---|----------------------------|---|---|---|
-|`scope`| The Azure Resource Manager resource ID of the event grid topic. | String | Azure Resource Manager ID | Yes |
+|`scope`| The Azure Resource Manager resource ID of the Event Grid topic. | String | Azure Resource Manager ID | Yes |
|`events`| The type of events that cause this trigger to fire. | Array of strings || Yes, at least one value is expected. |
|`subjectBeginsWith`| The `subject` field must begin with the provided pattern for the trigger to fire. For example, _factories_ only fire the trigger for event subjects that start with *factories*. | String || No |
|`subjectEndsWith`| The `subject` field must end with the provided pattern for the trigger to fire. | String || No |
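Putting the schema elements above together, a custom event trigger definition might look like this sketch. The Event Grid topic resource ID, event type string, subject pattern, and pipeline name are illustrative assumptions:

```json
{
    "name": "OrderEventTrigger",
    "properties": {
        "type": "CustomEventsTrigger",
        "typeProperties": {
            "scope": "/subscriptions/####/resourceGroups/####/providers/Microsoft.EventGrid/topics/someTopic",
            "events": [ "Contoso.Orders.Created" ],
            "subjectBeginsWith": "orders/"
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "type": "PipelineReference",
                    "referenceName": "MyOrderPipeline"
                }
            }
        ]
    }
}
```

Because `events` is an array, a single trigger can listen for several customer-defined event types published to the same topic.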
@@ -143,11 +143,11 @@ Azure Data Factory uses Azure role-based access control (RBAC) to prohibit unaut
To successfully create or update a custom event trigger, you need to sign in to Data Factory with an Azure account that has appropriate access. Otherwise, the operation will fail with an _Access Denied_ error.
-Data Factory doesn't require special permission to your Event Grid. You also do *not* need to assign special Azure RBAC permission to the Data Factory service principal for the operation.
+Data Factory doesn't require special permission to your Event Grid. You also do *not* need to assign special Azure RBAC role permission to the Data Factory service principal for the operation.
Specifically, you need `Microsoft.EventGrid/EventSubscriptions/Write` permission on `/subscriptions/####/resourceGroups/####/providers/Microsoft.EventGrid/topics/someTopics`.
## Next steps
-* Get detailed information about [trigger execution](concepts-pipeline-execution-triggers.md#trigger-execution).
+* Get detailed information about [trigger execution](concepts-pipeline-execution-triggers.md#trigger-execution-with-json).
* Learn how to [reference trigger metadata in pipeline runs](how-to-use-trigger-parameterization.md).
articles/data-factory/how-to-create-schedule-trigger.md (4 additions, 4 deletions)
@@ -206,7 +206,7 @@ This section shows you how to use Azure CLI to create, start, and monitor a sche
### Sample Code
-1. In your working direactory, create a JSON file named **MyTrigger.json** with the trigger's properties. For this example use the following content:
+1. In your working directory, create a JSON file named **MyTrigger.json** with the trigger's properties. For this example, use the following content:
> [!IMPORTANT]
> Before you save the JSON file, set the value of the **startTime** element to the current UTC time. Set the value of the **endTime** element to one hour past the current UTC time.
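The article's full sample follows the schedule-trigger shape described in this section. As a hedged sketch of what **MyTrigger.json** might contain, with the times, pipeline name, and paths as placeholders you must replace per the note above:

```json
{
    "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Minute",
                "interval": 15,
                "startTime": "2022-01-27T00:00:00Z",
                "endTime": "2022-01-27T01:00:00Z",
                "timeZone": "UTC"
            }
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "type": "PipelineReference",
                    "referenceName": "MyQuickStartPipeline"
                },
                "parameters": {
                    "inputPath": "adftutorial/input",
                    "outputPath": "adftutorial/output"
                }
            }
        ]
    }
}
```

With **startTime** set to now and **endTime** one hour later, a 15-minute recurrence yields a handful of runs before the trigger expires, which keeps the demo bounded.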
@@ -472,7 +472,7 @@ The following table provides a high-level overview of the major schema elements
|:--- |:--- |
|**startTime**| A Date-Time value. For simple schedules, the value of the **startTime** property applies to the first occurrence. For complex schedules, the trigger starts no sooner than the specified **startTime** value. <br> For UTC time zone, format is `'yyyy-MM-ddTHH:mm:ssZ'`, for other time zone, format is `'yyyy-MM-ddTHH:mm:ss'`. |
|**endTime**| The end date and time for the trigger. The trigger doesn't execute after the specified end date and time. The value for the property can't be in the past. This property is optional. <br> For UTC time zone, format is `'yyyy-MM-ddTHH:mm:ssZ'`, for other time zone, format is `'yyyy-MM-ddTHH:mm:ss'`. |
-|**timeZone**| The time zone the trigger is created in. This setting impact**startTime**, **endTime**, and **schedule**. See [list of supported time zone](#time-zone-option)|
+|**timeZone**| The time zone the trigger is created in. This setting affects **startTime**, **endTime**, and **schedule**. See the [list of supported time zones](#time-zone-option).|
|**recurrence**| A recurrence object that specifies the recurrence rules for the trigger. The recurrence object supports the **frequency**, **interval**, **endTime**, **count**, and **schedule** elements. When a recurrence object is defined, the **frequency** element is required. The other elements of the recurrence object are optional. |
|**frequency**| The unit of frequency at which the trigger recurs. The supported values include "minute," "hour," "day," "week," and "month." |
|**interval**| A positive integer that denotes the interval for the **frequency** value, which determines how often the trigger runs. For example, if the **interval** is 3 and the **frequency** is "week," the trigger recurs every 3 weeks. |
@@ -582,5 +582,5 @@ The examples assume that the **interval** value is 1, and that the **frequency**
## Next steps
-- For detailed information about triggers, see [Pipeline execution and triggers](concepts-pipeline-execution-triggers.md#trigger-execution).
-- Learn how to reference trigger metadata in pipeline, see [Reference Trigger Metadata in Pipeline Runs](how-to-use-trigger-parameterization.md)
+- For detailed information about triggers, see [Pipeline execution and triggers](concepts-pipeline-execution-triggers.md#trigger-execution-with-json).
+- To learn how to reference trigger metadata in a pipeline, see [Reference Trigger Metadata in Pipeline Runs](how-to-use-trigger-parameterization.md).