articles/data-factory/concepts-pipeline-execution-triggers.md (+3 -3)
@@ -159,7 +159,7 @@ Triggers are another way that you can execute a pipeline run. Triggers represent
 
 - Event-based trigger: A trigger that responds to an event.
 
-Pipelines and triggers have a many-to-many relationship (except for the tumbling window trigger).Multiple triggers can kick off a single pipeline, or a single trigger can kick off multiple pipelines. In the following trigger definition, the **pipelines** property refers to a list of pipelines that are triggered by the particular trigger. The property definition includes values for the pipeline parameters.
+Pipelines and triggers have a many-to-many relationship (except for the tumbling window trigger). Multiple triggers can kick off a single pipeline, or a single trigger can kick off multiple pipelines. In the following trigger definition, the **pipelines** property refers to a list of pipelines that are triggered by the particular trigger. The property definition includes values for the pipeline parameters.
 
 ### Basic trigger definition
 
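For reference, a basic trigger definition with the **pipelines** property has roughly the following shape. This is a minimal sketch with illustrative names and values rather than the article's own example; it shows one schedule trigger starting two pipelines, which is the many-to-many behavior the paragraph above describes:

```json
{
    "name": "MyScheduleTrigger",
    "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Hour",
                "interval": 1,
                "startTime": "2024-05-01T09:00:00Z",
                "timeZone": "UTC"
            }
        },
        "pipelines": [
            {
                "pipelineReference": { "referenceName": "PipelineA", "type": "PipelineReference" },
                "parameters": { "inputPath": "adftutorial/input", "outputPath": "adftutorial/output" }
            },
            {
                "pipelineReference": { "referenceName": "PipelineB", "type": "PipelineReference" }
            }
        ]
    }
}
```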
@@ -393,10 +393,10 @@ The following table provides a comparison of the tumbling window trigger and schedule trigger.
 
 ## Event-based trigger
 
-An event-based trigger runs pipelines in response to an event. There are two flavors of eventbased triggers.
+An event-based trigger runs pipelines in response to an event. There are two flavors of event-based triggers.
 
 * _Storage event trigger_ runs a pipeline against events happening in a Storage account, such as the arrival of a file, or the deletion of a file in Azure Blob Storage account.
-* _Custom event trigger_ processes and handles [custom topics](../event-grid/custom-topics.md) in Event Grid
+* _Custom event trigger_ processes and handles [custom articles](../event-grid/custom-topics.md) in Event Grid
 
 For more information about event-based triggers, see [Storage Event Trigger](how-to-create-event-trigger.md) and [Custom Event Trigger](how-to-create-custom-event-trigger.md).
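For the storage flavor, a trigger definition typically scopes to a storage account and lists the blob events it reacts to. The sketch below assumes the Blob event trigger type name `BlobEventsTrigger` and the path-filter properties `blobPathBeginsWith`/`blobPathEndsWith`; the subscription, account, container, and pipeline names are placeholders:

```json
{
    "name": "MyStorageEventTrigger",
    "properties": {
        "type": "BlobEventsTrigger",
        "typeProperties": {
            "scope": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>",
            "events": ["Microsoft.Storage.BlobCreated", "Microsoft.Storage.BlobDeleted"],
            "blobPathBeginsWith": "/input-container/blobs/",
            "blobPathEndsWith": ".csv"
        },
        "pipelines": [
            {
                "pipelineReference": { "referenceName": "PipelineA", "type": "PipelineReference" }
            }
        ]
    }
}
```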
articles/data-factory/control-flow-system-variables.md (+1 -1)
@@ -80,7 +80,7 @@ These system variables can be referenced anywhere in the trigger JSON for triggers
 
 | Variable Name | Description
 | --- | --- |
-|@triggerBody().event.eventType | Type of events that triggered the Custom Event Trigger run. Event type is customerdefined field and take on any values of string type. |
+|@triggerBody().event.eventType | Type of events that triggered the Custom Event Trigger run. Event type is customer-defined field and take on any values of string type. |
 |@triggerBody().event.subject | Subject of the custom event that caused the trigger to fire. |
 |@triggerBody().event.data._keyName_| Data field in custom event is a free from JSON blob, which customer can use to send messages and data. Please use data._keyName_ to reference each field. For example, @triggerBody().event.data.callback returns the value for the _callback_ field stored under _data_. |
 |@trigger().startTime | Time at which the trigger fired to invoke the pipeline run. |
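To show where these variables usually end up, the sketch below maps them onto pipeline parameters inside a custom event trigger definition. The trigger type name `CustomEventsTrigger`, the pipeline name, and the parameter names are illustrative assumptions rather than values from this PR:

```json
{
    "name": "MyCustomEventTrigger",
    "properties": {
        "type": "CustomEventsTrigger",
        "pipelines": [
            {
                "pipelineReference": { "referenceName": "MyPipeline", "type": "PipelineReference" },
                "parameters": {
                    "eventTypeParam": "@triggerBody().event.eventType",
                    "subjectParam": "@triggerBody().event.subject",
                    "callbackParam": "@triggerBody().event.data.callback",
                    "triggerStart": "@trigger().startTime"
                }
            }
        ]
    }
}
```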
articles/data-factory/how-to-create-custom-event-trigger.md (+4 -4)
@@ -110,9 +110,9 @@ As of today custom event trigger supports a __subset__ of [advanced filtering operators]
 * StringIn
 * StringNotIn
 
-Click **+New** to add new filter conditions.
+Select **+New** to add new filter conditions.
 
-Additionally, custom event triggers obey the [same limitations as event grid](../event-grid/event-filtering.md#limitations), including:
+Additionally, custom event triggers obey the [same limitations as Event Grid](../event-grid/event-filtering.md#limitations), including:
 
 * 5 advanced filters and 25 filter values across all the filters per custom event trigger
 * 512 characters per string value
@@ -128,7 +128,7 @@ The following table provides an overview of the schema elements that are related
 
 | JSON element | Description | Type | Allowed values | Required |
 |---|----------------------------|---|---|---|
-|`scope`| The Azure Resource Manager resource ID of the event grid topic. | String | Azure Resource Manager ID | Yes |
+|`scope`| The Azure Resource Manager resource ID of the Event Grid topic. | String | Azure Resource Manager ID | Yes |
 |`events`| The type of events that cause this trigger to fire. | Array of strings || Yes, at least one value is expected. |
 |`subjectBeginsWith`| The `subject` field must begin with the provided pattern for the trigger to fire. For example, _factories_ only fire the trigger for event subjects that start with *factories*. | String || No |
 |`subjectEndsWith`| The `subject` field must end with the provided pattern for the trigger to fire. | String || No |
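Put together, those schema elements sit in the trigger's `typeProperties` block, roughly as in the sketch below. The Event Grid topic path, event name, subject patterns, and pipeline name are placeholders, and the type name `CustomEventsTrigger` is an assumption rather than something stated in this diff:

```json
{
    "name": "MyCustomEventTrigger",
    "properties": {
        "type": "CustomEventsTrigger",
        "typeProperties": {
            "scope": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.EventGrid/topics/<topic-name>",
            "events": ["Contoso.Orders.Created"],
            "subjectBeginsWith": "factories",
            "subjectEndsWith": ".csv"
        },
        "pipelines": [
            {
                "pipelineReference": { "referenceName": "MyPipeline", "type": "PipelineReference" }
            }
        ]
    }
}
```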
@@ -143,7 +143,7 @@ Azure Data Factory uses Azure role-based access control (RBAC) to prohibit unauthorized
 
 To successfully create or update a custom event trigger, you need to sign in to Data Factory with an Azure account that has appropriate access. Otherwise, the operation will fail with an _Access Denied_ error.
 
-Data Factory doesn't require special permission to your Event Grid. You also do *not* need to assign special Azure RBAC permission to the Data Factory service principal for the operation.
+Data Factory doesn't require special permission to your Event Grid. You also do *not* need to assign special Azure RBAC role permission to the Data Factory service principal for the operation.
 
 Specifically, you need `Microsoft.EventGrid/EventSubscriptions/Write` permission on `/subscriptions/####/resourceGroups//####/providers/Microsoft.EventGrid/topics/someTopics`.
articles/data-factory/how-to-create-schedule-trigger.md (+7 -7)
@@ -206,7 +206,7 @@ This section shows you how to use Azure CLI to create, start, and monitor a schedule trigger
 
 ### Sample Code
 
-1. In your working direactory, create a JSON file named **MyTrigger.json** with the trigger's properties. For this example use the following content:
+1. In your working directory, create a JSON file named **MyTrigger.json** with the trigger's properties. For this example use the following content:
 
 > [!IMPORTANT]
 > Before you save the JSON file, set the value of the **startTime** element to the current UTC time. Set the value of the **endTime** element to one hour past the current UTC time.
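To ground that note, **startTime** and **endTime** sit inside the trigger's recurrence object. The sketch below is not the article's exact **MyTrigger.json**, just a pared-down illustration with placeholder times to be replaced as the note instructs:

```json
{
    "properties": {
        "name": "MyTrigger",
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Minute",
                "interval": 15,
                "startTime": "2024-05-01T10:00:00Z",
                "endTime": "2024-05-01T11:00:00Z",
                "timeZone": "UTC"
            }
        },
        "pipelines": [
            {
                "pipelineReference": { "referenceName": "Adfv2QuickStartPipeline", "type": "PipelineReference" },
                "parameters": { "inputPath": "adftutorial/input", "outputPath": "adftutorial/output" }
            }
        ]
    }
}
```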
@@ -252,31 +252,31 @@ This section shows you how to use Azure CLI to create, start, and monitor a schedule trigger
 - The trigger is associated with the **Adfv2QuickStartPipeline** pipeline. To associate multiple pipelines with a trigger, add more **pipelineReference** sections.
 - The pipeline in the Quickstart takes two **parameters** values: **inputPath** and **outputPath**. And you pass values for these parameters from the trigger.
 
-1. Create a trigger by using the [az datafactory trigger create](/cli/azure/datafactory/trigger#az_datafactory_trigger_create) command:
+1. Create a trigger by using the [az data factory trigger create](/cli/azure/datafactory/trigger#az_datafactory_trigger_create) command:
 
-1. Confirm that the status of the trigger is **Stopped** by using the [az datafactory trigger show](/cli/azure/datafactory/trigger#az_datafactory_trigger_show) command:
+1. Confirm that the status of the trigger is **Stopped** by using the [az data factory trigger show](/cli/azure/datafactory/trigger#az_datafactory_trigger_show) command:
 
 ```azurecli
 az datafactory trigger show --resource-group "ADFQuickStartRG" --factory-name "ADFTutorialFactory" --name "MyTrigger"
 ```
 
-1. Start the trigger by using the [az datafactory trigger start](/cli/azure/datafactory/trigger#az_datafactory_trigger_start) command:
+1. Start the trigger by using the [az data factory trigger start](/cli/azure/datafactory/trigger#az_datafactory_trigger_start) command:
 
 ```azurecli
 az datafactory trigger start --resource-group "ADFQuickStartRG" --factory-name "ADFTutorialFactory" --name "MyTrigger"
 ```
 
-1. Confirm that the status of the trigger is **Started** by using the [az datafactory trigger show](/cli/azure/datafactory/trigger#az_datafactory_trigger_show) command:
+1. Confirm that the status of the trigger is **Started** by using the [az data factory trigger show](/cli/azure/datafactory/trigger#az_datafactory_trigger_show) command:
 
 ```azurecli
 az datafactory trigger show --resource-group "ADFQuickStartRG" --factory-name "ADFTutorialFactory" --name "MyTrigger"
 ```
 
-1. Get the trigger runs in Azure CLI by using the [az datafactory trigger-run query-by-factory](/cli/azure/datafactory/trigger-run#az_datafactory_trigger_run_query_by_factory) command. To get information about the trigger runs, execute the following command periodically. Update the **last-updated-after** and **last-updated-before** values to match the values in your trigger definition:
+1. Get the trigger runs in Azure CLI by using the [az data factory trigger-run query-by-factory](/cli/azure/datafactory/trigger-run#az_datafactory_trigger_run_query_by_factory) command. To get information about the trigger runs, execute the following command periodically. Update the **last-updated-after** and **last-updated-before** values to match the values in your trigger definition:
@@ -472,7 +472,7 @@ The following table provides a high-level overview of the major schema elements
 |:--- |:--- |
 |**startTime**| A Date-Time value. For simple schedules, the value of the **startTime** property applies to the first occurrence. For complex schedules, the trigger starts no sooner than the specified **startTime** value. <br> For UTC time zone, format is `'yyyy-MM-ddTHH:mm:ssZ'`, for other time zone, format is `'yyyy-MM-ddTHH:mm:ss'`. |
 |**endTime**| The end date and time for the trigger. The trigger doesn't execute after the specified end date and time. The value for the property can't be in the past. This property is optional. <br> For UTC time zone, format is `'yyyy-MM-ddTHH:mm:ssZ'`, for other time zone, format is `'yyyy-MM-ddTHH:mm:ss'`. |
-|**timeZone**| The time zone the trigger is created in. This setting impact **startTime**, **endTime**, and **schedule**. See [list of supported time zone](#time-zone-option)|
+|**timeZone**| The time zone the trigger is created in. This setting affects **startTime**, **endTime**, and **schedule**. See [list of supported time zone](#time-zone-option)|
 |**recurrence**| A recurrence object that specifies the recurrence rules for the trigger. The recurrence object supports the **frequency**, **interval**, **endTime**, **count**, and **schedule** elements. When a recurrence object is defined, the **frequency** element is required. The other elements of the recurrence object are optional. |
 |**frequency**| The unit of frequency at which the trigger recurs. The supported values include "minute," "hour," "day," "week," and "month." |
 |**interval**| A positive integer that denotes the interval for the **frequency** value, which determines how often the trigger runs. For example, if the **interval** is 3 and the **frequency** is "week," the trigger recurs every 3 weeks. |
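Tying the recurrence-related rows together, a weekly recurrence that uses the optional **schedule** element might look like the sketch below; the nested property names (`weekDays`, `hours`, `minutes`) and the values are illustrative assumptions rather than text from this PR:

```json
{
    "recurrence": {
        "frequency": "Week",
        "interval": 1,
        "startTime": "2024-05-01T09:00:00Z",
        "timeZone": "UTC",
        "schedule": {
            "weekDays": ["Monday", "Wednesday"],
            "hours": [9],
            "minutes": [0]
        }
    }
}
```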
articles/data-factory/how-to-use-trigger-parameterization.md (+2 -2)
@@ -29,15 +29,15 @@ This section shows you how to pass meta data information from trigger to pipeline
 
 1. Go to the **Authoring Canvas** and edit a pipeline
 
-1. Click on the blank canvas to bring up pipeline settings. Do not select any activity. You may need to pull up the setting panel from the bottom of the canvas, as it may have been collapsed
+1. Select on the blank canvas to bring up pipeline settings. Don’t select any activity. You may need to pull up the setting panel from the bottom of the canvas, as it may have been collapsed
 
 1. Select **Parameters** section and select **+ New** to add parameters
 
 :::image type="content" source="media/how-to-use-trigger-parameterization/01-create-parameter.png" alt-text="Screen shot of pipeline setting showing how to define parameters in pipeline.":::
 
 1. Add triggers to pipeline, by clicking on **+ Trigger**.
 
-1. Create or attach a trigger to the pipeline, and click **OK**
+1. Create or attach a trigger to the pipeline, and select **OK**
 
 1. In the following page, fill in trigger meta data for each parameter. Use format defined in [System Variable](control-flow-system-variables.md) to retrieve trigger information. You don't need to fill in the information for all parameters, just the ones that will assume trigger metadata values. For instance, here we assign trigger run start time to *parameter_1*.
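To make that last step concrete, assigning the trigger run start time to *parameter_1* corresponds to a trigger parameter entry like the following sketch (the wrapping object is illustrative; only the expression comes from the step above):

```json
{
    "parameters": {
        "parameter_1": "@trigger().startTime"
    }
}
```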
0 commit comments