
Commit 7767c15

Merge pull request #82904 from djpmsft/docUpdates
Updating Event Trigger Documentation
2 parents: bf52eb4 + 3381fa2

4 files changed (+28 −16 lines)


articles/data-factory/how-to-create-event-trigger.md

Lines changed: 28 additions & 16 deletions
This article describes the event-based triggers that you can create in your Data Factory pipelines.

Event-driven architecture (EDA) is a common data integration pattern that involves production, detection, consumption, and reaction to events. Data integration scenarios often require Data Factory customers to trigger pipelines based on events such as the arrival or deletion of a file in your Azure Storage account. Data Factory is now integrated with [Azure Event Grid](https://azure.microsoft.com/services/event-grid/), which lets you trigger pipelines on an event.

For a ten-minute introduction and demonstration of this feature, watch the following video:

## Data Factory UI
This section shows you how to create an event trigger within the Azure Data Factory User Interface.

1. Go to the **Authoring Canvas**.

2. In the bottom-left corner, click the **Triggers** button.

3. Click **+ New**, which opens the create trigger side nav.

4. Select trigger type **Event**.

![Create new event trigger](media/how-to-create-event-trigger/event-based-trigger-image1.png)

5. Select your storage account from the Azure subscription dropdown, or manually by using its storage account resource ID. Choose the container on which you want the events to occur. Container selection is optional, but be mindful that selecting all containers can lead to a large number of events.

> [!NOTE]
> The Event Trigger currently supports only version 2 Storage accounts (General purpose).

> [!NOTE]
> Due to an Azure Event Grid limitation, Azure Data Factory supports a maximum of 500 event triggers per storage account.

6. The **Blob path begins with** and **Blob path ends with** properties let you specify the containers, folders, and blob names for which you want to receive events. Your event trigger requires at least one of these properties to be defined. You can use a variety of patterns for both the **Blob path begins with** and **Blob path ends with** properties, as shown in the examples later in this article.

* **Blob path begins with:** The blob path must start with a folder path. Valid values include `2018/` and `2018/april/shoes.csv`. This field can't be selected if a container isn't selected.
* **Blob path ends with:** The blob path must end with a file name or extension. Valid values include `shoes.csv` and `.csv`. Container and folder names are optional but, when specified, they must be separated by a `/blobs/` segment. For example, a container named 'orders' can have a value of `/orders/blobs/2018/april/shoes.csv`. To specify a folder in any container, omit the leading '/' character. For example, `april/shoes.csv` will trigger an event on any file named `shoes.csv` in a folder called 'april' in any container.
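
For reference, here is a minimal sketch of how these two filters surface in the trigger's JSON definition. The property names are the ones covered in the JSON schema section later in this article; the 'orders' container value simply reuses the hypothetical example above:

```json
{
    "blobPathBeginsWith": "/orders/blobs/2018/april/",
    "blobPathEndsWith": ".csv"
}
```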

7. Select whether your trigger will respond to a **Blob created** event, a **Blob deleted** event, or both. In your specified storage location, each event will trigger the Data Factory pipelines associated with the trigger. (A sketch of how this selection appears in the trigger JSON follows the screenshot below.)

![Configure the event trigger](media/how-to-create-event-trigger/event-based-trigger-image2.png)
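
If both event types are selected, the `events` list in the trigger's JSON definition names both Azure Storage event types published through Event Grid. A minimal sketch, assuming the surrounding trigger definition shown in the JSON schema section later in this article:

```json
"events": [
    "Microsoft.Storage.BlobCreated",
    "Microsoft.Storage.BlobDeleted"
]
```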

8. Once you've configured your trigger, click **Next: Data preview**. This screen shows the existing blobs matched by your event trigger configuration. Make sure your filters are specific: configuring filters that are too broad can match a large number of created or deleted files and may significantly impact your cost. Once your filter conditions have been verified, click **Finish**.

![Event trigger data preview](media/how-to-create-event-trigger/event-based-trigger-image3.png)

9. To attach a pipeline to this trigger, go to the pipeline canvas, click **Add trigger**, and select **New/Edit**. When the side nav appears, click the **Choose trigger...** dropdown and select the trigger you created. Click **Next: Data preview** to confirm that the configuration is correct, and then click **Next** to validate that the data preview is correct.

10. If your pipeline has parameters, you can specify them on the trigger run parameters side nav. The event trigger captures the folder path and file name of the blob in the properties `@triggerBody().folderPath` and `@triggerBody().fileName`. To use the values of these properties in a pipeline, you must map the properties to pipeline parameters. After mapping the properties to parameters, you can access the values captured by the trigger through the `@pipeline().parameters.parameterName` expression throughout the pipeline. Click **Finish** once you are done.

![Mapping properties to pipeline parameters](media/how-to-create-event-trigger/event-based-trigger-image4.png)

In the preceding example, the trigger is configured to fire when a blob path ending in `.csv` is created in the folder `event-testing` in the container `sample-data`. The **folderPath** and **fileName** properties capture the location of the new blob. For example, when `MoviesDB.csv` is added to the path `sample-data/event-testing`, `@triggerBody().folderPath` has a value of `sample-data/event-testing` and `@triggerBody().fileName` has a value of `MoviesDB.csv`. These values are mapped in the example to the pipeline parameters `sourceFolder` and `sourceFile`, which can be used throughout the pipeline as `@pipeline().parameters.sourceFolder` and `@pipeline().parameters.sourceFile` respectively.
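
In the trigger's JSON definition, this parameter mapping lives in the `pipelines` list. A minimal sketch, assuming a pipeline named `EventDrivenPipeline` (a hypothetical name) that declares `sourceFolder` and `sourceFile` parameters:

```json
"pipelines": [
    {
        "pipelineReference": {
            "type": "PipelineReference",
            "referenceName": "EventDrivenPipeline"
        },
        "parameters": {
            "sourceFolder": "@triggerBody().folderPath",
            "sourceFile": "@triggerBody().fileName"
        }
    }
]
```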

## JSON schema

This section provides examples of event-based trigger settings.

> [!IMPORTANT]
> You have to include the `/blobs/` segment of the path, as shown in the following examples, whenever you specify container and folder, container and file, or container, folder, and file. For **blobPathBeginsWith**, the Data Factory UI will automatically add `/blobs/` between the folder and container name in the trigger JSON.
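
Pulling these pieces together, here is a sketch of what a complete trigger definition could look like for the `sample-data`/`event-testing` example from earlier, before the per-property examples below. The subscription, resource group, and storage account placeholders, and the pipeline name `EventDrivenPipeline`, are hypothetical:

```json
{
    "name": "EventTrigger",
    "properties": {
        "type": "BlobEventsTrigger",
        "typeProperties": {
            "blobPathBeginsWith": "/sample-data/blobs/event-testing/",
            "blobPathEndsWith": ".csv",
            "scope": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>",
            "events": [ "Microsoft.Storage.BlobCreated" ]
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "type": "PipelineReference",
                    "referenceName": "EventDrivenPipeline"
                },
                "parameters": {
                    "sourceFolder": "@triggerBody().folderPath",
                    "sourceFile": "@triggerBody().fileName"
                }
            }
        ]
    }
}
```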
| Property | Example | Description |
7991
|---|---|---|
3 binary files changed: 38.2 KB, 15.8 KB, -11.4 KB
