Commit ce35810

Update control-flow-execute-data-flow-activity.md
1 parent ddbb574 commit ce35810

File tree: 1 file changed (+9 −1 lines)

1 file changed

+9
-1
lines changed

articles/data-factory/control-flow-execute-data-flow-activity.md

Lines changed: 9 additions & 1 deletion
```diff
@@ -8,7 +8,7 @@ ms.service: data-factory
 ms.workload: data-services
 ms.topic: conceptual
 ms.author: makromer
-ms.date: 01/02/2020
+ms.date: 03/16/2020
 ---
 
 # Data Flow activity in Azure Data Factory
@@ -58,6 +58,14 @@ staging.folderPath | If you're using a SQL DW source or sink, the folder path in
 
 ![Execute Data Flow](media/data-flow/activity-data-flow.png "Execute Data Flow")
 
+### Dynamically size data flow compute at runtime
+
+The Core Count and Compute Type properties can be set dynamically to adjust to the size of your incoming source data at runtime. Use pipeline activities like Lookup or Get Metadata to find the size of the source dataset. Then, use Add Dynamic Content in the Data Flow activity properties.
+
+![Dynamic Data Flow](media/data-flow/dyna1.png "Dynamic data flow")
+
+[Here is a brief video tutorial explaining this technique](https://www.youtube.com/watch?v=jWSkJdtiJNM).
+
 ### Data Flow integration runtime
 
 Choose which Integration Runtime to use for your Data Flow activity execution. By default, Data Factory will use the auto-resolve Azure Integration runtime with four worker cores and no time to live (TTL). This IR has a general purpose compute type and runs in the same region as your factory. You can create your own Azure Integration Runtimes that define specific regions, compute type, core counts, and TTL for your data flow activity execution.
```
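The dynamic-sizing pattern added in this commit could be sketched in pipeline JSON roughly as follows. This is an illustrative fragment, not part of the commit: the Lookup activity name `GetSourceSize`, the `rowCount` field, and the one-million-row threshold are all hypothetical, and only `coreCount` is made dynamic here.

```json
{
  "name": "ExecuteDataFlow1",
  "type": "ExecuteDataFlow",
  "dependsOn": [
    { "activity": "GetSourceSize", "dependencyConditions": [ "Succeeded" ] }
  ],
  "typeProperties": {
    "dataFlow": { "referenceName": "MyDataFlow", "type": "DataFlowReference" },
    "compute": {
      "computeType": "General",
      "coreCount": {
        "value": "@if(greater(activity('GetSourceSize').output.firstRow.rowCount, 1000000), 16, 8)",
        "type": "Expression"
      }
    }
  }
}
```

In the authoring UI, the same expression would be entered through Add Dynamic Content on the Core Count property, as the new section describes.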
