Commit 67fa15a

Merge pull request #113347 from djpmsft/docUpdates

adding ir note to data flow activity

2 parents: 8107778 + 5821fe9

File tree

1 file changed: +4 −4 lines changed
articles/data-factory/control-flow-execute-data-flow-activity.md

Lines changed: 4 additions & 4 deletions
@@ -8,7 +8,7 @@ ms.service: data-factory
 ms.workload: data-services
 ms.topic: conceptual
 ms.author: makromer
-ms.date: 04/25/2020
+ms.date: 04/30/2020
 ---

 # Data Flow activity in Azure Data Factory
@@ -52,7 +52,7 @@ Use the Data Flow activity to transform and move data via mapping data flows. If
 Property | Description | Allowed values | Required
 -------- | ----------- | -------------- | --------
 dataflow | The reference to the Data Flow being executed | DataFlowReference | Yes
-integrationRuntime | The compute environment the data flow runs on. If not specified, the auto-resolve Azure Integration runtime will be used | IntegrationRuntimeReference | No
+integrationRuntime | The compute environment the data flow runs on. If not specified, the auto-resolve Azure integration runtime will be used. Only integration runtimes of region auto-resolve are supported. | IntegrationRuntimeReference | No
 compute.coreCount | The number of cores used in the spark cluster. Can only be specified if the auto-resolve Azure Integration runtime is used | 8, 16, 32, 48, 80, 144, 272 | No
 compute.computeType | The type of compute used in the spark cluster. Can only be specified if the auto-resolve Azure Integration runtime is used | "General", "ComputeOptimized", "MemoryOptimized" | No
 staging.linkedService | If you're using a SQL DW source or sink, the storage account used for PolyBase staging | LinkedServiceReference | Only if the data flow reads or writes to a SQL DW
@@ -70,13 +70,13 @@ The Core Count and Compute Type properties can be set dynamically to adjust to t

 ### Data Flow integration runtime

-Choose which Integration Runtime to use for your Data Flow activity execution. By default, Data Factory will use the auto-resolve Azure Integration runtime with four worker cores and no time to live (TTL). This IR has a general purpose compute type and runs in the same region as your factory. You can create your own Azure Integration Runtimes that define specific regions, compute type, core counts, and TTL for your data flow activity execution.
+Choose which Integration Runtime to use for your Data Flow activity execution. By default, Data Factory will use the auto-resolve Azure Integration runtime with four worker cores and no time to live (TTL). This IR has a general purpose compute type and runs in the same region as your factory. You can create your own Azure Integration Runtimes that define specific regions, compute type, core counts, and TTL for your data flow activity execution. At this time, only integration runtimes of region auto-resolve are supported in the data flow activity.

 For pipeline executions, the cluster is a job cluster, which takes several minutes to start up before execution starts. If no TTL is specified, this start-up time is required on every pipeline run. If you specify a TTL, a warm cluster pool will stay active for the time specified after the last execution, resulting in shorter start-up times. For example, if you have a TTL of 60 minutes and run a data flow on it once an hour, the cluster pool will stay active. For more information, see [Azure integration runtime](concepts-integration-runtime.md).

 ![Azure Integration Runtime](media/data-flow/ir-new.png "Azure Integration Runtime")

-> [!NOTE]
+> [!IMPORTANT]
 > The Integration Runtime selection in the Data Flow activity only applies to *triggered executions* of your pipeline. Debugging your pipeline with data flows runs on the cluster specified in the debug session.

 ### PolyBase
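For context on the property table this commit edits: the properties map onto the `typeProperties` of an Execute Data Flow activity in a pipeline definition. The sketch below shows one plausible shape, assuming hypothetical resource names (`MyDataFlow`, `MyDataFlowIR`, `MyStagingStorage`); it is an illustration of how the documented properties fit together, not a payload taken from this commit.

```json
{
    "name": "MyDataFlowActivity",
    "type": "ExecuteDataFlow",
    "typeProperties": {
        "dataflow": {
            "referenceName": "MyDataFlow",
            "type": "DataFlowReference"
        },
        "integrationRuntime": {
            "referenceName": "MyDataFlowIR",
            "type": "IntegrationRuntimeReference"
        },
        "compute": {
            "coreCount": 8,
            "computeType": "General"
        },
        "staging": {
            "linkedService": {
                "referenceName": "MyStagingStorage",
                "type": "LinkedServiceReference"
            }
        }
    }
}
```

Per the table, `integrationRuntime`, `compute.coreCount`, and `compute.computeType` are optional, and the `compute` settings apply only when the auto-resolve Azure integration runtime is used; `staging` is required only when the data flow reads from or writes to a SQL DW.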
