Commit 5f98419

Update azure-data-factory-limits.md
Acrolinx updates
1 parent 9bca020 commit 5f98419

File tree

1 file changed (+3, -3 lines)


includes/azure-data-factory-limits.md

Lines changed: 3 additions & 3 deletions
@@ -15,7 +15,7 @@ Azure Data Factory is a multitenant service that has the following default limit
 | -------- | ------------- | ------------- |
 | Total number of entities, such as pipelines, data sets, triggers, linked services, Private Endpoints, and integration runtimes, within a data factory | 5,000 |5,000 |
 | Total CPU cores for Azure-SSIS Integration Runtimes under one subscription | 64 | [Find out how to request a quota increase from support](https://azure.microsoft.com/blog/azure-limits-quotas-increase-requests/). |
-| Concurrent pipeline runs per data factory that's shared among all pipelines in the factory | 10,000 | 10,000 |
+| Concurrent pipeline runs per data factory shared among all pipelines in the factory | 10,000 | 10,000 |
 | Concurrent External activity runs per subscription per [Azure Integration Runtime region](../articles/data-factory/concepts-integration-runtime.md#azure-ir-location)<br>External activities are managed on integration runtime but execute on linked services, including Databricks, stored procedure, Web, and others. This limit doesn't apply to Self-hosted IR. | 3,000 | 3,000 |
 | Concurrent Pipeline activity runs per subscription per [Azure Integration Runtime region](../articles/data-factory/concepts-integration-runtime.md#azure-ir-location) <br>Pipeline activities execute on integration runtime, including Lookup, GetMetadata, and Delete. This limit doesn't apply to Self-hosted IR. | 1,000 | 1,000 |
 | Concurrent authoring operations per subscription per [Azure Integration Runtime region](../articles/data-factory/concepts-integration-runtime.md#azure-ir-location)<br>Including test connection, browse folder list and table list, preview data. This limit doesn't apply to Self-hosted IR. | 200 | 200 |
@@ -43,7 +43,7 @@ Azure Data Factory is a multitenant service that has the following default limit
 | Concurrent number of data flows per integration runtime | 50 | 50 |
 | Concurrent number of data flows per integration runtime in managed vNet| 50 | 50 |
 | Concurrent number of data flow debug sessions per user per factory | 3 | 3 |
-| Data Flow Azure IR TTL limit | 4 hrs | 4 hrs |
+| Data Flow Azure IR TTL (time to live) limit | 4 hrs | 4 hrs |
 | Meta Data Entity Size limit in a factory | 2 GB | 2 GB |

 <sup>1</sup> The data integration unit (DIU) is used in a cloud-to-cloud copy operation. Learn more from [Data integration units (version 2)](../articles/data-factory/copy-activity-performance.md#data-integration-units). For information on billing, see [Azure Data Factory pricing](https://azure.microsoft.com/pricing/details/data-factory/).
@@ -60,7 +60,7 @@ If managed virtual network is enabled, the data integration unit (DIU) in all re

 <sup>3</sup> Pipeline, data set, and linked service objects represent a logical grouping of your workload. Limits for these objects don't relate to the amount of data you can move and process with Azure Data Factory. Data Factory is designed to scale to handle petabytes of data.

-<sup>4</sup> The payload for each activity run includes the activity configuration, the associated dataset(s) and linked service(s) configurations if any, and a small portion of system properties generated per activity type. Limit for this payload size doesn't relate to the amount of data you can move and process with Azure Data Factory. Learn about the [symptoms and recommendation](../articles/data-factory/data-factory-troubleshoot-guide.md#payload-is-too-large) if you hit this limit.
+<sup>4</sup> The payload for each activity run includes the activity configuration, one or more associated datasets, and linked service configurations if any, and a small portion of system properties generated per activity type. Limit for this payload size doesn't relate to the amount of data you can move and process with Azure Data Factory. Learn about the [symptoms and recommendation](../articles/data-factory/data-factory-troubleshoot-guide.md#payload-is-too-large) if you hit this limit.

 #### Web service call limits
 Azure Resource Manager has limits for API calls. You can make API calls at a rate within the [Azure Resource Manager API limits](../articles/azure-resource-manager/management/azure-subscription-service-limits.md#azure-resource-group-limits).
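When a client exceeds these API-call limits, Azure Resource Manager responds with HTTP 429 (Too Many Requests) and a `Retry-After` header. A minimal client-side sketch of honoring that header, using a hypothetical `send` callable rather than a real ARM request (the helper name and its `(status, headers, body)` contract are assumptions for illustration):

```python
import time


def call_with_backoff(send, max_retries=5):
    """Retry a request callable while the service is throttling.

    `send` is a hypothetical zero-argument callable returning
    (status_code, headers, body). On HTTP 429 we wait for the number
    of seconds given in the Retry-After header (falling back to
    exponential backoff if the header is absent), then retry.
    """
    for attempt in range(max_retries):
        status, headers, body = send()
        if status != 429:
            return status, body
        # Honor the service's throttling hint before retrying.
        delay = float(headers.get("Retry-After", 2 ** attempt))
        time.sleep(delay)
    raise RuntimeError(f"Request still throttled after {max_retries} retries")
```

This is a sketch of the general pattern, not Data Factory-specific behavior; the official Azure SDKs include equivalent retry policies out of the box.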
