Azure Data Factory is a serverless and elastic data integration service built for cloud scale. This means there is no fixed-size compute that you need to plan for peak load; rather, you specify how much resource to allocate on demand per operation, which allows you to design your ETL processes in a much more scalable manner. In addition, ADF is billed on a consumption-based plan, which means you only pay for what you use.
This article describes how you can plan and manage costs for Azure Data Factory.
One of the commonly asked questions about the pricing calculator is what values should be used as inputs.
For example, let's say you need to move 1 TB of data daily from AWS S3 to Azure Data Lake Storage Gen2. You can perform a proof of concept (POC) of moving 100 GB of data to measure the data ingestion throughput and understand the corresponding billing consumption.
Here is a sample copy activity run detail (your actual mileage will vary based on the shape of your specific dataset, network speeds, egress limits on the S3 account, ingress limits on the ADLS Gen2 account, and other factors).
By leveraging the [consumption monitoring at pipeline-run level](#monitor-consumption-at-pipeline-run-level), you can see the corresponding data movement meter consumption quantities.
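To turn a POC measurement like this into inputs for the pricing calculator, one option is to scale the observed meter quantities up to the full workload. The sketch below is a rough extrapolation under the assumption of approximately linear scaling; every number in it is an illustrative placeholder, not a real consumption figure.

```python
# Rough extrapolation from a 100 GB POC copy run to a 1 TB/day workload.
# All quantities below are illustrative placeholders; substitute the meter
# consumption reported for your own pipeline run.

poc_gb = 100                  # data volume moved in the POC run
poc_diu_hours = 0.25          # data movement (DIU-hour) consumption of the POC run (placeholder)
poc_activity_runs = 2         # orchestration activity runs in the POC pipeline (placeholder)

daily_gb = 1024               # target workload: 1 TB per day
days_per_month = 30

scale = daily_gb / poc_gb     # assumes roughly linear scaling; validate with a larger POC

monthly_diu_hours = poc_diu_hours * scale * days_per_month
monthly_activity_runs = poc_activity_runs * days_per_month

print(f"Estimated monthly data movement: {monthly_diu_hours:.0f} DIU-hours")
print(f"Estimated monthly activity runs: {monthly_activity_runs}")
# Use these quantities as inputs to the Azure pricing calculator to get an
# estimate in your region and currency.
```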
As you start using Azure Data Factory, you can see the costs incurred in the cost analysis pane of the Azure portal.
2. The default view shows accumulated costs for the current month. You can switch to a different time range and a different granularity such as daily or monthly.
3. To narrow costs for a single service such as Azure Data Factory, select **Add filter** and then select **Service name**. Then choose **Azure data factory v2** from the list.
4. You can add more filters to analyze costs for a specific factory instance and a specific ADF meter granularity (a programmatic equivalent is sketched below).
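The same filter can also be applied programmatically. The following is a minimal sketch that calls the Cost Management query API with Python's `requests` and `azure-identity`; the subscription scope, API version, and the way the response rows are printed are assumptions you would adapt to your environment.

```python
# Minimal sketch: query month-to-date cost, filtered to the Azure Data Factory
# v2 service name, with daily granularity (mirroring the portal filters above).
# Requires: pip install azure-identity requests
import requests
from azure.identity import DefaultAzureCredential

subscription_id = "<your-subscription-id>"   # placeholder
scope = f"/subscriptions/{subscription_id}"
url = (
    "https://management.azure.com"
    f"{scope}/providers/Microsoft.CostManagement/query"
    "?api-version=2021-10-01"                # adjust the API version if needed
)

token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token

body = {
    "type": "ActualCost",
    "timeframe": "MonthToDate",
    "dataset": {
        "granularity": "Daily",
        "aggregation": {"totalCost": {"name": "Cost", "function": "Sum"}},
        "filter": {
            "dimensions": {
                "name": "ServiceName",
                "operator": "In",
                "values": ["Azure Data Factory v2"],
            }
        },
    },
}

resp = requests.post(url, json=body, headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()
result = resp.json()["properties"]
print([c["name"] for c in result["columns"]])   # column names, e.g. cost, date, currency
for row in result["rows"]:
    print(row)
```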
Depending on the types of activities in your pipeline, how much data you move and transform, and the complexity of the transformations, executing a pipeline spins different billing meters in Azure Data Factory.
You can view the amount of consumption for different meters for individual pipeline runs in the Azure Data Factory user experience. To open the monitoring experience, select the **Monitor & Manage** tile in the data factory blade of the [Azure portal](https://portal.azure.com/). If you're already in the ADF UX, click the **Monitor** icon on the left sidebar. The default monitoring view is a list of pipeline runs.
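If you prefer to pull the same pipeline-run list programmatically, here is a minimal sketch using the `azure-mgmt-datafactory` Python SDK. The resource names are placeholders and the one-day lookback window is an arbitrary choice.

```python
# Minimal sketch: list pipeline runs from the last 24 hours, similar to the
# default pipeline-runs view in the ADF monitoring UI.
# Requires: pip install azure-identity azure-mgmt-datafactory
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters

subscription_id = "<your-subscription-id>"   # placeholders
resource_group = "<your-resource-group>"
factory_name = "<your-factory-name>"

client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

now = datetime.now(timezone.utc)
runs = client.pipeline_runs.query_by_factory(
    resource_group,
    factory_name,
    RunFilterParameters(
        last_updated_after=now - timedelta(days=1),
        last_updated_before=now,
    ),
)

for run in runs.value:
    print(run.run_id, run.pipeline_name, run.status, run.duration_in_ms)
```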
Clicking the **Consumption** button next to the pipeline name will display a pop-up window showing you the consumption for your pipeline run aggregated across all of the activities within the pipeline.
The pipeline run consumption view shows you the amount consumed for each ADF meter for the specific pipeline run, but it does not show the actual price charged, because the amount billed to you depends on the type of Azure account you have and the currency used. To view the full list of supported account types, see [Understand Cost Management data](https://docs.microsoft.com/azure/cost-management-billing/costs/understand-cost-mgt-data).
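If you still want a rough figure, you can multiply the reported meter quantities by the unit prices published for your region and agreement on the Azure Data Factory pricing page. In the sketch below, the meter names, quantities, and rates are all placeholders rather than real prices.

```python
# Back-of-the-envelope estimate: multiply the per-meter quantities from the
# pipeline run consumption view by your own unit prices.
# All names, quantities, and rates are placeholders.

consumption = {                          # quantities reported for one pipeline run
    "Activity Runs": 0.002,              # in units of 1,000 runs
    "Data movement (DIU-hour)": 0.75,
    "Data flow (vCore-hour)": 2.0,
}

unit_price = {                           # look up real rates for your region/currency
    "Activity Runs": 1.00,               # per 1,000 runs (placeholder)
    "Data movement (DIU-hour)": 0.25,    # per DIU-hour (placeholder)
    "Data flow (vCore-hour)": 0.27,      # per vCore-hour (placeholder)
}

estimate = sum(qty * unit_price[meter] for meter, qty in consumption.items())
print(f"Approximate charge for this run: {estimate:.4f} (in your billing currency)")
```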
Once you understand the aggregated consumption at pipeline-run level, there are scenarios where you need to drill down further and identify the most costly activity within the pipeline.
To see the consumption at activity-run level, go to your data factory **Author & Monitor** UI. From the **Monitor** tab where you see a list of pipeline runs, click the **pipeline name** link to access the list of activity runs in the pipeline run. Click the **Output** button next to the activity name and look for the **billableDuration** property in the JSON output.
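The same output is also available programmatically. The sketch below uses the `azure-mgmt-datafactory` SDK under the same assumptions as the earlier pipeline-run sketch; the run ID is a placeholder, and the exact shape of `billableDuration` varies by activity and integration runtime type.

```python
# Minimal sketch: query the activity runs of one pipeline run and print each
# activity's billableDuration from its output JSON.
# Requires: pip install azure-identity azure-mgmt-datafactory
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters

subscription_id = "<your-subscription-id>"   # placeholders
resource_group = "<your-resource-group>"
factory_name = "<your-factory-name>"
run_id = "<pipeline-run-id>"                 # copy this from the monitoring view

client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)
now = datetime.now(timezone.utc)

activity_runs = client.activity_runs.query_by_pipeline_run(
    resource_group,
    factory_name,
    run_id,
    RunFilterParameters(
        last_updated_after=now - timedelta(days=1),
        last_updated_before=now,
    ),
)

for activity in activity_runs.value:
    output = activity.output or {}           # only billable activities report it
    print(activity.activity_name, output.get("billableDuration"))
```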