Commit 278d236

Update pricing-concepts.md
1 parent fc969f2 commit 278d236

articles/data-factory/pricing-concepts.md

Lines changed: 8 additions & 8 deletions
```diff
@@ -9,7 +9,7 @@ ms.reviewer: maghan
 ms.service: data-factory
 ms.workload: data-services
 ms.topic: conceptual
-ms.date: 09/25/2018
+ms.date: 12/27/2019
 ---
 
 # Understanding Data Factory pricing through examples
@@ -121,13 +121,13 @@ To accomplish the scenario, you need to create a pipeline with the following ite
 - Pipeline Activity = $0.00003 (Prorated for 1 minute of execution time. $0.002/hour on Azure Integration Runtime)
 - External Pipeline Activity = $0.000041 (Prorated for 10 minutes of execution time. $0.00025/hour on Azure Integration Runtime)
 
-## Using mapping data flow debug for a normal workday (Preview Pricing)
+## Using mapping data flow debug for a normal workday
 
-As a Data Engineer, you are responsible for designing, building, and testing mapping data flows every day. You log into the ADF UI in the morning and enable the Debug mode for Data Flows. The default TTL for Debug sessions is 60 minutes. You work throughout the day for 10 hours, so your Debug session never expires. Therefore, your charge for the day will be:
+As a Data Engineer, you are responsible for designing, building, and testing mapping data flows every day. You log into the ADF UI in the morning and enable the Debug mode for Data Flows. The default TTL for Debug sessions is 60 minutes. You work throughout the day for 8 hours, so your Debug session never expires. Therefore, your charge for the day will be:
 
-**10 (hours) x 8 (cores) x $0.112 = $8.96**
+**8 (hours) x 8 (compute-optimized cores) x $0.193 = $12.35**
 
-## Transform data in blob store with mapping data flows (Preview Pricing)
+## Transform data in blob store with mapping data flows
 
 In this scenario, you want to transform data in Blob Store visually in ADF mapping data flows on an hourly schedule.
 
```
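The updated debug-session figure in the hunk above (8 hours of an active session on 8 compute-optimized cores at $0.193 per vCore-hour) can be sanity-checked with a short script. The rate and core count are taken from the diff; the helper function is an illustrative sketch, not an ADF API, and actual Azure rates vary by region.

```python
# Sanity check of the updated debug-session charge from the diff above.
# The $0.193 per vCore-hour compute-optimized rate and the 8-core count
# come from the commit; real Azure Data Factory rates vary by region.

def debug_session_cost(hours, cores, rate_per_vcore_hour):
    """Data flow debug sessions bill per vCore-hour while the session is active."""
    return hours * cores * rate_per_vcore_hour

cost = debug_session_cost(hours=8, cores=8, rate_per_vcore_hour=0.193)
print(f"${cost:.2f}")  # prints $12.35 (8 x 8 x 0.193 = 12.352, shown rounded)
```

This matches the new **$12.35** figure, confirming the commit swapped both the workday length (10 → 8 hours) and the rate ($0.112 → $0.193) consistently.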

```diff
@@ -148,17 +148,17 @@ To accomplish the scenario, you need to create a pipeline with the following ite
 | Create Pipeline | 3 Read/Write entities (1 for pipeline creation, 2 for dataset references) |
 | Get Pipeline | 1 Read/Write entity |
 | Run Pipeline | 2 Activity runs (1 for trigger run, 1 for activity runs) |
-| Data Flow Assumptions: execution time = 10 min + 10 min TTL | 10 \* 8 cores of General Compute with TTL of 10 |
+| Data Flow Assumptions: execution time = 10 min + 10 min TTL | 10 \* 16 cores of General Compute with TTL of 10 |
 | Monitor Pipeline Assumption: Only 1 run occurred | 2 Monitoring run records retried (1 for pipeline run, 1 for activity run) |
 
 **Total Scenario pricing: $0.3011**
 
 - Data Factory Operations = **$0.0001**
 - Read/Write = 10\*00001 = $0.0001 [1 R/W = $0.50/50000 = 0.00001]
 - Monitoring = 2\*000005 = $0.00001 [1 Monitoring = $0.25/50000 = 0.000005]
-- Pipeline Orchestration & Execution = **$0.301**
+- Pipeline Orchestration & Execution = **$1.463**
 - Activity Runs = 001\*2 = 0.002 [1 run = $1/1000 = 0.001]
-- Data Flow Activities = $0.299 Prorated for 20 minutes (10 mins execution time + 10 mins TTL). $0.112/hour on Azure Integration Runtime with 8 cores general compute
+- Data Flow Activities = $1.461 prorated for 20 minutes (10 mins execution time + 10 mins TTL). $0.274/hour on Azure Integration Runtime with 16 cores general compute
 
 ## Next steps
 
```
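The scenario figures in the last hunk can be checked the same way. All rates below ($0.274 per vCore-hour general compute, $1 per 1,000 activity runs, $0.50 and $0.25 per 50,000 read/write and monitoring operations) are taken from the diff; the helper is an illustrative sketch, not an ADF API.

```python
# Recompute the blob-store scenario totals using the figures from the diff.
# All rates come from the commit text; this is a sketch, not an ADF API.

def data_flow_cost(minutes, cores, rate_per_vcore_hour):
    """Prorated data flow charge: billed per vCore-hour, including TTL minutes."""
    return (minutes / 60) * cores * rate_per_vcore_hour

operations = 10 * (0.50 / 50000) + 2 * (0.25 / 50000)   # read/write + monitoring entities
activity_runs = 2 * (1 / 1000)                          # 1 trigger run + 1 activity run
flow = data_flow_cost(minutes=20, cores=16, rate_per_vcore_hour=0.274)

print(f"Data Flow Activities:      ${flow:.3f}")                  # $1.461
print(f"Orchestration & Execution: ${activity_runs + flow:.3f}")  # $1.463
```

Both updated component figures check out. Note that the `Total Scenario pricing: $0.3011` context line was not touched by this commit; with the new rates the components sum to roughly $1.4634, so that total still reflects the old pricing.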
