Commit bb2219b

Merge pull request #208902 from jonburchel/2022-08-23-pricing-page-updates
Pricing page updates
2 parents: b847261 + 0ccd63b

20 files changed (+464 −214 lines)

articles/data-factory/TOC.yml

Lines changed: 20 additions & 2 deletions

@@ -1101,8 +1101,26 @@ items:
       href: plan-manage-costs.md
       displayName: diu
     - name: Pricing examples
-      href: pricing-concepts.md
-      displayName: diu
+      items:
+        - name: Overview
+          href: pricing-concepts.md
+          displayName: diu
+        - name: Copy data from AWS S3 to Azure Blob storage
+          href: pricing-examples-s3-to-blob.md
+        - name: Copy data and transform with Azure Databricks
+          href: pricing-examples-copy-transform-azure-databricks.md
+        - name: Copy/transform data with dynamic parameters
+          href: pricing-examples-copy-transform-dynamic-parameters.md
+        - name: Run SSIS packages on Azure-SSIS integration runtime
+          href: pricing-examples-ssis-on-azure-ssis-integration-runtime.md
+        - name: Using mapping data flow debug for a workday
+          href: pricing-examples-mapping-data-flow-debug-workday.md
+        - name: Transform blob data with mapping data flows
+          href: pricing-examples-transform-mapping-data-flows.md
+        - name: Data integration with Managed VNET
+          href: pricing-examples-data-integration-managed-vnet.md
+        - name: Get delta data from SAP ECC via SAP CDC in mapping data flows
+          href: pricing-examples-get-delta-data-from-sap-ecc.md
     - name: Troubleshooting guides
       items:
         - name: Azure Data Factory Studio

articles/data-factory/better-understand-different-integration-runtime-charges.md

Lines changed: 1 addition & 1 deletion

@@ -18,7 +18,7 @@ In this article, we'll illustrate the pricing model using different integration
 The integration runtime, which is serverless in Azure and self-hosted in hybrid scenarios, provides the compute resources used to execute the activities in a pipeline. Integration runtime charges are prorated by the minute and rounded up.
 
 > [!NOTE]
-> The prices used in these examples below are hypothetical and are not intended to imply actual pricing.
+> The prices used in this example below are hypothetical and are not intended to imply actual pricing.
 
 ## Azure integration runtime
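The proration rule quoted in this hunk ("prorated by the minute and rounded up") can be sketched in Python. This is a minimal illustration of the rounding arithmetic only; the rate value is hypothetical, not an actual Azure price:

```python
import math

def integration_runtime_charge(seconds_used: float, rate_per_hour: float) -> float:
    """Charge for an integration runtime run, prorated by the minute with
    partial minutes rounded up. rate_per_hour is a hypothetical rate used
    purely for illustration."""
    billable_minutes = math.ceil(seconds_used / 60)
    return billable_minutes * (rate_per_hour / 60)

# A 61-second run bills as 2 full minutes, not 1.02 minutes.
```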

articles/data-factory/frequently-asked-questions.yml

Lines changed: 12 additions & 8 deletions

@@ -7,7 +7,7 @@ metadata:
   ms.service: data-factory
   ms.subservice:
   ms.topic: faq
-  ms.date: 08/05/2022
+  ms.date: 08/24/2022
 title: Azure Data Factory FAQ
 summary: |
   [!INCLUDE[appliesto-adf-xxx-md](includes/appliesto-adf-xxx-md.md)]

@@ -49,7 +49,7 @@ sections:
 - Looping containers:
   * The foreach activity will iterate over a specified collection of activities in a loop.
 - Trigger-based flows:
-  - Pipelines can be triggered on demand, by wall-clock time, or in response to driven by event grid topics
+  - Pipelines can be triggered on demand, by wall-clock time, or in response to driven by Event Grid topics
 - Delta flows:
   - Parameters can be used to define your high-water mark for delta copy while moving dimension or reference tables from a relational store, either on-premises or in the cloud, to load the data into the lake.
 

@@ -223,14 +223,18 @@ sections:
 ### How do I gracefully handle null values in an activity output?
 
 You can use the `@coalesce` construct in the expressions to handle null values gracefully.
+
+### How many pipeline activities can be executed simultaneously?
+
+A maximum of 50 concurrent pipeline activities is allowed. The 51st pipeline activity will be queued until a free slot is opened up. A maximum of 800 concurrent external activities will be allowed, after which they will be queued in the same way.
 
 - question: |
     Mapping data flows
   answer: |
 ### I need help troubleshooting my data flow logic. What info do I need to provide to get help?
 
 When Microsoft provides help or troubleshooting with data flows, please provide the ADF pipeline support files.
-This Zip file contains the code-behind script from your data flow graph. From the ADF UI, click **...** next to pipeline, and then click **Download support files**.
+This Zip file contains the code-behind script from your data flow graph. From the ADF UI, select **...** next to pipeline, and then select **Download support files**.
 
 ### How do I access data by using the other 90 dataset types in Data Factory?
 

@@ -240,23 +244,23 @@ sections:
 
 ### Is the self-hosted integration runtime available for data flows?
 
-Self-hosted IR is an ADF pipeline construct that you can use with the Copy Activity to acquire or move data to and from on-prem or VM-based data sources and sinks. The virtual machines that you use for a self-hosted IR can also be placed inside of the same VNET as your protected data stores for access to those data stores from ADF. With data flows, you'll achieve these same end-results using the Azure IR with managed VNET instead.
+Self-hosted IR is an ADF pipeline construct that you can use with the Copy Activity to acquire or move data to and from on-premises or VM-based data sources and sinks. The virtual machines that you use for a self-hosted IR can also be placed inside of the same VNET as your protected data stores for access to those data stores from ADF. With data flows, you'll achieve these same end-results using the Azure IR with managed VNET instead.
 
 ### Does the data flow compute engine serve multiple tenants?
 
 Clusters are never shared. We guarantee isolation for each job run in production runs. In case of debug scenario one person gets one cluster, and all debugs will go to that cluster which are initiated by that user.
 
-### Is there a way to write attributes in cosmos db in the same order as specified in the sink in ADF data flow?
+### Is there a way to write attributes in Cosmos DB in the same order as specified in the sink in ADF data flow?
 
-For cosmos DB, the underlying format of each document is a JSON object which is an unordered set of name/value pairs, so the order cannot be reserved.
+For Cosmos DB, the underlying format of each document is a JSON object which is an unordered set of name/value pairs, so the order cannot be reserved.
 
 ### Why a user is unable to use data preview in the data flows?
 
 You should check permissions for custom role. There are multiple actions involved in the dataflow data preview. You start by checking network traffic while debugging on your browser. Please follow all of the actions, for details, please refer to [Resource provider.](../role-based-access-control/resource-provider-operations.md#microsoftdatafactory)
 
 ### In ADF, can I calculate value for a new column from existing column from mapping?
 
-You can use derive transformation in mapping data flow to create a new column on the logic you want. When creating a derived column, you can either generate a new column or update an existing one. In the Column textbox, enter in the column you are creating. To override an existing column in your schema, you can use the column dropdown. To build the derived column's expression, click on the Enter expression textbox. You can either start typing your expression or open up the expression builder to construct your logic.
+You can use derive transformation in mapping data flow to create a new column on the logic you want. When creating a derived column, you can either generate a new column or update an existing one. In the Column textbox, enter in the column you are creating. To override an existing column in your schema, you can use the column dropdown. To build the derived column's expression, select on the Enter expression textbox. You can either start typing your expression or open up the expression builder to construct your logic.
 
 ### Why mapping data flow preview failing with Gateway timeout?

@@ -279,7 +283,7 @@ sections:
 Data factory is available in following [regions.](https://azure.microsoft.com/global-infrastructure/services/?products=data-factory)
 The Power Query feature is available in all data flow regions. If the feature is not available in your region, please check with support.
 
-### What is the difference between mapping data flow and Power query actvity (data wrangling)?
+### What is the difference between mapping data flow and Power query activity (data wrangling)?
 
 Mapping data flows provide a way to transform data at scale without any coding required. You can design a data transformation job in the data flow canvas by constructing a series of transformations. Start with any number of source transformations followed by data transformation steps. Complete your data flow with a sink to land your results in a destination. Mapping data flow is great at mapping and transforming data with both known and unknown schemas in the sinks and sources.
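The `@coalesce` construct mentioned in the FAQ hunk above returns the first non-null value in its argument list. As an analogy only (this is a plain Python sketch, not ADF's expression runtime, and the function name here is illustrative), the same first-non-null semantics look like:

```python
def coalesce(*values):
    """Return the first argument that is not None, mirroring the
    first-non-null semantics of ADF's @coalesce expression (analogy only)."""
    for value in values:
        if value is not None:
            return value
    return None

# coalesce(None, None, "fallback") returns "fallback";
# note that falsy-but-present values such as 0 are kept, not skipped.
```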
(7 changed image files; binary content not shown)
