
Commit 8284d61

Merge pull request #293772 from jonburchel/2025-01-27-cicd-author-update
Author updates for CI/CD
2 parents 38ce8f3 + 7a151d9 commit 8284d61

12 files changed: +55 -59 lines changed

articles/data-factory/ci-cd-pattern-with-airflow.md

Lines changed: 3 additions & 3 deletions
@@ -1,11 +1,11 @@
 ---
 title: CI/CD patterns with Workflow Orchestration Manager
 description: This article talks about recommended deployment patterns with Workflow Orchestration Manager.
-author: nabhishek
-ms.author: abnarain
+author: kromerm
+ms.author: makromer
 ms.reviewer: jburchel
 ms.topic: how-to
-ms.date: 10/17/2023
+ms.date: 01/29/2025
 ---

 # CI/CD patterns with Workflow Orchestration Manager

articles/data-factory/concepts-data-redundancy.md

Lines changed: 7 additions & 7 deletions
@@ -1,23 +1,23 @@
 ---
 title: Data redundancy in Azure Data Factory | Microsoft Docs
 description: 'Learn about meta-data redundancy mechanisms in Azure Data Factory'
-author: nabhishek
+author: kromerm
+ms.author: makromer
 ms.topic: conceptual
-ms.date: 10/03/2024
+ms.date: 01/29/2025
 ms.subservice: data-movement
-ms.author: abnarain
 ---

-# **Azure Data Factory data redundancy**
+# Azure Data Factory data redundancy

 Azure Data Factory data includes metadata (pipeline, datasets, linked services, integration runtime, and triggers) and monitoring data (pipeline, trigger, and activity runs).

-In all regions (except Brazil South and Southeast Asia), Azure Data Factory data is stored and replicated in the [paired region](../reliability/cross-region-replication-azure.md#azure-paired-regions) to protect against metadata loss. During regional datacenter failures, Microsoft may initiate a regional failover of your Azure Data Factory instance. In most cases, no action is required on your part. When the Microsoft-managed failover has completed, you'll be able to access your Azure Data Factory in the failover region.
+In all regions (except Brazil South and Southeast Asia), Azure Data Factory data is stored and replicated in the [paired region](../reliability/cross-region-replication-azure.md#azure-paired-regions) to protect against metadata loss. During regional datacenter failures, Microsoft might initiate a regional failover of your Azure Data Factory instance. In most cases, no action is required on your part. When the Microsoft-managed failover has completed, you are able to access your Azure Data Factory in the failover region.

 Due to data residency requirements in Brazil South, and Southeast Asia, Azure Data Factory data is stored on [local region only](../storage/common/storage-redundancy.md#locally-redundant-storage). For Southeast Asia, all the data are stored in Singapore. For Brazil South, all data are stored in Brazil. When the region is lost due to a significant disaster, Microsoft won't be able to recover your Azure Data Factory data.

 > [!NOTE]
-> Microsoft-managed failover does not apply to self-hosted integration runtime (SHIR) since this infrastructure is typically customer-managed. If the SHIR is set up on Azure VM, then the recommendation is to leverage [Azure site recovery](../site-recovery/site-recovery-overview.md) for handling the [Azure VM failover](../site-recovery/azure-to-azure-architecture.md) to another region.
+> Microsoft-managed failover doesn't apply to self-hosted integration runtime (SHIR) since this infrastructure is typically customer-managed. If the SHIR is set up on Azure VM, then the recommendation is to use [Azure Site Recovery](../site-recovery/site-recovery-overview.md) for handling the [Azure VM failover](../site-recovery/azure-to-azure-architecture.md) to another region.
@@ -28,7 +28,7 @@ To ensure you can track and audit the changes made to your metadata, you should
 Learn how to set up [source control in Azure Data Factory](./source-control.md).

 > [!NOTE]
-> In case of a disaster (loss of region), new data factory can be provisioned manually or in an automated fashion. Once the new data factory has been created, you can restore your pipelines, datasets and linked services JSON from the existing Git repository.
+> If there is a disaster (loss of region), new data factory can be provisioned manually or in an automated fashion. Once the new data factory has been created, you can restore your pipelines, datasets, and linked services JSON from the existing Git repository.

articles/data-factory/continuous-integration-delivery-automate-azure-pipelines.md

Lines changed: 3 additions & 4 deletions
@@ -2,12 +2,11 @@
 title: Automate continuous integration
 description: Learn how to automate continuous integration in Azure Data Factory with Azure Pipelines pipelines releases.
 ms.subservice: ci-cd
-author: nabhishek
-ms.author: abnarain
+author: kromerm
+ms.author: makromer
 ms.reviewer: jburchel
 ms.topic: conceptual
-ms.date: 09/25/2024
-ms.custom:
+ms.date: 01/29/2025
 ---

 # Automate continuous integration using Azure Pipelines releases

articles/data-factory/continuous-integration-delivery-hotfix-environment.md

Lines changed: 3 additions & 4 deletions
@@ -2,12 +2,11 @@
 title: Using a hotfix production environment
 description: Learn how to use a hotfix production environment with continuous integration and delivery in Azure Data Factory pipelines.
 ms.subservice: ci-cd
-author: nabhishek
-ms.author: abnarain
+author: kromerm
+ms.author: makromer
 ms.reviewer: jburchel
 ms.topic: conceptual
-ms.date: 10/20/2023
-ms.custom:
+ms.date: 01/29/2025
 ---

 # Using a hotfix production environment

articles/data-factory/continuous-integration-delivery-improvements.md

Lines changed: 3 additions & 3 deletions
@@ -2,11 +2,11 @@
 title: Automated publishing for continuous integration and delivery
 description: Learn how to publish for continuous integration and delivery automatically.
 ms.subservice: ci-cd
-author: nabhishek
-ms.author: abnarain
+author: kromerm
+ms.author: makromer
 ms.reviewer: susabat
 ms.topic: conceptual
-ms.date: 04/09/2024
+ms.date: 01/29/2025
 ---

 # Automated publishing for continuous integration and delivery (CI/CD)

articles/data-factory/continuous-integration-delivery-linked-templates.md

Lines changed: 3 additions & 4 deletions
@@ -2,12 +2,11 @@
 title: Using linked Resource Manager templates
 description: Learn how to use linked Resource Manager templates with continuous integration and delivery in Azure Data Factory pipelines.
 ms.subservice: ci-cd
-author: nabhishek
-ms.author: abnarain
+author: kromerm
+ms.author: makromer
 ms.reviewer: jburchel
 ms.topic: conceptual
-ms.date: 10/20/2023
-ms.custom:
+ms.date: 01/29/2025
 ---

 # Linked Resource Manager templates with CI/CD

articles/data-factory/continuous-integration-delivery-manual-promotion.md

Lines changed: 3 additions & 4 deletions
@@ -2,12 +2,11 @@
 title: Manual promotion of Resource Manager templates
 description: Learn how to manually promote a Resource Manager template to multiple environments with continuous integration and delivery in Azure Data Factory.
 ms.subservice: ci-cd
-author: nabhishek
-ms.author: abnarain
+author: kromerm
+ms.author: makromer
 ms.reviewer: jburchel
 ms.topic: conceptual
-ms.date: 05/15/2024
-ms.custom:
+ms.date: 01/29/2025
 ---

 # Manually promote a Resource Manager template to each environment

articles/data-factory/continuous-integration-delivery-resource-manager-custom-parameters.md

Lines changed: 14 additions & 14 deletions
@@ -2,11 +2,11 @@
 title: Custom parameters in a Resource Manager template
 description: Learn how to use custom parameters in a Resource Manager template with continuous integration and delivery in Azure Data Factory.
 ms.subservice: ci-cd
-author: nabhishek
-ms.author: abnarain
+author: kromerm
+ms.author: makromer
 ms.reviewer: jburchel
 ms.topic: conceptual
-ms.date: 09/26/2024
+ms.date: 01/29/2025
 ---

 # Use custom parameters with the Resource Manager template
@@ -20,22 +20,22 @@ If your development instance has an associated Git repository, you can override

 To handle custom parameter 256 limit, there are three options:

-* Use the custom parameter file and remove properties that don't need parameterization, i.e., properties that can keep a default value and hence decrease the parameter count.
-* Refactor logic in the dataflow to reduce parameters, for example, pipeline parameters all have the same value, you can just use global parameters instead.
+* Use the custom parameter file and remove properties that don't need parameterization, that is, properties that can keep a default value and hence decrease the parameter count.
+* Refactor logic in the dataflow to reduce parameters, for example, pipeline parameters all have the same value. You can just use global parameters instead.
 * Split one data factory into multiple data factories.

-To override the default Resource Manager parameter configuration, go to the **Manage** hub and select **ARM template** in the "Source control" section. Under **ARM parameter configuration** section, click **Edit** icon in "Edit parameter configuration" to open the Resource Manager parameter configuration code editor.
+To override the default Resource Manager parameter configuration, go to the **Manage** hub and select **ARM template** in the "Source control" section. Under **ARM parameter configuration** section, select **Edit** icon in "Edit parameter configuration" to open the Resource Manager parameter configuration code editor.

 :::image type="content" source="media/author-management-hub/management-hub-custom-parameters.png" alt-text="Manage custom parameters":::

 > [!NOTE]
-> **ARM parameter configuration** is only enabled in "GIT mode". Currently it is disabled in "live mode" or "Data Factory" mode.
+> **ARM parameter configuration** is only enabled in "GIT mode". Currently it's disabled in "live mode" or "Data Factory" mode.

 Creating a custom Resource Manager parameter configuration creates a file named **arm-template-parameters-definition.json** in the root folder of your git branch. You must use that exact file name.

 :::image type="content" source="media/continuous-integration-delivery/custom-parameters.png" alt-text="Custom parameters file":::

-When publishing from the collaboration branch, Data Factory will read this file and use its configuration to generate which properties get parameterized. If no file is found, the default template is used.
+When publishing from the collaboration branch, Data Factory reads this file and use its configuration to generate which properties get parameterized. If no file is found, the default template is used.

 When exporting a Resource Manager template, Data Factory reads this file from whichever branch you're currently working on, not the collaboration branch. You can create or edit the file from a private branch, where you can test your changes by selecting **Export ARM Template** in the UI. You can then merge the file into the collaboration branch.

@@ -61,7 +61,7 @@ The following are some guidelines to follow when you create the custom parameter

 ## Sample parameterization template

-Here's an example of what an Resource Manager parameter configuration might look like. It contains examples of a number of possible usages, including parameterization of nested activities within a pipeline and changing the defaultValue of a linked service parameter.
+Here's an example of what a Resource Manager parameter configuration might look like. It contains examples of many possible usages, including parameterization of nested activities within a pipeline and changing the defaultValue of a linked service parameter.
@@ -156,7 +156,7 @@ Here's an explanation of how the preceding template is constructed, broken down

 ### Pipelines

-* Any property in the path `activities/typeProperties/waitTimeInSeconds` is parameterized. Any activity in a pipeline that has a code-level property named `waitTimeInSeconds` (for example, the `Wait` activity) is parameterized as a number, with a default name. But it won't have a default value in the Resource Manager template. It will be a mandatory input during the Resource Manager deployment.
+* Any property in the path `activities/typeProperties/waitTimeInSeconds` is parameterized. Any activity in a pipeline that has a code-level property named `waitTimeInSeconds` (for example, the `Wait` activity) is parameterized as a number, with a default name. But it won't have a default value in the Resource Manager template. It is a mandatory input during the Resource Manager deployment.
 * Similarly, a property called `headers` (for example, in a `Web` activity) is parameterized with type `object` (JObject). It has a default value, which is the same value as that of the source factory.

 ### IntegrationRuntimes
@@ -170,16 +170,16 @@ Here's an explanation of how the preceding template is constructed, broken down

 ### LinkedServices

-* Linked services are unique. Because linked services and datasets have a wide range of types, you can provide type-specific customization. In this example, for all linked services of type `AzureDataLakeStore`, a specific template will be applied. For all others (via `*`), a different template will be applied.
-* The `connectionString` property will be parameterized as a `securestring` value. It won't have a default value. It will have a shortened parameter name that's suffixed with `connectionString`.
+* Linked services are unique. Because linked services and datasets have a wide range of types, you can provide type-specific customization. In this example, for all linked services of type `AzureDataLakeStore`, a specific template is applied. For all others (via `*`), a different template is applied.
+* The `connectionString` property is parameterized as a `securestring` value. It won't have a default value. It has a shortened parameter name that's suffixed with `connectionString`.
 * The property `secretAccessKey` happens to be an `AzureKeyVaultSecret` (for example, in an Amazon S3 linked service). It's automatically parameterized as an Azure Key Vault secret and fetched from the configured key vault. You can also parameterize the key vault itself.

 ### Datasets

 * Although type-specific customization is available for datasets, you can provide configuration without explicitly having a \*-level configuration. In the preceding example, all dataset properties under `typeProperties` are parameterized.

 > [!NOTE]
-> If **Azure alerts and matrices** are configured for a pipeline, they are not currently supported as parameters for ARM deployments. To reapply the alerts and matrices in new environment, please follow [Data Factory Monitoring, Alerts and Matrices.](./monitor-metrics-alerts.md)
+> If **Azure alerts and matrices** are configured for a pipeline, they aren't currently supported as parameters for ARM template deployments. To reapply the alerts and matrices in new environment, follow [Data Factory Monitoring, Alerts, and Matrices.](./monitor-metrics-alerts.md)
 >

 ## Default parameterization template
@@ -331,7 +331,7 @@ Below is the current default parameterization template. If you need to add only

 ## Example: Parameterizing an existing Azure Databricks interactive cluster ID

-The following example shows how to add a single value to the default parameterization template. We only want to add an existing Azure Databricks interactive cluster ID for a Databricks linked service to the parameters file. Note that this file is the same as the previous file except for the addition of `existingClusterId` under the properties field of `Microsoft.DataFactory/factories/linkedServices`.
+The following example shows how to add a single value to the default parameterization template. We only want to add an existing Azure Databricks interactive cluster ID for a Databricks linked service to the parameters file. This file is the same as the previous file except for the addition of `existingClusterId` under the properties field of `Microsoft.DataFactory/factories/linkedServices`.
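For readers skimming this diff, here is a minimal sketch of the kind of entry the changed paragraph above describes: an `arm-template-parameters-definition.json` fragment that adds `existingClusterId` under the linked-service properties. It assumes the article's `action:name:type` value syntax (where `-` marks a property for parameterization without keeping its current value as the default) and omits the rest of the default template shown in the source file:

```json
{
    "Microsoft.DataFactory/factories/linkedServices": {
        "*": {
            "properties": {
                "typeProperties": {
                    "existingClusterId": "-"
                }
            }
        }
    }
}
```

When a file like this is committed under that exact name to the root of the git branch, the publish step reads it in place of the default parameterization template, as the article's earlier hunks describe.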

articles/data-factory/continuous-integration-delivery-sample-script.md

Lines changed: 3 additions & 3 deletions
@@ -2,11 +2,11 @@
 title: Continuous integration and delivery pre- and post-deployment scripts
 description: Learn how to use a pre- and post-deployment script with continuous integration and delivery in Azure Data Factory from this sample.
 ms.subservice: ci-cd
-author: nabhishek
-ms.author: abnarain
+author: kromerm
+ms.author: makromer
 ms.reviewer: jburchel
 ms.topic: conceptual
-ms.date: 09/26/2024
+ms.date: 01/29/2025
 ms.custom: devx-track-azurepowershell
 ---

articles/data-factory/continuous-integration-delivery.md

Lines changed: 3 additions & 3 deletions
@@ -2,11 +2,11 @@
 title: Continuous integration and delivery
 description: Learn how to use continuous integration and delivery to move Azure Data Factory pipelines from one environment (development, test, production) to another.
 ms.subservice: ci-cd
-author: nabhishek
-ms.author: abnarain
+author: kromerm
+ms.author: makromer
 ms.reviewer: jburchel
 ms.topic: conceptual
-ms.date: 09/25/2024
+ms.date: 01/29/2025
 ms.custom:
 ---
