Commit c7cb57b

Merge pull request #300993 from whhender/epd-freshness-june-2025
EPD Freshness June 2025

2 parents c0de150 + e3166fe

File tree: 4 files changed (+36 −34 lines)
articles/data-factory/deploy-linked-arm-templates-with-vsts.md

Lines changed: 16 additions & 15 deletions
@@ -1,65 +1,65 @@
 ---
-title: Deploy linked ARM templates with VSTS
+title: Deploy linked ARM templates with Azure DevOps Services
 titleSuffix: Azure Data Factory & Azure Synapse
 description: Learn how to deploy linked ARM templates with Azure DevOps Services (formerly Visual Studio Team Services, or VSTS).
 author: whhender
 ms.custom: synapse
 ms.topic: how-to
-ms.date: 05/15/2024
+ms.date: 06/06/2025
 ms.author: whhender
 ms.subservice: authoring
 ---
-# Deploy linked ARM templates with VSTS
+# Deploy linked ARM templates with Azure DevOps Services
 
 [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
 
 This article describes how to deploy linked Azure Resource Manager (ARM) templates with Azure DevOps Services (formerly Visual Studio Team Services, or VSTS).
 
 ## Overview
 
-When dealing with deploying many components in Azure, a single ARM template might be challenging to manage and maintain. ARM linked templates allow you to make your deployment more modular and makes the templates easier to manage. When dealing with large deployments, it's highly recommended to consider breaking down your deployment into a main template and multiple linked templates representing different components of your deployment.
+When you're deploying many components in Azure, a single ARM template might be challenging to manage and maintain. Linked ARM templates make your deployment more modular and the templates easier to manage. For large deployments, it's highly recommended to break down your deployment into a main template and multiple linked templates representing different components of your deployment.
 
-Deploying ARM templates can be performed using several different methods such as using PowerShell, Azure CLI, and Azure portal. A recommended approach however is to adopt one of DevOps practices, namely continuous deployment. VSTS is an application lifecycle management tool hosted in the cloud and offered as a service. One of the capabilities VSTS offers is release management.
+You can deploy ARM templates in several ways, such as with PowerShell, the Azure CLI, or the Azure portal. A recommended approach, however, is to adopt the DevOps practice of continuous deployment. Azure DevOps Services is an application lifecycle management tool hosted in the cloud and offered as a service. One of the capabilities Azure DevOps Services offers is release management.
 
-This article describes how you can deploy linked ARM templates using the release management feature of VSTS. In order for the linked templates to be deployed properly, they need to be stored in a location that can be reached by the Azure Resource Manager, such as Azure Storage; so we show how Azure Storage can be used to stage the ARM template files. We will also show some recommended practices around keeping secrets protected using Azure Key Vault.
+This article describes how you can deploy linked ARM templates using the release management feature of Azure DevOps Services. For the linked templates to deploy properly, they need to be stored in a location that Azure Resource Manager can reach, such as Azure Storage, so we show how Azure Storage can be used to stage the ARM template files. We'll also show some recommended practices for protecting secrets using Azure Key Vault.
 
-The scenario we walk through here's to deploy VNet with a Network Security Group (NSG) structured as linked templates. We use VSTS to show how continuous deployment can be set up to enable teams to continuously update Azure with new changes each time there's a modification to the template.
+This scenario deploys a virtual network with a Network Security Group (NSG) structured as linked templates. We use Azure DevOps Services to show how continuous deployment can be set up so teams can continuously update Azure each time there's a modification to the template.
 
 ## Create an Azure Storage account
 
 1. Sign in to the Azure portal and create an Azure Storage account following the steps documented [here](../storage/common/storage-account-create.md?tabs=azure-portal).
-1. Once deployment is complete, navigate to the storage account and select **Shared access signature**. Select Service, Container, and Object for the **Allowed resource types**. Then select **Generate SAS and connection string**. Copy the SAS token and keep it available since we use it later.
+1. Once deployment is complete, navigate to the storage account and select **Shared access signature**. Select Service, Container, and Object for the **Allowed resource types**. Then select **Generate SAS and connection string**. Copy the SAS token and keep it available, since we use it later.
 
 :::image type="content" source="media\deploy-linked-arm-templates-with-vsts\storage-account-generate-sas-token.png" alt-text="Shows an Azure Storage Account in the Azure portal with Shared access signature selected." lightbox="media\deploy-linked-arm-templates-with-vsts\storage-account-generate-sas-token.png":::
 
 1. Select the storage account Containers page and create a new Container.
 1. Select the new Container properties.
 
-:::image type="content" source="media\deploy-linked-arm-templates-with-vsts\container-properties.png" alt-text="Shows an Azure Storage Account in the Azure portal with Containers selected. There's a container with its Container properties menu selected.":::
+:::image type="content" source="media\deploy-linked-arm-templates-with-vsts\container-properties.png" alt-text="Screenshot shows an Azure Storage Account in the Azure portal with Containers selected. There's a container with its Container properties menu selected.":::
 
-1. Copy the URL field and keep it handy. We need it later along with the SAS token from the earlier step.
+1. Copy the URL field and keep it handy. We need it later, along with the SAS token from the earlier step.
 
 ## Protect secrets with Azure Key Vault
 
-1. In the Azure portal, create an Azure Key Vault resource.
-1. Select the Azure Key Vault you created in the earlier step and then select Secrets.
+1. In the Azure portal, create an [Azure Key Vault](/azure/key-vault/general/quick-create-portal) resource.
+1. Select the created Azure Key Vault and then select **Secrets**.
 1. Select Generate/Import to add the SAS Token.
 1. For the Name property, enter `StorageSASToken` and then provide the Azure Storage shared access signature key you copied in a previous step for the Value.
 1. Select Create.
 
-## Link Azure Key Vault to VSTS
+## Link Azure Key Vault to Azure DevOps Services
 
 1. Sign in to your Azure DevOps organization and navigate to your project.
 1. Go to **Library** under **Pipelines** in the navigation pane.
 
-:::image type="content" source="media\deploy-linked-arm-templates-with-vsts\vsts-libraries.png" alt-text="Shows the navigation pane in VSTS with Pipelines selected and the Library option highlighted.":::
+:::image type="content" source="media\deploy-linked-arm-templates-with-vsts\vsts-libraries.png" alt-text="Screenshot shows the navigation pane in Azure DevOps Services with Pipelines selected and the Library option highlighted.":::
 
 1. Under **Variable group**, create a new group and for **Variable group name** enter `AzureKeyVaultSecrets`.
 1. Toggle **Link secrets from an Azure key vault as variables**.
 1. Select your Azure subscription and then the Azure Key Vault you created earlier, and then select Authorize.
 1. Once authorization is successful, you can add variables by selecting **Add**, which presents the option to add references to the secrets in the Azure Key Vault. Add a reference to the `StorageSASToken` created in the earlier step, and save it.
 
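The variable group steps above make the Key Vault secret available to pipelines. As an illustrative sketch (not part of this commit), a YAML pipeline could consume that group and hand the SAS token to an ARM deployment task; the service connection name, resource group, and template path below are hypothetical placeholders:

```yaml
# Sketch: consume the Key Vault-backed variable group in a YAML pipeline.
variables:
- group: AzureKeyVaultSecrets       # exposes the StorageSASToken secret as $(StorageSASToken)

steps:
- task: AzureResourceManagerTemplateDeployment@3
  inputs:
    deploymentScope: 'Resource Group'
    azureResourceManagerConnection: 'my-service-connection'          # hypothetical service connection
    resourceGroupName: 'linked-templates-rg'                         # hypothetical resource group
    location: 'East US'
    templateLocation: 'Linked artifact'
    csmFile: '$(Build.SourcesDirectory)/templates/azuredeploy.json'  # hypothetical main template path
    overrideParameters: '-containerSasToken "$(StorageSASToken)"'
```

The article itself walks through a classic release pipeline; the YAML form is shown only because the variable group is consumed identically in both.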
-## Setup continuous deployment using VSTS
+## Set up continuous deployment using Azure DevOps Services
 
 1. Follow steps listed in the article [Automate continuous integration using Azure Pipelines releases](continuous-integration-delivery-automate-azure-pipelines.md#set-up-an-azure-pipelines-release).
 1. A few changes are required from the above steps in order to use a linked ARM template deployment:

@@ -77,4 +77,5 @@ The scenario we walk through here's to deploy VNet with a Network Security Grou
 1. Save the release pipeline and trigger a release.
 
 ## Related content
+
 - [Automate continuous integration using Azure Pipelines releases](continuous-integration-delivery-automate-azure-pipelines.md)
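For orientation on the technique this article covers: linked templates staged in the storage container are referenced from a main template, with the SAS token appended to each template URI. A minimal main template might look like the following sketch, where `containerUri`, `containerSasToken`, and `linkedVNet.json` are illustrative names, not from the commit:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "containerUri": { "type": "string" },
    "containerSasToken": { "type": "securestring" }
  },
  "resources": [
    {
      "type": "Microsoft.Resources/deployments",
      "apiVersion": "2021-04-01",
      "name": "linkedVNetDeployment",
      "properties": {
        "mode": "Incremental",
        "templateLink": {
          "uri": "[concat(parameters('containerUri'), '/linkedVNet.json', parameters('containerSasToken'))]"
        }
      }
    }
  ]
}
```

Passing the SAS token as a `securestring` parameter keeps it out of deployment logs, which is why the article stores it in Key Vault rather than in the pipeline definition.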

articles/data-factory/quickstart-create-data-factory-azure-cli.md

Lines changed: 1 addition & 1 deletion

@@ -6,7 +6,7 @@ ms.author: whhender
 ms.reviewer: jianleishen
 ms.subservice: data-movement
 ms.topic: quickstart
-ms.date: 05/15/2024
+ms.date: 06/06/2025
 ms.custom: template-quickstart, devx-track-azurecli, mode-api
 ---

articles/data-factory/quickstart-create-data-factory-bicep.md

Lines changed: 2 additions & 2 deletions

@@ -7,7 +7,7 @@ ms.subservice: data-movement
 ms.author: whhender
 ms.topic: quickstart
 ms.custom: subject-armqs, mode-arm, devx-track-bicep
-ms.date: 05/15/2024
+ms.date: 06/06/2025
 ---
 
 # Quickstart: Create an Azure Data Factory using Bicep

@@ -19,7 +19,7 @@ This quickstart describes how to use Bicep to create an Azure data factory. The
 [!INCLUDE [About Bicep](~/reusable-content/ce-skilling/azure/includes/resource-manager-quickstart-bicep-introduction.md)]
 
 > [!NOTE]
-> This article does not provide a detailed introduction of the Data Factory service. For an introduction to the Azure Data Factory service, see [Introduction to Azure Data Factory](introduction.md).
+> This article doesn't provide a detailed introduction of the Data Factory service. For an introduction to the Azure Data Factory service, see [Introduction to Azure Data Factory](introduction.md).
 
 ## Prerequisites

articles/data-factory/quickstart-create-data-factory-powershell.md

Lines changed: 17 additions & 16 deletions
@@ -5,20 +5,20 @@ author: whhender
 ms.subservice: data-movement
 ms.devlang: powershell
 ms.topic: quickstart
-ms.date: 05/15/2024
+ms.date: 06/06/2025
 ms.author: whhender
 ms.reviewer: jianleishen
 ms.custom: devx-track-azurepowershell, mode-api
 ---
 # Quickstart: Create an Azure Data Factory using PowerShell
 
-
 [!INCLUDE[appliesto-adf-xxx-md](includes/appliesto-adf-xxx-md.md)]
 
 This quickstart describes how to use PowerShell to create an Azure Data Factory. The pipeline you create in this data factory **copies** data from one folder to another folder in an Azure blob storage. For a tutorial on how to **transform** data using Azure Data Factory, see [Tutorial: Transform data using Spark](transform-data-using-spark.md).
 
 > [!NOTE]
-> This article does not provide a detailed introduction of the Data Factory service. For an introduction to the Azure Data Factory service, see [Introduction to Azure Data Factory](introduction.md).
+> This article doesn't provide a detailed introduction of the Data Factory service. For an introduction to the Azure Data Factory service, see [Introduction to Azure Data Factory](introduction.md).
+
 
 [!INCLUDE [data-factory-quickstart-prerequisites](includes/data-factory-quickstart-prerequisites.md)]
 
@@ -29,9 +29,9 @@ This quickstart describes how to use PowerShell to create an Azure Data Factory.
 Install the latest Azure PowerShell modules by following instructions in [How to install and configure Azure PowerShell](/powershell/azure/install-azure-powershell).
 
 >[!WARNING]
->If you do not use latest versions of PowerShell and Data Factory module, you may run into deserialization errors while running the commands.
+>If you don't use the latest versions of PowerShell and the Data Factory module, you could run into deserialization errors while running the commands.
 
-#### Log in to PowerShell
+#### Sign in to PowerShell
 
 1. Launch **PowerShell** on your machine. Keep PowerShell open until the end of this quickstart. If you close and reopen, you need to run these commands again.
 
@@ -61,15 +61,15 @@ Install the latest Azure PowerShell modules by following instructions in [How to
 $resourceGroupName = "ADFQuickStartRG";
 ```
 
-If the resource group already exists, you may not want to overwrite it. Assign a different value to the `$ResourceGroupName` variable and run the command again
+If the resource group already exists, you might not want to overwrite it. Assign a different value to the `$ResourceGroupName` variable and run the command again.
 
 2. To create the Azure resource group, run the following command:
 
 ```powershell
 $ResGrp = New-AzResourceGroup $resourceGroupName -location 'East US'
 ```
 
-If the resource group already exists, you may not want to overwrite it. Assign a different value to the `$ResourceGroupName` variable and run the command again.
+If the resource group already exists, you might not want to overwrite it. Assign a different value to the `$ResourceGroupName` variable and run the command again.
 
 3. Define a variable for the data factory name.
 
@@ -95,7 +95,7 @@ Note the following points:
 The specified Data Factory name 'ADFv2QuickStartDataFactory' is already in use. Data Factory names must be globally unique.
 ```
 
-* To create Data Factory instances, the user account you use to log in to Azure must be a member of **contributor** or **owner** roles, or an **administrator** of the Azure subscription.
+* To create Data Factory instances, the user account you use to sign in to Azure must be a member of **contributor** or **owner** roles, or an **administrator** of the Azure subscription.
 
 * For a list of Azure regions in which Data Factory is currently available, select the regions that interest you on the following page, and then expand **Analytics** to locate **Data Factory**: [Products available by region](https://azure.microsoft.com/global-infrastructure/services/). The data stores (Azure Storage, Azure SQL Database, etc.) and computes (HDInsight, etc.) used by data factory can be in other regions.
 
@@ -105,10 +105,10 @@ Note the following points:
 Create linked services in a data factory to link your data stores and compute services to the data factory. In this quickstart, you create an Azure Storage linked service that is used as both the source and sink stores. The linked service has the connection information that the Data Factory service uses at runtime to connect to it.
 
 >[!TIP]
->In this quickstart, you use *Account key* as the authentication type for your data store, but you can choose other supported authentication methods: *SAS URI*,*Service Principal* and *Managed Identity* if needed. Refer to corresponding sections in [this article](./connector-azure-blob-storage.md#linked-service-properties) for details.
+>In this quickstart, you use *Account key* as the authentication type for your data store, but you can choose other supported authentication methods: *SAS URI*, *Service Principal*, and *Managed Identity* if needed. Refer to corresponding sections in [this article](./connector-azure-blob-storage.md#linked-service-properties) for details.
 >To store secrets for data stores securely, it's also recommended to use an Azure Key Vault. Refer to [this article](./store-credentials-in-key-vault.md) for detailed illustrations.
-1. Create a JSON file named **AzureStorageLinkedService.json** in **C:\ADFv2QuickStartPSH** folder with the following content: (Create the folder ADFv2QuickStartPSH if it does not already exist.).
+1. Create a JSON file named **AzureStorageLinkedService.json** in the **C:\ADFv2QuickStartPSH** folder with the following content: (Create the folder ADFv2QuickStartPSH if it doesn't already exist.)
 
 > [!IMPORTANT]
 > Replace <accountName> and <accountKey> with name and key of your Azure storage account before saving the file.
@@ -126,7 +126,7 @@ Create linked services in a data factory to link your data stores and compute se
 }
 ```
 
-If you are using Notepad, select **All files** for the **Save as type** filed in the **Save as** dialog box. Otherwise, it may add `.txt` extension to the file. For example, `AzureStorageLinkedService.json.txt`. If you create the file in File Explorer before opening it in Notepad, you may not see the `.txt` extension since the **Hide extensions for known files types** option is set by default. Remove the `.txt` extension before proceeding to the next step.
+If you're using Notepad, select **All files** for the **Save as type** field in the **Save as** dialog box. Otherwise, it might add a `.txt` extension to the file. For example, `AzureStorageLinkedService.json.txt`. If you create the file in File Explorer before opening it in Notepad, you might not see the `.txt` extension since the **Hide extensions for known file types** option is set by default. Remove the `.txt` extension before proceeding to the next step.
 
 2. In **PowerShell**, switch to the **ADFv2QuickStartPSH** folder.
 
@@ -142,7 +142,7 @@ Create linked services in a data factory to link your data stores and compute se
 -DefinitionFile ".\AzureStorageLinkedService.json"
 ```
 
-Here is the sample output:
+Here's the sample output:
 
 ```console
 LinkedServiceName : AzureStorageLinkedService
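The hunk above shows only the tail of **AzureStorageLinkedService.json**. For context, the full definition in the quickstart follows this general shape (a sketch, not verbatim from the commit; replace the placeholders with your storage account's name and key):

```json
{
  "name": "AzureStorageLinkedService",
  "properties": {
    "annotations": [],
    "type": "AzureBlobStorage",
    "typeProperties": {
      "connectionString": "DefaultEndpointsProtocol=https;AccountName=<accountName>;AccountKey=<accountKey>;EndpointSuffix=core.windows.net"
    }
  }
}
```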
@@ -155,7 +155,8 @@ Create linked services in a data factory to link your data stores and compute se
 
 In this procedure, you create two datasets: **InputDataset** and **OutputDataset**. These datasets are of type **Binary**. They refer to the Azure Storage linked service that you created in the previous section.
 The input dataset represents the source data in the input folder. In the input dataset definition, you specify the blob container (**adftutorial**), the folder (**input**), and the file (**emp.txt**) that contain the source data.
-The output dataset represents the data that's copied to the destination. In the output dataset definition, you specify the blob container (**adftutorial**), the folder (**output**), and the file to which the data is copied.
+The output dataset represents the data that's copied to the destination. In the output dataset definition, you specify the blob container (**adftutorial**), the folder (**output**), and the file to which the data is copied.
+
 1. Create a JSON file named **InputDataset.json** in the **C:\ADFv2QuickStartPSH** folder, with the following content:
 
 ```json
@@ -188,7 +189,7 @@ The output dataset represents the data that's copied to the destination. In the
 -DefinitionFile ".\InputDataset.json"
 ```
 
-Here is the sample output:
+Here's the sample output:
 
 ```console
 DatasetName : InputDataset
@@ -229,7 +230,7 @@ The output dataset represents the data that's copied to the destination. In the
 -DefinitionFile ".\OutputDataset.json"
 ```
 
-Here is the sample output:
+Here's the sample output:
 
 ```console
 DatasetName : OutputDataset
@@ -343,7 +344,7 @@ $RunId = Invoke-AzDataFactoryV2Pipeline `
 }
 ```
 
-Here is the sample output of pipeline run:
+Here's the sample output of pipeline run:
 
 ```console
 Pipeline is running...status: InProgress
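For context, the pipeline whose run produces the output above is a single Copy activity between the two Binary datasets. Its definition follows this general shape (a sketch, not verbatim from the commit; the pipeline and activity names are as used in the quickstart):

```json
{
  "name": "Adfv2QuickStartPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopyFromBlobToBlob",
        "type": "Copy",
        "inputs": [ { "referenceName": "InputDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "OutputDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "BinarySource" },
          "sink": { "type": "BinarySink" }
        }
      }
    ]
  }
}
```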
