This article describes how to deploy linked Azure Resource Manager (ARM) templates with Azure DevOps Services (formerly Visual Studio Team Services, or VSTS).
## Overview
When you're deploying many components in Azure, a single ARM template can be challenging to manage and maintain. Linked ARM templates make your deployment more modular and the individual templates easier to manage. For large deployments, consider breaking your deployment into a main template and multiple linked templates that represent the different components of your deployment.
You can deploy ARM templates in several ways, such as with PowerShell, the Azure CLI, or the Azure portal. A recommended approach, however, is to adopt the DevOps practice of continuous deployment. Azure DevOps Services is an application lifecycle management tool hosted in the cloud and offered as a service. One of the capabilities Azure DevOps Services offers is release management.
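For reference, a manual template deployment from Azure PowerShell looks roughly like the following sketch. The resource group name and template URL are placeholders, not values from this article:

```powershell
# Sketch only: deploy an ARM template stored at a URL that Azure Resource Manager can reach.
# Replace the resource group name and template URI with your own values.
New-AzResourceGroupDeployment `
    -ResourceGroupName "my-resource-group" `
    -TemplateUri "https://mystorageaccount.blob.core.windows.net/templates/mainTemplate.json?<SAS-token>"
```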
This article describes how to deploy linked ARM templates by using the release management feature of Azure DevOps Services. For the linked templates to be deployed properly, they need to be stored in a location that Azure Resource Manager can reach, such as Azure Storage, so we show how to use Azure Storage to stage the ARM template files. We'll also show some recommended practices for keeping secrets protected by using Azure Key Vault.
This scenario deploys a virtual network with a network security group (NSG) structured as linked templates. We use Azure DevOps Services to show how to set up continuous deployment so that teams can continuously update Azure with new changes each time the template is modified.
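To illustrate the structure, a minimal main template that calls a linked template staged in Azure Storage might look like the following sketch. The storage account, container, and file names are hypothetical, and the sketch assumes the SAS token parameter includes its leading `?`:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "storageSasToken": { "type": "securestring" }
  },
  "resources": [
    {
      "type": "Microsoft.Resources/deployments",
      "apiVersion": "2021-04-01",
      "name": "linkedNsgDeployment",
      "properties": {
        "mode": "Incremental",
        "templateLink": {
          "uri": "[concat('https://mystorage.blob.core.windows.net/templates/nsg.json', parameters('storageSasToken'))]"
        }
      }
    }
  ]
}
```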
## Create an Azure Storage account
1. Sign in to the Azure portal and create an Azure Storage account following the steps documented [here](../storage/common/storage-account-create.md?tabs=azure-portal).
1. Once deployment is complete, navigate to the storage account and select **Shared access signature**. Select Service, Container, and Object for the **Allowed resource types**. Then select **Generate SAS and connection string**. Copy the SAS token and keep it available since we use it later.
:::image type="content" source="media\deploy-linked-arm-templates-with-vsts\storage-account-generate-sas-token.png" alt-text="Screenshot shows an Azure Storage account in the Azure portal with Shared access signature selected." lightbox="media\deploy-linked-arm-templates-with-vsts\storage-account-generate-sas-token.png":::
1. Go to the storage account's **Containers** page and create a new container.
1. Select the new container's properties.
:::image type="content" source="media\deploy-linked-arm-templates-with-vsts\container-properties.png" alt-text="Screenshot shows an Azure Storage Account in the Azure portal with Containers selected. There's a container with its Container properties menu selected.":::
1. Copy the URL field and keep it handy. We need it later along with the SAS token from the earlier step.
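If you prefer scripting the preceding portal steps, a rough Azure PowerShell equivalent follows. The resource group, account, and container names are placeholders, and the expiry window is an assumed example:

```powershell
# Sketch only: create a container and generate an account-level SAS token.
# Replace the placeholder names with your own values.
$account = Get-AzStorageAccount -ResourceGroupName "my-resource-group" -Name "mystorageaccount"
$ctx = $account.Context

New-AzStorageContainer -Name "templates" -Context $ctx

# Allowed resource types: Service, Container, and Object, matching the portal steps.
$sasToken = New-AzStorageAccountSASToken -Service Blob `
    -ResourceType Service,Container,Object `
    -Permission "rl" -ExpiryTime (Get-Date).AddDays(7) -Context $ctx
```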
## Protect secrets with Azure Key Vault
1. In the Azure portal, create an [Azure Key Vault](/azure/key-vault/general/quick-create-portal) resource.

1. Select the created Azure Key Vault and then select **Secrets**.
1. Select **Generate/Import** to add the SAS token.
1. For the Name property, enter `StorageSASToken` and then provide the Azure Storage shared access signature key you copied in a previous step for the Value.
1. Select **Create**.
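Equivalently, you can add the secret from PowerShell; the vault name below is a placeholder:

```powershell
# Sketch only: store the SAS token as a Key Vault secret named StorageSASToken.
# Replace the vault name and token value with your own.
$secretValue = ConvertTo-SecureString "<your-SAS-token>" -AsPlainText -Force
Set-AzKeyVaultSecret -VaultName "my-key-vault" -Name "StorageSASToken" -SecretValue $secretValue
```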
## Link Azure Key Vault to Azure DevOps Services
1. Sign in to your Azure DevOps organization and navigate to your project.
1. Go to **Library** under **Pipelines** in the navigation pane.
:::image type="content" source="media\deploy-linked-arm-templates-with-vsts\vsts-libraries.png" alt-text="Screenshot shows the navigation pane in Azure DevOps Services with Pipelines selected and the Library option highlighted.":::
1. Under **Variable group**, create a new group and for **Variable group name** enter `AzureKeyVaultSecrets`.
1. Toggle **Link secrets from an Azure key vault as variables**.
1. Select your Azure subscription and the Azure Key Vault you created earlier, and then select **Authorize**.
1. Once authorization succeeds, you can add variables by selecting **Add**, which presents the option to add references to the secrets in the Azure Key Vault. Add a reference to the `StorageSASToken` secret created in the earlier step, and save it.
## Set up continuous deployment using Azure DevOps Services
1. Follow steps listed in the article [Automate continuous integration using Azure Pipelines releases](continuous-integration-delivery-automate-azure-pipelines.md#set-up-an-azure-pipelines-release).
1. A few changes to those steps are required in order to use a linked ARM template deployment:
1. Save the release pipeline and trigger a release.
## Related content
- [Automate continuous integration using Azure Pipelines releases](continuous-integration-delivery-automate-azure-pipelines.md)
This quickstart describes how to use PowerShell to create an Azure Data Factory. The pipeline you create in this data factory **copies** data from one folder to another folder in an Azure blob storage. For a tutorial on how to **transform** data using Azure Data Factory, see [Tutorial: Transform data using Spark](transform-data-using-spark.md).
> [!NOTE]
> This article doesn't provide a detailed introduction of the Data Factory service. For an introduction to the Azure Data Factory service, see [Introduction to Azure Data Factory](introduction.md).
Install the latest Azure PowerShell modules by following instructions in [How to install and configure Azure PowerShell](/powershell/azure/install-azure-powershell).
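A minimal install command, per the linked instructions, is:

```powershell
# Install the Az PowerShell module from the PowerShell Gallery.
Install-Module -Name Az -Repository PSGallery -Force
```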
>[!WARNING]
>If you don't use the latest versions of the PowerShell and Data Factory modules, you might run into deserialization errors while running the commands.
#### Sign in to PowerShell
1. Launch **PowerShell** on your machine. Keep PowerShell open until the end of this quickstart. If you close and reopen, you need to run these commands again.
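Before running the commands that follow, sign in to Azure from the same session:

```powershell
# Sign in to Azure; a browser prompt completes the authentication.
Connect-AzAccount
```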
```powershell
$resourceGroupName = "ADFQuickStartRG";
```
If the resource group already exists, you might not want to overwrite it. Assign a different value to the `$ResourceGroupName` variable and run the command again.
2. To create the Azure resource group, run the following command:
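The create command is along these lines (a sketch; the location shown is an assumed example):

```powershell
# Create the resource group; choose a region near you.
New-AzResourceGroup -Name $resourceGroupName -Location "East US"
```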
If the resource group already exists, you might not want to overwrite it. Assign a different value to the `$ResourceGroupName` variable and run the command again.
3. Define a variable for the data factory name.
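The definition is a one-liner; the factory name shown here is a hypothetical example and must be globally unique:

```powershell
# Data factory names must be globally unique; adjust as needed.
$dataFactoryName = "ADFQuickStartFactory";
```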
Note the following points:

* If the data factory name is already in use, you receive an error like the following. Change the name and try again.

  ```console
  The specified Data Factory name 'ADFv2QuickStartDataFactory' is already in use. Data Factory names must be globally unique.
  ```
* To create Data Factory instances, the user account you use to sign in to Azure must be a member of **contributor** or **owner** roles, or an **administrator** of the Azure subscription.
* For a list of Azure regions in which Data Factory is currently available, select the regions that interest you on the following page, and then expand **Analytics** to locate **Data Factory**: [Products available by region](https://azure.microsoft.com/global-infrastructure/services/). The data stores (Azure Storage, Azure SQL Database, etc.) and computes (HDInsight, etc.) used by data factory can be in other regions.
Create linked services in a data factory to link your data stores and compute services to the data factory. In this quickstart, you create an Azure Storage linked service that is used as both the source and sink stores. The linked service has the connection information that the Data Factory service uses at runtime to connect to it.
>[!TIP]
>In this quickstart, you use *Account key* as the authentication type for your data store, but you can choose other supported authentication methods, such as *SAS URI*, *Service Principal*, and *Managed Identity*, if needed. Refer to the corresponding sections in [this article](./connector-azure-blob-storage.md#linked-service-properties) for details.
>To store secrets for data stores securely, it's also recommended to use an Azure Key Vault. Refer to [this article](./store-credentials-in-key-vault.md) for detailed illustrations.
1. Create a JSON file named **AzureStorageLinkedService.json** in the **C:\ADFv2QuickStartPSH** folder with the following content. (Create the folder ADFv2QuickStartPSH if it doesn't already exist.)
> [!IMPORTANT]
> Replace `<accountName>` and `<accountKey>` with the name and key of your Azure storage account before saving the file.
```json
{
    "name": "AzureStorageLinkedService",
    "properties": {
        "annotations": [],
        "type": "AzureBlobStorage",
        "typeProperties": {
            "connectionString": "DefaultEndpointsProtocol=https;AccountName=<accountName>;AccountKey=<accountKey>;EndpointSuffix=core.windows.net"
        }
    }
}
```
If you're using Notepad, select **All files** for the **Save as type** field in the **Save as** dialog box. Otherwise, it might add the `.txt` extension to the file, for example, `AzureStorageLinkedService.json.txt`. If you create the file in File Explorer before opening it in Notepad, you might not see the `.txt` extension since the **Hide extensions for known file types** option is set by default. Remove the `.txt` extension before proceeding to the next step.
2. In **PowerShell**, switch to the **ADFv2QuickStartPSH** folder.
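A minimal command for this step, using the folder named above:

```powershell
Set-Location 'C:\ADFv2QuickStartPSH'
```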
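To create the linked service from the JSON file, run the `Set-AzDataFactoryV2Linked­Service` cmdlet. The sketch below assumes `$DataFactory` and `$ResGrp` variables holding the data factory and resource group objects created earlier in the quickstart:

```powershell
# Sketch: create the linked service in the data factory from its JSON definition.
# Assumes $DataFactory and $ResGrp were captured from the earlier create commands.
Set-AzDataFactoryV2LinkedService -DataFactoryName $DataFactory.DataFactoryName `
    -ResourceGroupName $ResGrp.ResourceGroupName -Name "AzureStorageLinkedService" `
    -DefinitionFile ".\AzureStorageLinkedService.json"
```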
In this procedure, you create two datasets: **InputDataset** and **OutputDataset**. These datasets are of type **Binary**. They refer to the Azure Storage linked service that you created in the previous section.
The input dataset represents the source data in the input folder. In the input dataset definition, you specify the blob container (**adftutorial**), the folder (**input**), and the file (**emp.txt**) that contain the source data.
The output dataset represents the data that's copied to the destination. In the output dataset definition, you specify the blob container (**adftutorial**), the folder (**output**), and the file to which the data is copied.
1. Create a JSON file named **InputDataset.json** in the **C:\ADFv2QuickStartPSH** folder, with the following content:
   ```json
   {
       "name": "InputDataset",
       "properties": {
           "linkedServiceName": {
               "referenceName": "AzureStorageLinkedService",
               "type": "LinkedServiceReference"
           },
           "annotations": [],
           "type": "Binary",
           "typeProperties": {
               "location": {
                   "type": "AzureBlobStorageLocation",
                   "fileName": "emp.txt",
                   "folderPath": "input",
                   "container": "adftutorial"
               }
           }
       }
   }
   ```

2. To create the dataset **InputDataset**, run the `Set-AzDataFactoryV2Dataset` cmdlet:

   ```powershell
   Set-AzDataFactoryV2Dataset -DataFactoryName $DataFactory.DataFactoryName `
       -ResourceGroupName $ResGrp.ResourceGroupName -Name "InputDataset" `
       -DefinitionFile ".\InputDataset.json"
   ```
   Here's the sample output:
```console
   DatasetName : InputDataset
   ```