@@ -1,4 +1,4 @@
 ---
 title: Continuous integration and delivery in Azure Data Factory
 description: Learn how to use continuous integration and delivery to move Data Factory pipelines from one environment (development, test, production) to another.
 services: data-factory
|
@@ -55,7 +55,7 @@ Below is a sample overview of the CI/CD lifecycle in an Azure data factory that'
 
 ![Edit parameters](media/continuous-integration-deployment/continuous-integration-image5.png)
 
-1. Select **Load file**, and then select the generated Resource Manager template.
+1. Select **Load file**, and then select the generated Resource Manager template. This is the **arm_template.json** file located in the .zip file exported in step 1.
 
 ![Edit parameters](media/continuous-integration-deployment/continuous-integration-image5.png)
 
|
@@ -166,17 +166,17 @@ There are two ways to handle secrets:
 
 The parameters file needs to be in the publish branch as well.
 
-- Add an [Azure Key Vault task](https://docs.microsoft.com/azure/devops/pipelines/tasks/deploy/azure-key-vault) before the Azure Resource Manager Deployment task described in the previous section:
+1. Add an [Azure Key Vault task](https://docs.microsoft.com/azure/devops/pipelines/tasks/deploy/azure-key-vault) before the Azure Resource Manager Deployment task described in the previous section:
 
 1. On the **Tasks** tab, create a new task. Search for **Azure Key Vault** and add it.
 
 1. In the Key Vault task, select the subscription in which you created the key vault. Provide credentials if necessary, and then select the key vault.
 
 ![Add a Key Vault task](media/continuous-integration-deployment/continuous-integration-image8.png)
 
-    #### Grant permissions to the Azure Pipelines agent
+#### Grant permissions to the Azure Pipelines agent
 
-    The Azure Key Vault task might fail with an Access Denied error if the correct permissions aren't set. Download the logs for the release, and locate the .ps1 file that contains the command to give permissions to the Azure Pipelines agent. You can run the command directly. Or you can copy the principal ID from the file and add the access policy manually in the Azure portal. `Get` and `List` are the minimum permissions required.
+The Azure Key Vault task might fail with an Access Denied error if the correct permissions aren't set. Download the logs for the release, and locate the .ps1 file that contains the command to give permissions to the Azure Pipelines agent. You can run the command directly. Or you can copy the principal ID from the file and add the access policy manually in the Azure portal. `Get` and `List` are the minimum permissions required.
 
 ### Update active triggers
 
@@ -466,7 +466,10 @@ If you're in GIT mode, you can override the default properties in your Resource
 * You use automated CI/CD and you want to change some properties during Resource Manager deployment, but the properties aren't parameterized by default.
 * Your factory is so large that the default Resource Manager template is invalid because it has more than the maximum allowed parameters (256).
 
-Under these conditions, to override the default parameterization template, create a file named arm-template-parameters-definition.json in the folder specified as the root folder for the data factory git integration. You must use that exact file name. Data Factory reads this file from whichever branch you're currently on in the Azure Data Factory portal, not just from the collaboration branch. You can create or edit the file from a private branch, where you can test your changes by selecting **Export ARM Template** in the UI. You can then merge the file into the collaboration branch. If no file is found, the default template is used.
+Under these conditions, to override the default parameterization template, create a file named **arm-template-parameters-definition.json** in the folder specified as the root folder for the data factory git integration. You must use that exact file name. Data Factory reads this file from whichever branch you're currently on in the Azure Data Factory portal, not just from the collaboration branch. You can create or edit the file from a private branch, where you can test your changes by selecting **Export ARM Template** in the UI. You can then merge the file into the collaboration branch. If no file is found, the default template is used.
+
+> [!NOTE]
+> A custom parameterization template doesn't change the ARM template parameter limit of 256. It lets you choose and decrease the number of parameterized properties.
 
 ### Syntax of a custom parameters file
 
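To make the override concrete, here is a minimal sketch of what an **arm-template-parameters-definition.json** might look like. This is illustrative only: it assumes the `"="` syntax and the `"*"` wildcard shown in the default parameterization template later in the article, and parameterizes just one property (a linked service's connection string) to keep the parameter count low:

```json
{
    "Microsoft.DataFactory/factories/linkedServices": {
        "*": {
            "properties": {
                "typeProperties": {
                    "connectionString": "="
                }
            }
        }
    }
}
```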
|
@@ -652,7 +655,7 @@ Following is the current default parameterization template. If you need to add o
             "database": "=",
             "serviceEndpoint": "=",
             "batchUri": "=",
-        "poolName": "=",
+            "poolName": "=",
             "databaseName": "=",
             "systemNumber": "=",
             "server": "=",