
Commit 9b13269

committed
Review powershell quickstart
Checked steps, links, acrolinx. Tutorial is thorough and works as written.
1 parent 0fa6caf commit 9b13269

File tree

1 file changed: 17 additions, 16 deletions


articles/data-factory/quickstart-create-data-factory-powershell.md

Lines changed: 17 additions & 16 deletions
@@ -5,20 +5,20 @@ author: whhender
 ms.subservice: data-movement
 ms.devlang: powershell
 ms.topic: quickstart
-ms.date: 05/15/2024
+ms.date: 06/06/2025
 ms.author: whhender
 ms.reviewer: jianleishen
 ms.custom: devx-track-azurepowershell, mode-api
 ---
 # Quickstart: Create an Azure Data Factory using PowerShell

-
 [!INCLUDE[appliesto-adf-xxx-md](includes/appliesto-adf-xxx-md.md)]

 This quickstart describes how to use PowerShell to create an Azure Data Factory. The pipeline you create in this data factory **copies** data from one folder to another folder in an Azure blob storage. For a tutorial on how to **transform** data using Azure Data Factory, see [Tutorial: Transform data using Spark](transform-data-using-spark.md).

 > [!NOTE]
-> This article does not provide a detailed introduction of the Data Factory service. For an introduction to the Azure Data Factory service, see [Introduction to Azure Data Factory](introduction.md).
+> This article doesn't provide a detailed introduction to the Data Factory service. For an introduction to the Azure Data Factory service, see [Introduction to Azure Data Factory](introduction.md).
+

 [!INCLUDE [data-factory-quickstart-prerequisites](includes/data-factory-quickstart-prerequisites.md)]

@@ -29,9 +29,9 @@ This quickstart describes how to use PowerShell to create an Azure Data Factory.
 Install the latest Azure PowerShell modules by following instructions in [How to install and configure Azure PowerShell](/powershell/azure/install-azure-powershell).

 >[!WARNING]
->If you do not use latest versions of PowerShell and Data Factory module, you may run into deserialization errors while running the commands.
+>If you don't use the latest versions of PowerShell and the Data Factory module, you could run into deserialization errors while running the commands.

-#### Log in to PowerShell
+#### Sign in to PowerShell

 1. Launch **PowerShell** on your machine. Keep PowerShell open until the end of this quickstart. If you close and reopen, you need to run these commands again.

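The sign-in step this hunk renames boils down to a short PowerShell sequence. A sketch under assumptions: `<SubscriptionId>` is an illustrative placeholder, not a value from the article.

```powershell
# Sign in with your Azure credentials; a browser or device-code prompt opens.
Connect-AzAccount

# If the account has access to several subscriptions, list them
# and select the one to use for this quickstart.
Get-AzSubscription
Select-AzSubscription -SubscriptionId "<SubscriptionId>"
```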
@@ -61,15 +61,15 @@ Install the latest Azure PowerShell modules by following instructions in [How to
 $resourceGroupName = "ADFQuickStartRG";
 ```

-If the resource group already exists, you may not want to overwrite it. Assign a different value to the `$ResourceGroupName` variable and run the command again
+If the resource group already exists, you might not want to overwrite it. Assign a different value to the `$ResourceGroupName` variable and run the command again.

 2. To create the Azure resource group, run the following command:

 ```powershell
 $ResGrp = New-AzResourceGroup $resourceGroupName -location 'East US'
 ```

-If the resource group already exists, you may not want to overwrite it. Assign a different value to the `$ResourceGroupName` variable and run the command again.
+If the resource group already exists, you might not want to overwrite it. Assign a different value to the `$ResourceGroupName` variable and run the command again.

 3. Define a variable for the data factory name.

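The advice about not overwriting an existing resource group can be made concrete with a defensive check. This is a sketch, not text from the article; the variable names follow the article's convention.

```powershell
$resourceGroupName = "ADFQuickStartRG"

# Only create the group if the name isn't already in use in this subscription.
$existing = Get-AzResourceGroup -Name $resourceGroupName -ErrorAction SilentlyContinue
if ($null -eq $existing) {
    $ResGrp = New-AzResourceGroup -Name $resourceGroupName -Location 'East US'
} else {
    Write-Host "Resource group '$resourceGroupName' already exists; pick another name."
}
```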
@@ -95,7 +95,7 @@ Note the following points:
 The specified Data Factory name 'ADFv2QuickStartDataFactory' is already in use. Data Factory names must be globally unique.
 ```

-* To create Data Factory instances, the user account you use to log in to Azure must be a member of **contributor** or **owner** roles, or an **administrator** of the Azure subscription.
+* To create Data Factory instances, the user account you use to sign in to Azure must be a member of the **contributor** or **owner** role, or an **administrator** of the Azure subscription.

 * For a list of Azure regions in which Data Factory is currently available, select the regions that interest you on the following page, and then expand **Analytics** to locate **Data Factory**: [Products available by region](https://azure.microsoft.com/global-infrastructure/services/). The data stores (Azure Storage, Azure SQL Database, etc.) and computes (HDInsight, etc.) used by data factory can be in other regions.

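Because Data Factory names must be globally unique, the article's creation step appends a random suffix before calling `Set-AzDataFactoryV2`. A sketch of that pattern, assuming `$ResGrp` was returned by `New-AzResourceGroup` earlier:

```powershell
# Append a random suffix so the factory name is likely to be globally unique.
$dataFactoryName = "ADFQuickStartFactory" + (Get-Random)

# Create the data factory in the resource group and region chosen earlier.
$DataFactory = Set-AzDataFactoryV2 -ResourceGroupName $ResGrp.ResourceGroupName `
    -Location $ResGrp.Location -Name $dataFactoryName
```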
@@ -105,10 +105,10 @@ Note the following points:
 Create linked services in a data factory to link your data stores and compute services to the data factory. In this quickstart, you create an Azure Storage linked service that is used as both the source and sink stores. The linked service has the connection information that the Data Factory service uses at runtime to connect to it.

 >[!TIP]
->In this quickstart, you use *Account key* as the authentication type for your data store, but you can choose other supported authentication methods: *SAS URI*,*Service Principal* and *Managed Identity* if needed. Refer to corresponding sections in [this article](./connector-azure-blob-storage.md#linked-service-properties) for details.
+>In this quickstart, you use *Account key* as the authentication type for your data store, but you can choose other supported authentication methods: *SAS URI*, *Service Principal*, and *Managed Identity* if needed. Refer to the corresponding sections in [this article](./connector-azure-blob-storage.md#linked-service-properties) for details.
 >To store secrets for data stores securely, it's also recommended to use an Azure Key Vault. Refer to [this article](./store-credentials-in-key-vault.md) for detailed illustrations.

-1. Create a JSON file named **AzureStorageLinkedService.json** in **C:\ADFv2QuickStartPSH** folder with the following content: (Create the folder ADFv2QuickStartPSH if it does not already exist.).
+1. Create a JSON file named **AzureStorageLinkedService.json** in the **C:\ADFv2QuickStartPSH** folder with the following content. (Create the folder ADFv2QuickStartPSH if it doesn't already exist.)

 > [!IMPORTANT]
 > Replace <accountName> and <accountKey> with name and key of your Azure storage account before saving the file.
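The diff elides the JSON body of this step. For orientation, an *Account key* Azure Blob Storage linked service of the kind this step creates looks roughly like the following sketch; it reuses the article's `<accountName>`/`<accountKey>` placeholders and is not quoted from the article.

```json
{
    "name": "AzureStorageLinkedService",
    "properties": {
        "annotations": [],
        "type": "AzureBlobStorage",
        "typeProperties": {
            "connectionString": "DefaultEndpointsProtocol=https;AccountName=<accountName>;AccountKey=<accountKey>;EndpointSuffix=core.windows.net"
        }
    }
}
```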
@@ -126,7 +126,7 @@ Create linked services in a data factory to link your data stores and compute se
 }
 ```

-If you are using Notepad, select **All files** for the **Save as type** filed in the **Save as** dialog box. Otherwise, it may add `.txt` extension to the file. For example, `AzureStorageLinkedService.json.txt`. If you create the file in File Explorer before opening it in Notepad, you may not see the `.txt` extension since the **Hide extensions for known files types** option is set by default. Remove the `.txt` extension before proceeding to the next step.
+If you're using Notepad, select **All files** for the **Save as type** field in the **Save as** dialog box. Otherwise, it might add a `.txt` extension to the file, for example, `AzureStorageLinkedService.json.txt`. If you create the file in File Explorer before opening it in Notepad, you might not see the `.txt` extension because the **Hide extensions for known file types** option is set by default. Remove the `.txt` extension before proceeding to the next step.

 2. In **PowerShell**, switch to the **ADFv2QuickStartPSH** folder.

@@ -142,7 +142,7 @@ Create linked services in a data factory to link your data stores and compute se
 -DefinitionFile ".\AzureStorageLinkedService.json"
 ```

-Here is the sample output:
+Here's the sample output:

 ```console
 LinkedServiceName : AzureStorageLinkedService
@@ -155,7 +155,8 @@ Create linked services in a data factory to link your data stores and compute se

 In this procedure, you create two datasets: **InputDataset** and **OutputDataset**. These datasets are of type **Binary**. They refer to the Azure Storage linked service that you created in the previous section.
 The input dataset represents the source data in the input folder. In the input dataset definition, you specify the blob container (**adftutorial**), the folder (**input**), and the file (**emp.txt**) that contain the source data.
-The output dataset represents the data that's copied to the destination. In the output dataset definition, you specify the blob container (**adftutorial**), the folder (**output**), and the file to which the data is copied.
+The output dataset represents the data that's copied to the destination. In the output dataset definition, you specify the blob container (**adftutorial**), the folder (**output**), and the file to which the data is copied.
+
 1. Create a JSON file named **InputDataset.json** in the **C:\ADFv2QuickStartPSH** folder, with the following content:

 ```json
@@ -188,7 +189,7 @@ The output dataset represents the data that's copied to the destination. In the
 -DefinitionFile ".\InputDataset.json"
 ```

-Here is the sample output:
+Here's the sample output:

 ```console
 DatasetName : InputDataset
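The Binary dataset created in the hunks above refers to the linked service and the blob location named earlier in the article (`adftutorial`/`input`/`emp.txt`). A sketch of what **InputDataset.json** plausibly contains; the exact body is elided from this diff.

```json
{
    "name": "InputDataset",
    "properties": {
        "linkedServiceName": {
            "referenceName": "AzureStorageLinkedService",
            "type": "LinkedServiceReference"
        },
        "annotations": [],
        "type": "Binary",
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "fileName": "emp.txt",
                "folderPath": "input",
                "container": "adftutorial"
            }
        }
    }
}
```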
@@ -229,7 +230,7 @@ The output dataset represents the data that's copied to the destination. In the
 -DefinitionFile ".\OutputDataset.json"
 ```

-Here is the sample output:
+Here's the sample output:

 ```console
 DatasetName : OutputDataset
@@ -343,7 +344,7 @@ $RunId = Invoke-AzDataFactoryV2Pipeline `
 }
 ```

-Here is the sample output of pipeline run:
+Here's the sample output of the pipeline run:

 ```console
 Pipeline is running...status: InProgress
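The `InProgress` status line above comes from a polling loop in the article. A sketch of how such a loop over `Get-AzDataFactoryV2PipelineRun` might look, assuming `$RunId` holds the value returned by `Invoke-AzDataFactoryV2Pipeline` and `$ResGrp`/`$dataFactoryName` were set in the earlier steps:

```powershell
# Poll the pipeline run every 30 seconds until it leaves the InProgress state.
while ($true) {
    $run = Get-AzDataFactoryV2PipelineRun -ResourceGroupName $ResGrp.ResourceGroupName `
        -DataFactoryName $dataFactoryName -PipelineRunId $RunId
    if ($run.Status -eq 'InProgress') {
        Write-Host "Pipeline is running...status: InProgress"
        Start-Sleep -Seconds 30
    } else {
        Write-Host ("Pipeline run finished. Status: " + $run.Status)
        break
    }
}
```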
