
Commit 5584330

Initial edit pass

1 parent 41f8d69 commit 5584330

File tree

1 file changed: +16 -20 lines changed


articles/data-factory/tutorial-copy-data-portal.md

Lines changed: 16 additions & 20 deletions
@@ -3,7 +3,7 @@ title: Use the Azure portal to create a data factory pipeline
 description: This tutorial provides step-by-step instructions for using the Azure portal to create a data factory with a pipeline. The pipeline uses the copy activity to copy data from Azure Blob storage to Azure SQL Database.
 author: jianleishen
 ms.topic: tutorial
-ms.date: 10/03/2024
+ms.date: 04/25/2025
 ms.subservice: data-movement
 ms.author: jianleishen
 ---
@@ -28,6 +28,7 @@ In this tutorial, you perform the following steps:
 > * Monitor the pipeline and activity runs.
 
 ## Prerequisites
+
 * **Azure subscription**. If you don't have an Azure subscription, create a [free Azure account](https://azure.microsoft.com/free/) before you begin.
 * **Azure storage account**. You use Blob storage as a *source* data store. If you don't have a storage account, see [Create an Azure storage account](../storage/common/storage-account-create.md) for steps to create one.
 * **Azure SQL Database**. You use the database as a *sink* data store. If you don't have a database in Azure SQL Database, see [Create a database in Azure SQL Database](/azure/azure-sql/database/single-database-create-quickstart) for steps to create one.
@@ -38,15 +39,16 @@ Now, prepare your Blob storage and SQL database for the tutorial by performing t
 
 #### Create a source blob
 
-1. Launch Notepad. Copy the following text, and save it as an **emp.txt** file on your disk:
+1. Launch Notepad. Copy the following text, and save it as an **emp.txt** file:
 
    ```
   FirstName,LastName
   John,Doe
   Jane,Doe
   ```
 
-1. Create a container named **adftutorial** in your Blob storage. Create a folder named **input** in this container. Then, upload the **emp.txt** file to the **input** folder. Use the Azure portal or tools such as [Azure Storage Explorer](https://storageexplorer.com/) to do these tasks.
+1. Move that file into a folder named **input**.
+1. Create a container named **adftutorial** in your Blob storage. Upload your **input** folder with the **emp.txt** file to this container. You can use the Azure portal or tools such as [Azure Storage Explorer](https://storageexplorer.com/) to do these tasks.
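If you prefer to script the file preparation, the two steps above can be sketched locally. This is a hypothetical helper, not part of the tutorial; the upload to the **adftutorial** container still happens through the portal or Storage Explorer:

```python
from pathlib import Path

# Write the sample CSV and place it in an "input" folder, mirroring the
# layout that gets uploaded to the adftutorial container.
input_dir = Path("input")
input_dir.mkdir(exist_ok=True)
emp = input_dir / "emp.txt"
emp.write_text("FirstName,LastName\nJohn,Doe\nJane,Doe\n")

print(emp.read_text().splitlines())
# → ['FirstName,LastName', 'John,Doe', 'Jane,Doe']
```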
 
 #### Create a sink SQL table
 
@@ -64,13 +66,14 @@ Now, prepare your Blob storage and SQL database for the tutorial by performing t
    CREATE CLUSTERED INDEX IX_emp_ID ON dbo.emp (ID);
    ```
 
-1. Allow Azure services to access SQL Server. Ensure that **Allow access to Azure services** is turned **ON** for your SQL Server so that Data Factory can write data to your SQL Server. To verify and turn on this setting, go to logical SQL server > Overview > Set server firewall> set the **Allow access to Azure services** option to **ON**.
+1. Allow Azure services to access SQL Server. Ensure that **Allow access to Azure services** is turned **ON** for your SQL Server so that Data Factory can write data to your SQL Server. To verify and turn on this setting, go to your SQL Server in the Azure portal, select **Security** > **Networking**, enable **Selected networks**, and then check **Allow Azure services and resources to access this server** under **Exceptions**.
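To make the goal of this preparation concrete, here is a hypothetical, offline stand-in that uses SQLite in place of Azure SQL Database to show the rows the copy activity should land in **dbo.emp** once the pipeline runs (SQLite's AUTOINCREMENT approximates IDENTITY(1,1)):

```python
import sqlite3

# Local simulation only: the real sink is Azure SQL Database.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE emp ("
    "ID INTEGER PRIMARY KEY AUTOINCREMENT, "
    "FirstName TEXT, LastName TEXT)"
)
# The two data rows from emp.txt.
conn.executemany(
    "INSERT INTO emp (FirstName, LastName) VALUES (?, ?)",
    [("John", "Doe"), ("Jane", "Doe")],
)
print(conn.execute("SELECT ID, FirstName, LastName FROM emp").fetchall())
# → [(1, 'John', 'Doe'), (2, 'Jane', 'Doe')]
```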
 
 ## Create a data factory
+
 In this step, you create a data factory and start the Data Factory UI to create a pipeline in the data factory.
 
 1. Open **Microsoft Edge** or **Google Chrome**. Currently, Data Factory UI is supported only in Microsoft Edge and Google Chrome web browsers.
-2. On the left menu, select **Create a resource** > **Integration** > **Data Factory**.
+2. On the left menu, select **Create a resource** > **Analytics** > **Data Factory**.
 3. On the **Create Data Factory** page, under the **Basics** tab, select the Azure **Subscription** in which you want to create the data factory.
 4. For **Resource Group**, take one of the following steps:
 
@@ -79,28 +82,20 @@ In this step, you create a data factory and start the Data Factory UI to create
    b. Select **Create new**, and enter the name of a new resource group.
 
    To learn about resource groups, see [Use resource groups to manage your Azure resources](../azure-resource-manager/management/overview.md).
-5. Under **Region**, select a location for the data factory. Only locations that are supported are displayed in the drop-down list. The data stores (for example, Azure Storage and SQL Database) and computes (for example, Azure HDInsight) used by the data factory can be in other regions.
-6. Under **Name**, enter **ADFTutorialDataFactory**.
-
-   The name of the Azure data factory must be *globally unique*. If you receive an error message about the name value, enter a different name for the data factory. (for example, yournameADFTutorialDataFactory). For naming rules for Data Factory artifacts, see [Data Factory naming rules](naming-rules.md).
+5. Under **Region**, select a location for the data factory. Your data stores can be in a different region from your data factory if they need to be.
+6. Under **Name**, enter a *globally unique* name for the data factory. If you receive an error message about the name value, enter a different name (for example, **yournameADFDemo**). For naming rules for Data Factory artifacts, see [Data Factory naming rules](naming-rules.md).
 
    :::image type="content" source="./media/doc-common-process/name-not-available-error.png" alt-text="New data factory error message for duplicate name.":::
 
 7. Under **Version**, select **V2**.
 8. Select the **Git configuration** tab on the top, and select the **Configure Git later** check box.
 9. Select **Review + create**, and select **Create** after the validation is passed.
 10. After the creation is finished, you see the notice in Notifications center. Select **Go to resource** to navigate to the Data factory page.
-11. Select **Open** on the **Open Azure Data Factory Studio** tile to launch the Azure Data Factory UI in a separate tab.
-
+11. Select **Launch Studio** on the **Azure Data Factory Studio** tile.
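The naming constraint in step 6 can be pre-checked locally. The following is a minimal sketch under assumed rules (3-63 characters; letters, digits, and hyphens only; starts and ends with a letter or digit); global uniqueness can only be verified by Azure itself:

```python
import re

# Hypothetical pre-check for a data factory name; see the naming-rules
# article for the authoritative constraints.
NAME_RE = re.compile(r"^[A-Za-z0-9][A-Za-z0-9-]{1,61}[A-Za-z0-9]$")

def looks_like_valid_name(name: str) -> bool:
    return NAME_RE.match(name) is not None

print(looks_like_valid_name("yournameADFDemo"))  # → True
print(looks_like_valid_name("-bad-name-"))       # → False
```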
 
 ## Create a pipeline
-In this step, you create a pipeline with a copy activity in the data factory. The copy activity copies data from Blob storage to SQL Database. In the [Quickstart tutorial](quickstart-create-data-factory-portal.md), you created a pipeline by following these steps:
-
-1. Create the linked service.
-1. Create input and output datasets.
-1. Create a pipeline.
 
-In this tutorial, you start with creating the pipeline. Then you create linked services and datasets when you need them to configure the pipeline.
+In this step, you create a pipeline with a copy activity in the data factory. The copy activity copies data from Blob storage to SQL Database.
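Behind the portal UI, a pipeline with one copy activity is stored as a JSON document. The following is a rough, illustrative sketch of its shape; the dataset names match this tutorial, but the type strings and structure are assumptions rather than exact portal output:

```python
import json

# Approximate pipeline definition for a single Blob-to-SQL copy activity.
pipeline = {
    "name": "CopyPipeline",
    "properties": {
        "activities": [
            {
                "name": "CopyFromBlobToSql",
                "type": "Copy",
                "inputs": [{"referenceName": "SourceBlobDataset", "type": "DatasetReference"}],
                "outputs": [{"referenceName": "OutputSqlDataset", "type": "DatasetReference"}],
                "typeProperties": {
                    "source": {"type": "DelimitedTextSource"},
                    "sink": {"type": "AzureSqlSink"},
                },
            }
        ]
    },
}
print(json.dumps(pipeline, indent=2))
```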
 
 1. On the home page, select **Orchestrate**.
 
@@ -122,7 +117,7 @@ In this tutorial, you start with creating the pipeline. Then you create linked s
 
 1. In the **New Dataset** dialog box, select **Azure Blob Storage**, and then select **Continue**. The source data is in Blob storage, so you select **Azure Blob Storage** for the source dataset.
 
-1. In the **Select Format** dialog box, choose the format type of your data, and then select **Continue**.
+1. In the **Select Format** dialog box, choose **Delimited Text**, and then select **Continue**.
 
 1. In the **Set Properties** dialog box, enter **SourceBlobDataset** for Name. Select the checkbox for **First row as header**. Under the **Linked service** text box, select **+ New**.
 
@@ -137,15 +132,16 @@ In this tutorial, you start with creating the pipeline. Then you create linked s
    :::image type="content" source="./media/tutorial-copy-data-portal/source-dataset-selected.png" alt-text="Source dataset":::
 
 ### Configure sink
+
 >[!TIP]
 >In this tutorial, you use *SQL authentication* as the authentication type for your sink data store, but you can choose other supported authentication methods, such as *Service Principal* and *Managed Identity*, if needed. Refer to the corresponding sections in [this article](./connector-azure-sql-database.md#linked-service-properties) for details.
 >To store secrets for data stores securely, it's also recommended to use an Azure Key Vault. Refer to [this article](./store-credentials-in-key-vault.md) for detailed illustrations.
 
 1. Go to the **Sink** tab, and select **+ New** to create a sink dataset.
 
-1. In the **New Dataset** dialog box, input "SQL" in the search box to filter the connectors, select **Azure SQL Database**, and then select **Continue**. In this tutorial, you copy data to a SQL database.
+1. In the **New Dataset** dialog box, enter "SQL" in the search box to filter the connectors, select **Azure SQL Database**, and then select **Continue**.
 
-1. In the **Set Properties** dialog box, enter **OutputSqlDataset** for Name. From the **Linked service** dropdown list, select **+ New**. A dataset must be associated with a linked service. The linked service has the connection string that Data Factory uses to connect to SQL Database at runtime. The dataset specifies the container, folder, and the file (optional) to which the data is copied.
+1. In the **Set Properties** dialog box, enter **OutputSqlDataset** for Name. From the **Linked service** dropdown list, select **+ New**. A dataset must be associated with a linked service. The linked service has the connection string that Data Factory uses to connect to SQL Database at runtime, and the dataset specifies where the data is copied.
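For reference, the linked service created in the next step is also stored as JSON. The following is an approximate, hypothetical sketch of its shape; the name and angle-bracket placeholders are illustrative, not exact portal output:

```python
import json

# Approximate Azure SQL Database linked service definition.
linked_service = {
    "name": "AzureSqlDatabaseLinkedService",
    "properties": {
        "type": "AzureSqlDatabase",
        "typeProperties": {
            "connectionString": (
                "Server=tcp:<servername>.database.windows.net,1433;"
                "Database=<databasename>;"
                "User ID=<username>;Password=<password>;"
            )
        },
    },
}
print(json.dumps(linked_service, indent=2))
```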
 
 1. In the **New Linked Service (Azure SQL Database)** dialog box, take the following steps:
