articles/data-factory/tutorial-copy-data-portal.md
---
title: Use the Azure portal to create a data factory pipeline
description: This tutorial provides step-by-step instructions for using the Azure portal to create a data factory with a pipeline. The pipeline uses the copy activity to copy data from Azure Blob storage to Azure SQL Database.
author: jianleishen
ms.topic: tutorial
ms.date: 04/25/2025
ms.subservice: data-movement
ms.author: jianleishen
---
In this tutorial, you perform the following steps:
> * Monitor the pipeline and activity runs.
## Prerequisites
* **Azure subscription**. If you don't have an Azure subscription, create a [free Azure account](https://azure.microsoft.com/free/) before you begin.
* **Azure storage account**. You use Blob storage as a *source* data store. If you don't have a storage account, see [Create an Azure storage account](../storage/common/storage-account-create.md) for steps to create one.
* **Azure SQL Database**. You use the database as a *sink* data store. If you don't have a database in Azure SQL Database, see [Create a database in Azure SQL Database](/azure/azure-sql/database/single-database-create-quickstart) for steps to create one.
Now, prepare your Blob storage and SQL database for the tutorial by performing the following steps:
#### Create a source blob
1. Launch Notepad. Copy the following text, and save it as an **emp.txt** file:
    ```
    FirstName,LastName
    John,Doe
    Jane,Doe
    ```
1. Move the **emp.txt** file into a folder named **input**.
1. Create a container named **adftutorial** in your Blob storage. Upload your **input** folder with the **emp.txt** file to this container. You can use the Azure portal or tools such as [Azure Storage Explorer](https://storageexplorer.com/) to do these tasks.
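If you prefer scripting these preparation steps over using Notepad, the following sketch produces the same **emp.txt** file in a local **input** folder with Python's standard library. The local path is illustrative; after running it, upload the folder with the Azure portal or Azure Storage Explorer as described above.

```python
# Create the tutorial's emp.txt source file in a local "input" folder.
import csv
import os

os.makedirs("input", exist_ok=True)  # local staging folder, path is illustrative

rows = [
    ["FirstName", "LastName"],
    ["John", "Doe"],
    ["Jane", "Doe"],
]

with open(os.path.join("input", "emp.txt"), "w", newline="") as f:
    csv.writer(f).writerows(rows)

# Then upload the input folder to the adftutorial container, for example
# with the Azure portal or Azure Storage Explorer.
print(open(os.path.join("input", "emp.txt")).read())
```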
#### Create a sink SQL table
1. Use the following SQL script to create the **dbo.emp** table in your database:

    ```sql
    CREATE TABLE dbo.emp
    (
        ID int IDENTITY(1,1) NOT NULL,
        FirstName varchar(50),
        LastName varchar(50)
    )
    GO

    CREATE CLUSTERED INDEX IX_emp_ID ON dbo.emp (ID);
    ```
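To reason about what the copy activity will write into the **dbo.emp** sink, you can simulate the table locally. This sketch uses SQLite purely as a stand-in (an assumption for local illustration, not the tutorial's sink); SQLite's `INTEGER PRIMARY KEY AUTOINCREMENT` plays the role of the T-SQL identity column:

```python
# Local illustration only: simulate the dbo.emp sink table with SQLite.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    """
    CREATE TABLE emp (
        ID INTEGER PRIMARY KEY AUTOINCREMENT,  -- stand-in for IDENTITY(1,1)
        FirstName TEXT,
        LastName TEXT
    )
    """
)

# Rows the copy activity would write from emp.txt (the header row is excluded
# because the source dataset treats the first row as column names).
conn.executemany(
    "INSERT INTO emp (FirstName, LastName) VALUES (?, ?)",
    [("John", "Doe"), ("Jane", "Doe")],
)

print(conn.execute("SELECT ID, FirstName, LastName FROM emp").fetchall())
# → [(1, 'John', 'Doe'), (2, 'Jane', 'Doe')]
```

The identity column is populated by the database, not by the copied data, which is why the source file has only two columns while the sink table has three.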
1. Allow Azure services to access SQL Server. Ensure that **Allow access to Azure services** is turned **ON** for your SQL Server so that Data Factory can write data to it. To verify and turn on this setting, go to your SQL server in the Azure portal, select **Security** > **Networking**, enable **Selected networks**, and then select the **Allow Azure services and resources to access this server** check box under **Exceptions**.
## Create a data factory
72
+
70
73
In this step, you create a data factory and start the Data Factory UI to create a pipeline in the data factory.
1. Open **Microsoft Edge** or **Google Chrome**. Currently, Data Factory UI is supported only in Microsoft Edge and Google Chrome web browsers.
2. On the left menu, select **Create a resource** > **Analytics** > **Data Factory**.
3. On the **Create Data Factory** page, on the **Basics** tab, select the Azure **Subscription** in which you want to create the data factory.
4. For **Resource Group**, take one of the following steps:
    a. Select an existing resource group from the drop-down list.
b. Select **Create new**, and enter the name of a new resource group.
To learn about resource groups, see [Use resource groups to manage your Azure resources](../azure-resource-manager/management/overview.md).
5. Under **Region**, select a location for the data factory. Your data stores can be in a different region than your data factory if needed.
6. Under **Name**, enter a name for the data factory. The name must be *globally unique*. If you receive an error message about the name value, enter a different name (for example, **yournameADFDemo**). For naming rules for Data Factory artifacts, see [Data Factory naming rules](naming-rules.md).
:::image type="content" source="./media/doc-common-process/name-not-available-error.png" alt-text="New data factory error message for duplicate name.":::
7. Under **Version**, select **V2**.
8. Select the **Git configuration** tab, and then select the **Configure Git later** check box.
9. Select **Review + create**, and then select **Create** after validation passes.
10. After creation finishes, you see a notice in the Notifications center. Select **Go to resource** to go to the data factory page.
11. Select **Launch Studio** on the **Azure Data Factory Studio** tile.
## Create a pipeline
In this step, you create a pipeline with a copy activity in the data factory. The copy activity copies data from Blob storage to SQL Database.
1. On the home page, select **Orchestrate**.
1. In the **New Dataset** dialog box, select **Azure Blob Storage**, and then select **Continue**. The source data is in Blob storage, so you select **Azure Blob Storage** for the source dataset.
1. In the **Select Format** dialog box, choose **Delimited Text**, and then select **Continue**.
1. In the **Set Properties** dialog box, enter **SourceBlobDataset** for Name. Select the checkbox for **First row as header**. Under the **Linked service** text box, select **+ New**.
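The **First row as header** option tells the dataset to treat `FirstName,LastName` as column names rather than as a data row. A small Python sketch of the same idea, applied to the sample file's contents inline:

```python
# Illustrates what "First row as header" means: the first line supplies
# column names, and each remaining line becomes a keyed record.
import csv
import io

sample = "FirstName,LastName\nJohn,Doe\nJane,Doe\n"  # contents of emp.txt

records = list(csv.DictReader(io.StringIO(sample)))
print(records)
# → [{'FirstName': 'John', 'LastName': 'Doe'}, {'FirstName': 'Jane', 'LastName': 'Doe'}]
```

Without the header option, the first line would be copied into the sink as an ordinary row.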
> In this tutorial, you use *SQL authentication* as the authentication type for your sink data store, but you can choose other supported authentication methods, such as *Service Principal* and *Managed Identity*, if needed. Refer to the corresponding sections in [this article](./connector-azure-sql-database.md#linked-service-properties) for details.
> To store secrets for data stores securely, we also recommend that you use Azure Key Vault. Refer to [this article](./store-credentials-in-key-vault.md) for detailed illustrations.
1. Go to the **Sink** tab, and select **+ New** to create a sink dataset.
1. In the **New Dataset** dialog box, input "SQL" in the search box to filter the connectors, select **Azure SQL Database**, and then select **Continue**.
1. In the **Set Properties** dialog box, enter **OutputSqlDataset** for Name. From the **Linked service** dropdown list, select **+ New**. A dataset must be associated with a linked service. The linked service has the connection string that Data Factory uses to connect to SQL Database at runtime; the dataset specifies where the data is copied to.
1. In the **New Linked Service (Azure SQL Database)** dialog box, take the following steps: