Commit baaeb0f

Merge pull request #257260 from WilliamDAssafMSFT/patch-2
Update tutorial-copy-data-tool.md
2 parents 10a1066 + 5ac04c8 commit baaeb0f

File tree

1 file changed: +27 -23 lines changed


articles/data-factory/tutorial-copy-data-tool.md

Lines changed: 27 additions & 23 deletions
@@ -6,13 +6,11 @@ ms.author: jianleishen
 ms.service: data-factory
 ms.subservice: tutorials
 ms.topic: tutorial
-ms.custom: seo-lt-2019
-ms.date: 08/10/2023
+ms.date: 11/02/2023
 ---
 
 # Copy data from Azure Blob storage to a SQL Database by using the Copy Data tool
-
 [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
 
 In this tutorial, you use the Azure portal to create a data factory. Then you use the Copy Data tool to create a pipeline that copies data from Azure Blob storage to a SQL Database.
@@ -32,6 +30,15 @@ In this tutorial, you perform the following steps:
 * **Azure Storage account**: Use Blob storage as the _source_ data store. If you don't have an Azure Storage account, see the instructions in [Create a storage account](../storage/common/storage-account-create.md).
 * **Azure SQL Database**: Use a SQL Database as the _sink_ data store. If you don't have a SQL Database, see the instructions in [Create a SQL Database](/azure/azure-sql/database/single-database-create-quickstart).
 
+### Prepare the SQL database
+
+Allow Azure services to access the logical SQL server of your Azure SQL Database.
+
+1. Verify that the setting **Allow Azure services and resources to access this server** is enabled for the server that's running SQL Database. This setting lets Data Factory write data to your database instance. To verify and turn on this setting, go to your logical SQL server > **Security** > **Firewalls and virtual networks**, and set the **Allow Azure services and resources to access this server** option to **ON**.
+
+> [!NOTE]
+> The **Allow Azure services and resources to access this server** option enables network access to your SQL server from any Azure resource, not just those in your subscription. It might not be appropriate for all environments, but it is appropriate for this limited tutorial. For more information, see [Azure SQL Server firewall rules](/azure/azure-sql/database/firewall-configure). Alternatively, you can use [private endpoints](../private-link/private-endpoint-overview.md) to connect to Azure PaaS services without using public IPs.
+
 ### Create a blob and a SQL table
 
 Prepare your Blob storage and your SQL Database for the tutorial by performing these steps.
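The blob-creation steps themselves are elided from this diff, but the shape of the source file matters for the later mapping steps. A minimal local sketch of plausible source content follows; the rows and the idea of a header line are illustrative assumptions, not taken from the diff:

```python
import csv
import io

# Sketch of the delimited source data the tutorial copies from Blob storage.
# The actual file name and rows are elided from this diff, so these values
# are illustrative assumptions.
rows = [("John", "Doe"), ("Jane", "Doe")]

buf = io.StringIO()
writer = csv.writer(buf, lineterminator="\n")
writer.writerow(["FirstName", "LastName"])  # header row ("First row as header")
writer.writerows(rows)
emp_txt = buf.getvalue()
print(emp_txt)
```

A file with this content, uploaded to a Blob container, is what the Copy Data tool later reads as the source dataset.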
@@ -50,7 +57,7 @@ Prepare your Blob storage and your SQL Database for the tutorial by performing t
 
 #### Create a sink SQL table
 
-1. Use the following SQL script to create a table named **dbo.emp** in your SQL Database:
+1. Use the following SQL script to create a table named `dbo.emp` in your SQL Database:
 
 ```sql
 CREATE TABLE dbo.emp
@@ -63,22 +70,18 @@ Prepare your Blob storage and your SQL Database for the tutorial by performing t
 CREATE CLUSTERED INDEX IX_emp_ID ON dbo.emp (ID);
 ```
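For readers without an Azure SQL instance handy, the table's shape can be tried locally with Python's built-in sqlite3 module. This is only a stand-in: `IDENTITY(1,1)` and `CREATE CLUSTERED INDEX` are T-SQL features, so SQLite equivalents are substituted.

```python
import sqlite3

# Local stand-in for the tutorial's dbo.emp sink table. SQLite has no
# IDENTITY(1,1) or clustered indexes, so INTEGER PRIMARY KEY AUTOINCREMENT
# approximates the auto-numbered ID column.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE emp (
        ID        INTEGER PRIMARY KEY AUTOINCREMENT,
        FirstName VARCHAR(50),
        LastName  VARCHAR(50)
    )
""")

conn.executemany(
    "INSERT INTO emp (FirstName, LastName) VALUES (?, ?)",
    [("John", "Doe"), ("Jane", "Doe")],
)
print(conn.execute("SELECT ID, FirstName, LastName FROM emp").fetchall())
# → [(1, 'John', 'Doe'), (2, 'Jane', 'Doe')]
```

The pipeline itself targets the real `dbo.emp` table in Azure SQL Database; this sketch is only for experimenting with the data shape locally.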
 
-2. Allow Azure services to access SQL Server. Verify that the setting **Allow Azure services and resources to access this server** is enabled for your server that's running SQL Database. This setting lets Data Factory write data to your database instance. To verify and turn on this setting, go to logical SQL server > Security > Firewalls and virtual networks > set the **Allow Azure services and resources to access this server** option to **ON**.
-
-> [!NOTE]
-> The option to **Allow Azure services and resources to access this server** enables network access to your SQL Server from any Azure resource, not just those in your subscription. For more information, see [Azure SQL Server Firewall rules](/azure/azure-sql/database/firewall-configure). Instead, you can use [Private endpoints](../private-link/private-endpoint-overview.md) to connect to Azure PaaS services without using public IPs.
 
 ## Create a data factory
 
 1. On the left menu, select **Create a resource** > **Integration** > **Data Factory**:
 
-:::image type="content" source="./media/doc-common-process/new-azure-data-factory-menu.png" alt-text="New data factory creation":::
+:::image type="content" source="./media/doc-common-process/new-azure-data-factory-menu.png" alt-text="Screenshot of the New data factory creation.":::
 
 1. On the **New data factory** page, under **Name**, enter **ADFTutorialDataFactory**.
 
 The name for your data factory must be _globally unique_. You might receive the following error message:
 
-:::image type="content" source="./media/doc-common-process/name-not-available-error.png" alt-text="New data factory error message for duplicate name.":::
+:::image type="content" source="./media/doc-common-process/name-not-available-error.png" alt-text="Screenshot of the New data factory error message for duplicate name.":::
 
 If you receive an error message about the name value, enter a different name for the data factory. For example, use the name _**yourname**_**ADFTutorialDataFactory**. For the naming rules for Data Factory artifacts, see [Data Factory naming rules](naming-rules.md).
 

@@ -100,7 +103,7 @@ Prepare your Blob storage and your SQL Database for the tutorial by performing t
 
 1. After creation is finished, the **Data Factory** home page is displayed.
 
-:::image type="content" source="./media/doc-common-process/data-factory-home-page.png" alt-text="Home page for the Azure Data Factory, with the Open Azure Data Factory Studio tile.":::
+:::image type="content" source="./media/doc-common-process/data-factory-home-page.png" alt-text="Screenshot of the Home page for the Azure Data Factory, with the Open Azure Data Factory Studio tile.":::
 
 1. To launch the Azure Data Factory user interface (UI) in a separate tab, select **Open** on the **Open Azure Data Factory Studio** tile.
 
@@ -112,7 +115,7 @@ Prepare your Blob storage and your SQL Database for the tutorial by performing t
 
 1. On the **Properties** page of the Copy Data tool, choose **Built-in copy task** under **Task type**, then select **Next**.
 
-:::image type="content" source="./media/tutorial-copy-data-tool/copy-data-tool-properties-page.png" alt-text="Screenshot that shows the Properties page":::
+:::image type="content" source="./media/tutorial-copy-data-tool/copy-data-tool-properties-page.png" alt-text="Screenshot that shows the Properties page.":::
 
 1. On the **Source data store** page, complete the following steps:
 
@@ -128,11 +131,11 @@ Prepare your Blob storage and your SQL Database for the tutorial by performing t
 
 f. Select **Next** to move to the next step.
 
-:::image type="content" source="./media/tutorial-copy-data-tool/source-data-store.png" alt-text="Configure the source.":::
+:::image type="content" source="./media/tutorial-copy-data-tool/source-data-store.png" alt-text="Screenshot of the page to Configure the source.":::
 
 1. On the **File format settings** page, enable the checkbox for *First row as header*. Notice that the tool automatically detects the column and row delimiters, and that you can preview data and view the schema of the input data by selecting the **Preview data** button on this page. Then select **Next**.
 
-:::image type="content" source="./media/tutorial-copy-data-tool/file-format-settings-page.png" alt-text="File format settings":::
+:::image type="content" source="./media/tutorial-copy-data-tool/file-format-settings-page.png" alt-text="Screenshot of the File format settings.":::
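The delimiter and header auto-detection mentioned in the File format settings step can be approximated locally with Python's standard `csv.Sniffer`; the sample text below is an assumption, not the tutorial's actual blob content:

```python
import csv

# Rough local analogue of the Copy Data tool's format detection on the
# File format settings page; the sample text is illustrative.
sample = "FirstName,LastName\nJohn,Doe\nJane,Doe\n"

sniffer = csv.Sniffer()
dialect = sniffer.sniff(sample)          # infers the column delimiter
has_header = sniffer.has_header(sample)  # guesses whether row 1 is a header

print(dialect.delimiter, has_header)
```

`has_header` is only a heuristic guess, which is one reason the Copy Data tool still lets you set **First row as header** manually.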

 1. On the **Destination data store** page, complete the following steps:

@@ -142,41 +145,42 @@ Prepare your Blob storage and your SQL Database for the tutorial by performing t
 
 c. On the **New connection (Azure SQL Database)** page, select your Azure subscription, server name, and database name from the dropdown list. Then select **SQL authentication** under **Authentication type**, specify the username and password, test the connection, and select **Create**.
 
-:::image type="content" source="./media/tutorial-copy-data-tool/config-azure-sql-db.png" alt-text="Configure Azure SQL DB":::
+:::image type="content" source="./media/tutorial-copy-data-tool/config-azure-sql-db.png" alt-text="Screenshot of the Configure Azure SQL Database page.":::
 
 d. Select the newly created linked service as the sink, then select **Next**.
 
-1. On the **Destination data store** page, select **Use existing table** and select the **dbo.emp** table. Then select **Next**.
+1. On the **Destination data store** page, select **Use existing table** and select the `dbo.emp` table. Then select **Next**.
 
 1. On the **Column mapping** page, notice that the second and the third columns in the input file are mapped to the **FirstName** and **LastName** columns of the **emp** table. Adjust the mapping to make sure that there is no error, and then select **Next**.
 
-:::image type="content" source="./media/tutorial-copy-data-tool/column-mapping.png" alt-text="Column mapping page":::
+:::image type="content" source="./media/tutorial-copy-data-tool/column-mapping.png" alt-text="Screenshot of the column mapping page.":::
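The mapping the tool applies in this step, positional input columns onto named `emp` columns, can be pictured with a small helper. `map_columns` and the sample row are hypothetical illustrations, not Data Factory code:

```python
# Illustrative sketch of the column mapping step: positional columns from
# the input file are mapped onto named sink columns of the emp table.
def map_columns(row, mapping):
    """Map a positional source row onto named sink columns."""
    return {column: row[index] for index, column in mapping.items()}

# Column 2 -> FirstName, column 3 -> LastName (1-based in the UI,
# 0-based here). The leading column of the sample row is assumed unused.
mapping = {1: "FirstName", 2: "LastName"}

source_row = ["1", "John", "Doe"]  # assumed shape of one input record
print(map_columns(source_row, mapping))  # → {'FirstName': 'John', 'LastName': 'Doe'}
```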
 
 1. On the **Settings** page, under **Task name**, enter **CopyFromBlobToSqlPipeline**, and then select **Next**.
 
-:::image type="content" source="./media/tutorial-copy-data-tool/settings.png" alt-text="Configure the settings.":::
+:::image type="content" source="./media/tutorial-copy-data-tool/settings.png" alt-text="Screenshot of the settings.":::
 
 1. On the **Summary** page, review the settings, and then select **Next**.
 
 1. On the **Deployment** page, select **Monitor** to monitor the pipeline (task).
 
-:::image type="content" source="./media/tutorial-copy-data-tool/monitor-pipeline.png" alt-text="Monitor pipeline":::
+:::image type="content" source="./media/tutorial-copy-data-tool/monitor-pipeline.png" alt-text="Screenshot of Monitoring the pipeline.":::
 
 1. On the **Pipeline runs** page, select **Refresh** to refresh the list. Select the link under **Pipeline name** to view activity run details or rerun the pipeline.
 
-:::image type="content" source="./media/tutorial-copy-data-tool/pipeline-run.png" alt-text="Pipeline run":::
+:::image type="content" source="./media/tutorial-copy-data-tool/pipeline-run.png" alt-text="Screenshot of the Pipeline run.":::
 
 1. On the **Activity runs** page, select the **Details** link (the eyeglasses icon) under the **Activity name** column for more details about the copy operation. To go back to the **Pipeline runs** view, select the **All pipeline runs** link in the breadcrumb menu. To refresh the view, select **Refresh**.
 
-:::image type="content" source="./media/tutorial-copy-data-tool/activity-monitoring.png" alt-text="Monitor activity runs":::
+:::image type="content" source="./media/tutorial-copy-data-tool/activity-monitoring.png" alt-text="Screenshot of monitoring activity runs.":::
 
 1. Verify that the data is inserted into the **dbo.emp** table in your SQL Database.
 
 1. Select the **Author** tab on the left to switch to the editor mode. You can update the linked services, datasets, and pipelines that were created via the tool by using the editor. For details on editing these entities in the Data Factory UI, see [the Azure portal version of this tutorial](tutorial-copy-data-portal.md).
 
-:::image type="content" source="./media/tutorial-copy-data-tool/author-tab.png" alt-text="Select Author tab":::
+:::image type="content" source="./media/tutorial-copy-data-tool/author-tab.png" alt-text="Screenshot of the Select Author tab.":::
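A simple way to perform the verification step is to compare the source file's data-row count with the sink's row count (for example, from `SELECT COUNT(*) FROM dbo.emp`). The helper below is a hypothetical sketch, not part of the tutorial:

```python
# Hypothetical verification helper: compare the number of data rows in the
# source file with the number of rows the pipeline landed in the sink table.
def copy_is_complete(source_lines, sink_row_count, has_header=True):
    """Return True when every source data row arrived in the sink."""
    data_rows = len(source_lines) - (1 if has_header else 0)
    return data_rows == sink_row_count

source = ["FirstName,LastName", "John,Doe", "Jane,Doe"]  # assumed file content
print(copy_is_complete(source, sink_row_count=2))  # → True
```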
+
+## Related content
 
-## Next steps
 The pipeline in this sample copies data from Blob storage to a SQL Database. You learned how to:
 
 > [!div class="checklist"]
