In this tutorial, you use the Azure portal to create a data factory. Then you use the Copy Data tool to create a pipeline that copies data from Azure Blob storage to a SQL Database.
* **Azure Storage account**: Use Blob storage as the _source_ data store. If you don't have an Azure Storage account, see the instructions in [Create a storage account](../storage/common/storage-account-create.md).
* **Azure SQL Database**: Use a SQL Database as the _sink_ data store. If you don't have a SQL Database, see the instructions in [Create a SQL Database](/azure/azure-sql/database/single-database-create-quickstart).
### Prepare the SQL database
Allow Azure services to access the logical SQL server of your Azure SQL Database.
1. Verify that the setting **Allow Azure services and resources to access this server** is enabled for the server that's running your SQL Database. This setting lets Data Factory write data to your database instance. To verify and turn on this setting, go to your logical SQL server > **Security** > **Firewalls and virtual networks**, and set the **Allow Azure services and resources to access this server** option to **ON**.
> [!NOTE]
> The option to **Allow Azure services and resources to access this server** enables network access to your SQL Server from any Azure resource, not just those in your subscription. It may not be appropriate for all environments, but is appropriate for this limited tutorial. For more information, see [Azure SQL Server Firewall rules](/azure/azure-sql/database/firewall-configure). Instead, you can use [Private endpoints](../private-link/private-endpoint-overview.md) to connect to Azure PaaS services without using public IPs.
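If you prefer to script this setting instead of using the portal, the same effect can be achieved with the server-level firewall stored procedure (run while connected to the `master` database as a server admin). The `0.0.0.0` range is the documented sentinel for allowing Azure services; the rule name below is the one the portal itself creates, but any name works:

```sql
-- Run against the master database of your logical SQL server.
-- The 0.0.0.0-0.0.0.0 range is the sentinel for
-- "Allow Azure services and resources to access this server".
EXECUTE sp_set_firewall_rule
    @name = N'AllowAllWindowsAzureIps',
    @start_ip_address = '0.0.0.0',
    @end_ip_address = '0.0.0.0';
```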
### Create a blob and a SQL table
Prepare your Blob storage and your SQL Database for the tutorial by performing these steps.
#### Create a sink SQL table
1. Use the following SQL script to create a table named `dbo.emp` in your SQL Database:
```sql
CREATE TABLE dbo.emp
(
    ID int IDENTITY(1,1) NOT NULL,
    FirstName varchar(50),
    LastName varchar(50)
)
GO

CREATE CLUSTERED INDEX IX_emp_ID ON dbo.emp (ID);
```
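If you want to confirm the table was created before continuing, a quick check (an optional query, not part of the original tutorial steps) is to count the rows in the still-empty sink table:

```sql
-- Returns 0 until the pipeline has copied data into the table.
SELECT COUNT(*) AS RowsInEmp
FROM dbo.emp;
```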
## Create a data factory
1. On the left menu, select **Create a resource** > **Integration** > **Data Factory**:
:::image type="content" source="./media/doc-common-process/new-azure-data-factory-menu.png" alt-text="Screenshot of the New data factory creation.":::
1. On the **New data factory** page, under **Name**, enter **ADFTutorialDataFactory**.
The name for your data factory must be _globally unique_. You might receive the following error message:
:::image type="content" source="./media/doc-common-process/name-not-available-error.png" alt-text="Screenshot of the New data factory error message for duplicate name.":::
If you receive an error message about the name value, enter a different name for the data factory. For example, use the name _**yourname**_**ADFTutorialDataFactory**. For the naming rules for Data Factory artifacts, see [Data Factory naming rules](naming-rules.md).
1. After creation is finished, the **Data Factory** home page is displayed.
:::image type="content" source="./media/doc-common-process/data-factory-home-page.png" alt-text="Screenshot of the Home page for the Azure Data Factory, with the Open Azure Data Factory Studio tile.":::
1. To launch the Azure Data Factory user interface (UI) in a separate tab, select **Open** on the **Open Azure Data Factory Studio** tile.
1. On the **Properties** page of the Copy Data tool, choose **Built-in copy task** under **Task type**, then select **Next**.
:::image type="content" source="./media/tutorial-copy-data-tool/copy-data-tool-properties-page.png" alt-text="Screenshot that shows the Properties page.":::
1. On the **Source data store** page, complete the following steps:
f. Select **Next** to move to the next step.
:::image type="content" source="./media/tutorial-copy-data-tool/source-data-store.png" alt-text="Screenshot of the page to Configure the source.":::
1. On the **File format settings** page, select the checkbox for **First row as header**. The tool automatically detects the column and row delimiters, and you can preview data and view the schema of the input data by selecting the **Preview data** button on this page. Then select **Next**.
:::image type="content" source="./media/tutorial-copy-data-tool/file-format-settings-page.png" alt-text="Screenshot of the File format settings.":::
1. On the **Destination data store** page, complete the following steps:
c. On the **New connection (Azure SQL Database)** page, select your Azure subscription, server name, and database name from the dropdown lists. Then select **SQL authentication** under **Authentication type**, and specify the username and password. Test the connection, and then select **Create**.
:::image type="content" source="./media/tutorial-copy-data-tool/config-azure-sql-db.png" alt-text="Screenshot of the Configure Azure SQL Database page.":::
d. Select the newly created linked service as the sink, then select **Next**.
1. On the **Destination data store** page, select **Use existing table** and select the `dbo.emp` table. Then select **Next**.
1. On the **Column mapping** page, notice that the second and third columns in the input file are mapped to the **FirstName** and **LastName** columns of the **emp** table. Adjust the mapping to make sure that there are no errors, and then select **Next**.
:::image type="content" source="./media/tutorial-copy-data-tool/monitor-pipeline.png" alt-text="Screenshot of Monitoring the pipeline.":::
1. On the **Pipeline runs** page, select **Refresh** to refresh the list. Select the link under **Pipeline name** to view activity run details or to rerun the pipeline.
:::image type="content" source="./media/tutorial-copy-data-tool/pipeline-run.png" alt-text="Screenshot of the Pipeline run.":::
1. On the **Activity runs** page, select the **Details** link (the eyeglasses icon) under the **Activity name** column for more details about the copy operation. To go back to the **Pipeline runs** view, select the **All pipeline runs** link in the breadcrumb menu. To refresh the view, select **Refresh**.
:::image type="content" source="./media/tutorial-copy-data-tool/activity-monitoring.png" alt-text="Screenshot of monitoring activity runs.":::
1. Verify that the data is inserted into the `dbo.emp` table in your SQL Database.
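   One way to verify (an optional query, not shown in the original steps) is to run a SELECT against the sink table, for example in the Azure portal query editor:

   ```sql
   -- Each row copied from the source blob should appear here.
   SELECT ID, FirstName, LastName
   FROM dbo.emp;
   ```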
1. Select the **Author** tab on the left to switch to the editor mode. You can update the linked services, datasets, and pipelines that were created via the tool by using the editor. For details on editing these entities in the Data Factory UI, see [the Azure portal version of this tutorial](tutorial-copy-data-portal.md).