articles/data-factory/tutorial-hybrid-copy-portal.md
ms.service: data-factory
ms.workload: data-services
ms.topic: tutorial
ms.custom: seo-lt-2019; seo-dt-2019
ms.date: 03/12/2020
---
# Copy data from an on-premises SQL Server database to Azure Blob storage
1. Under **Name**, enter **TutorialIntegrationRuntime**. Then select **Create**.
1. For **Settings**, select **Click here to launch the express setup for this computer**. This action installs the integration runtime on your machine and registers it with Data Factory. Alternatively, you can use the manual setup option to download the installation file, run it, and use the key to register the integration runtime.
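
   If you later want to script this step instead of using the portal, a self-hosted integration runtime has a small JSON definition. The following is a sketch only (the `description` text is an assumption; the UI creates the resource for you):

   ```json
   {
       "name": "TutorialIntegrationRuntime",
       "properties": {
           "type": "SelfHosted",
           "description": "Self-hosted IR for copying from an on-premises SQL Server database"
       }
   }
   ```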
1. In the **New linked service (SQL Server)** dialog box, confirm that **TutorialIntegrationRuntime** is selected under **Connect via integration runtime**. Then, take the following steps:
a. Under **Name**, enter **SqlServerLinkedService**.
    f. Select **Test connection**. This step confirms that Data Factory can connect to your SQL Server database by using the self-hosted integration runtime you created.
g. To save the linked service, select **Create**.
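
   Behind the scenes, the linked service you just created corresponds to a JSON definition. The following is a rough sketch (the connection-string placeholders are assumptions; the UI generates the exact shape, and the `connectVia` reference is what routes traffic through your self-hosted integration runtime):

   ```json
   {
       "name": "SqlServerLinkedService",
       "properties": {
           "type": "SqlServer",
           "typeProperties": {
               "connectionString": "Server=<your server>;Database=<your database>;User ID=<user>;Password=<password>"
           },
           "connectVia": {
               "referenceName": "TutorialIntegrationRuntime",
               "type": "IntegrationRuntimeReference"
           }
       }
   }
   ```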
1. After the linked service is created, you're back to the **Set properties** page for the SqlServerDataset. Take the following steps:
a. In **Linked service**, confirm that you see **SqlServerLinkedService**.
b. Under **Table name**, select **[dbo].[emp]**.
c. Select **OK**.
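
   These settings produce a source dataset whose JSON definition looks roughly like the following sketch (property names follow the Data Factory SQL Server dataset schema; treat the exact shape as an assumption, since the UI generates it for you):

   ```json
   {
       "name": "SqlServerDataset",
       "properties": {
           "type": "SqlServerTable",
           "linkedServiceName": {
               "referenceName": "SqlServerLinkedService",
               "type": "LinkedServiceReference"
           },
           "typeProperties": {
               "tableName": "[dbo].[emp]"
           }
       }
   }
   ```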
1. Go to the tab with **SQLServerToBlobPipeline**, or select **SQLServerToBlobPipeline** in the tree view.
1. In the **Set Properties** dialog box, enter **AzureBlobDataset** for **Name**. Next to the **Linked service** text box, select **+ New**.
1. In the **New Linked Service (Azure Blob Storage)** dialog box, enter **AzureStorageLinkedService** as the name, and select your storage account from the **Storage account name** list. Test the connection, and then select **Create** to deploy the linked service.
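
   For reference, an Azure Blob storage linked service of this kind is defined by JSON along these lines. This is a sketch only (the connection-string placeholders are assumptions; the UI builds the real value from the storage account you picked):

   ```json
   {
       "name": "AzureStorageLinkedService",
       "properties": {
           "type": "AzureBlobStorage",
           "typeProperties": {
               "connectionString": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>"
           }
       }
   }
   ```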
1. After the linked service is created, you're back to the **Set properties** page. Select **OK**.
1. Open the sink dataset. On the **Connection** tab, take the following steps:
a. In **Linked service**, confirm that **AzureStorageLinkedService** is selected.
1. Go to the tab with the pipeline opened, or select the pipeline in the tree view. In **Sink Dataset**, confirm that **AzureBlobDataset** is selected.
1. To validate the pipeline settings, select **Validate** on the toolbar for the pipeline. To close the **Pipeline validation output**, select the **>>** icon.
1. To publish the entities you created to Data Factory, select **Publish all**.
1. Wait until you see the **Publishing completed** pop-up. To check the status of publishing, select the **Show notifications** link on the top of the window. To close the notification window, select **Close**.
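
Publishing deploys the JSON definitions of the entities you authored. The pipeline itself reduces to a single copy activity that references the two datasets; the following is a rough sketch (the activity name and the `source`/`sink` type values are assumptions based on the Data Factory copy activity schema, not the UI's exact output):

```json
{
    "name": "SQLServerToBlobPipeline",
    "properties": {
        "activities": [
            {
                "name": "CopySqlServerToBlob",
                "type": "Copy",
                "inputs": [
                    { "referenceName": "SqlServerDataset", "type": "DatasetReference" }
                ],
                "outputs": [
                    { "referenceName": "AzureBlobDataset", "type": "DatasetReference" }
                ],
                "typeProperties": {
                    "source": { "type": "SqlServerSource" },
                    "sink": { "type": "BlobSink" }
                }
            }
        ]
    }
}
```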
## Trigger a pipeline run
Select **Add Trigger** on the toolbar for the pipeline, and then select **Trigger Now**.
1. Go to the **Monitor** tab. You see the pipeline that you manually triggered in the previous step.
1. To view activity runs associated with the pipeline run, select the **SQLServerToBlobPipeline** link under *PIPELINE NAME*.
1. On the **Activity runs** page, select the **Details** link (eyeglasses icon) to see details about the copy operation. To go back to the pipeline runs view, select **All pipeline runs** at the top.
## Verify the output
The pipeline automatically creates the output folder named *fromonprem* in the `adftutorial` blob container. Confirm that you see the *[pipeline().RunId].txt* file in the output folder.