
Commit 0a3418f

Refreshed according to UI changes
1 parent b1f0d2f commit 0a3418f

5 files changed: +23 additions, -14 deletions

articles/data-factory/tutorial-hybrid-copy-portal.md

Lines changed: 23 additions & 14 deletions
@@ -10,7 +10,7 @@ ms.service: data-factory
 ms.workload: data-services
 ms.topic: tutorial
 ms.custom: seo-lt-2019; seo-dt-2019
-ms.date: 03/11/2020
+ms.date: 03/12/2020
 ---
 
 # Copy data from an on-premises SQL Server database to Azure Blob storage
@@ -153,14 +153,14 @@ In this step, you create a data factory and start the Data Factory UI to create
 
 1. Under name, enter **TutorialIntegrationRuntime**. Then select **Create**.
 
-1. For Settings, select **Click here to launch the express setup for this computer**.This action installs the integration runtime on your machine and registers it with Data Factory. Alternatively, you can use the manual setup option to download the installation file, run it, and use the key to register the integration runtime.
+1. For Settings, select **Click here to launch the express setup for this computer**. This action installs the integration runtime on your machine and registers it with Data Factory. Alternatively, you can use the manual setup option to download the installation file, run it, and use the key to register the integration runtime.
 ![Integration runtime setup](./media/tutorial-hybrid-copy-portal/intergration-runtime-setup.png)
 
-1. In the **Integration Runtime (Self-hosted) Express Setup** window, select **Close**.
+1. In the **Integration Runtime (Self-hosted) Express Setup** window, select **Close** when the process is finished.
 
 ![Integration runtime (self-hosted) express setup](./media/tutorial-hybrid-copy-portal/integration-runtime-setup-successful.png)
 
-1. In the **New Linked Service** dialog box, confirm that **TutorialIntegrationRuntime** is selected under **Connect via integration runtime**. Then, take the following steps:
+1. In the **New linked service (SQL Server)** dialog box, confirm that **TutorialIntegrationRuntime** is selected under **Connect via integration runtime**. Then, take the following steps:
 
 a. Under **Name**, enter **SqlServerLinkedService**.

@@ -174,13 +174,17 @@ In this step, you create a data factory and start the Data Factory UI to create
 
 f. Select **Test connection**. This step is to confirm that Data Factory can connect to your SQL Server database by using the self-hosted integration runtime you created.
 
-g. To save the linked service, select **Finish**.
+g. To save the linked service, select **Create**.
+
+![New linked service (SQL Server)](./media/tutorial-hybrid-copy-portal/new-sqlserver-linked-service.png)
 
-1. You should be back in the window with the source dataset opened. On the **Connection** tab of the **Properties** window, take the following steps:
+1. After the linked service is created, you're back to the **Set properties** page for the SqlServerDataset. Take the following steps:
 
 a. In **Linked service**, confirm that you see **SqlServerLinkedService**.
 
-b. In **Table**, select **[dbo].[emp]**.
+b. Under **Table name**, select **[dbo].[emp]**.
+
+c. Select **OK**.
 
 1. Go to the tab with **SQLServerToBlobPipeline**, or select **SQLServerToBlobPipeline** in the tree view.

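Editorial aside: the linked service assembled in the steps above can also be expressed as a Data Factory JSON definition. The sketch below is an approximation, not the exact payload the portal generates; only the names (**SqlServerLinkedService**, **TutorialIntegrationRuntime**) come from the tutorial, and the connection string is a placeholder:

```json
{
    "name": "SqlServerLinkedService",
    "properties": {
        "type": "SqlServer",
        "typeProperties": {
            "connectionString": {
                "type": "SecureString",
                "value": "Server=<server>;Database=<database>;User ID=<user>;Password=<password>"
            }
        },
        "connectVia": {
            "referenceName": "TutorialIntegrationRuntime",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```

The `connectVia` block is what routes traffic through the self-hosted integration runtime instead of the default Azure runtime.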
@@ -194,10 +194,11 @@ In this step, you create a data factory and start the Data Factory UI to create
 
 1. In the **Set Properties** dialog box, enter **AzureBlobDataset** for Name. Next to the **Linked service** text box, select **+ New**.
 
-1. In the **New Linked Service (Azure Blob Storage)** dialog box, enter **AzureStorageLinkedService** as name, select your storage account from the **Storage account** name list. Test connection, and then select **Finish** to deploy the linked service.
-1. After the linked service is created, you're back to the **Set properties** page. Select **Continue**.
+1. In the **New Linked Service (Azure Blob Storage)** dialog box, enter **AzureStorageLinkedService** as name, select your storage account from the **Storage account** name list. Test connection, and then select **Create** to deploy the linked service.
 
-1. You should be back in the window with the sink dataset open. On the **Connection** tab, take the following steps:
+1. After the linked service is created, you're back to the **Set properties** page. Select **OK**.
+
+1. Open the sink dataset. On the **Connection** tab, take the following steps:
 
 a. In **Linked service**, confirm that **AzureStorageLinkedService** is selected.

@@ -210,11 +215,13 @@ In this step, you create a data factory and start the Data Factory UI to create
 
 1. Go to the tab with the pipeline opened, or select the pipeline in the tree view. In **Sink Dataset**, confirm that **AzureBlobDataset** is selected.
 
-1. To validate the pipeline settings, select **Validate** on the toolbar for the pipeline. To close the **Pipe Validation Report**, select **Close**.
+1. To validate the pipeline settings, select **Validate** on the toolbar for the pipeline. To close the **Pipe validation output**, select the **>>** icon.
+![validate pipeline](./media/tutorial-hybrid-copy-portal/validate-pipeline.png)
+
 
-1. To publish entities you created to Data Factory, select **Publish All**.
+1. To publish entities you created to Data Factory, select **Publish all**.
 
-1. Wait until you see the **Publishing succeeded** pop-up. To check the status of publishing, select the **Show Notifications** link on the top of the window. To close the notification window, select **Close**.
+1. Wait until you see the **Publishing completed** pop-up. To check the status of publishing, select the **Show notifications** link on the top of the window. To close the notification window, select **Close**.
 
 
 ## Trigger a pipeline run
@@ -224,8 +231,10 @@ Select **Add Trigger** on the toolbar for the pipeline, and then select **Trigge
 
 1. Go to the **Monitor** tab. You see the pipeline that you manually triggered in the previous step.
 
+1. To view activity runs associated with the pipeline run, select the **SQLServerToBlobPipeline** link under *PIPELINE NAME*.
 ![Monitor pipeline runs](./media/tutorial-hybrid-copy-portal/pipeline-runs.png)
-1. To view activity runs associated with the pipeline run, select the **View Activity Runs** link in the **Actions** column. You see only activity runs because there's only one activity in the pipeline. To see details about the copy operation, select the **Details** link (eyeglasses icon) in the **Actions** column. To go back to the Pipeline Runs view, select **Pipeline Runs** at the top.
+
+1. On the **Activity runs** page, select the Details (eyeglasses image) link to see details about the copy operation. To go back to the Pipeline Runs view, select **All pipeline runs** at the top.
 
 ## Verify the output
 The pipeline automatically creates the output folder named *fromonprem* in the `adftutorial` blob container. Confirm that you see the *[pipeline().RunId].txt* file in the output folder.
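Editorial aside: the pipeline this tutorial assembles can be inspected as JSON in the Data Factory UI (the Code button in the pipeline editor). The sketch below is an approximation of that definition; the activity name is invented here, while the pipeline and dataset names come from the tutorial:

```json
{
    "name": "SQLServerToBlobPipeline",
    "properties": {
        "activities": [
            {
                "name": "CopySqlServerToBlob",
                "type": "Copy",
                "inputs": [
                    { "referenceName": "SqlServerDataset", "type": "DatasetReference" }
                ],
                "outputs": [
                    { "referenceName": "AzureBlobDataset", "type": "DatasetReference" }
                ],
                "typeProperties": {
                    "source": { "type": "SqlServerSource" },
                    "sink": { "type": "BlobSink" }
                }
            }
        ]
    }
}
```

The single Copy activity reads through the source dataset (and its self-hosted integration runtime) and writes the *[pipeline().RunId].txt* file into the *fromonprem* folder, which is why the monitoring view shows only one activity run per pipeline run.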

Comments (0)