
Commit 1ee616d

Merge pull request #107455 from Samantha-Yu/adfupdate0311-3
Refreshed according to UI changes
2 parents 6043e74 + 0a3418f commit 1ee616d

File tree: 9 files changed (+32 −23 lines)

articles/data-factory/tutorial-hybrid-copy-portal.md

Lines changed: 32 additions & 23 deletions
```diff
@@ -10,7 +10,7 @@ ms.service: data-factory
 ms.workload: data-services
 ms.topic: tutorial
 ms.custom: seo-lt-2019; seo-dt-2019
-ms.date: 01/11/2018
+ms.date: 03/12/2020
 ---
 
 # Copy data from an on-premises SQL Server database to Azure Blob storage
```
```diff
@@ -86,17 +86,17 @@ You use the name and key of your storage account in this tutorial. To get the na
 #### Create the adftutorial container
 In this section, you create a blob container named **adftutorial** in your Blob storage.
 
-1. In the **Storage account** window, go to **Overview**, and then select **Blobs**.
+1. In the **Storage account** window, go to **Overview**, and then select **Containers**.
 
    ![Select Blobs option](media/tutorial-hybrid-copy-powershell/select-blobs.png)
 
-1. In the **Blob service** window, select **Container**.
+1. In the **Containers** window, select **+ Container** to create a new one.
 
-1. In the **New container** window, under **Name**, enter **adftutorial**. Then select **OK**.
+1. In the **New container** window, under **Name**, enter **adftutorial**. Then select **Create**.
 
-1. In the list of containers, select **adftutorial**.
+1. In the list of containers, select the **adftutorial** container that you just created.
 
-1. Keep the **container** window for **adftutorial** open. You use it verify the output at the end of the tutorial. Data Factory automatically creates the output folder in this container, so you don't need to create one.
+1. Keep the **container** window for **adftutorial** open. You use it to verify the output at the end of the tutorial. Data Factory automatically creates the output folder in this container, so you don't need to create one.
 
 ## Create a data factory
 In this step, you create a data factory and start the Data Factory UI to create a pipeline in the data factory.
```
```diff
@@ -138,29 +138,29 @@ In this step, you create a data factory and start the Data Factory UI to create
 
 1. On the **General** tab at the bottom of the **Properties** window, for **Name**, enter **SQLServerToBlobPipeline**.
 
-   ![Pipeline name](./media/tutorial-hybrid-copy-portal/pipeline-name.png)
-
 1. In the **Activities** tool box, expand **Move & Transform**. Drag and drop the **Copy** activity to the pipeline design surface. Set the name of the activity to **CopySqlServerToAzureBlobActivity**.
 
 1. In the **Properties** window, go to the **Source** tab, and select **+ New**.
 
 1. In the **New Dataset** dialog box, search for **SQL Server**. Select **SQL Server**, and then select **Continue**.
+   ![New SqlServer dataset](./media/tutorial-hybrid-copy-portal/create-sqlserver-dataset.png)
 
 1. In the **Set Properties** dialog box, under **Name**, enter **SqlServerDataset**. Under **Linked service**, select **+ New**. You create a connection to the source data store (SQL Server database) in this step.
 
 1. In the **New Linked Service** dialog box, add **Name** as **SqlServerLinkedService**. Under **Connect via integration runtime**, select **+New**. In this section, you create a self-hosted integration runtime and associate it with an on-premises machine with the SQL Server database. The self-hosted integration runtime is the component that copies data from the SQL Server database on your machine to Blob storage.
 
-1. In the **Integration Runtime Setup** dialog box, select **Self-Hosted**, and then select **Next**.
+1. In the **Integration Runtime Setup** dialog box, select **Self-Hosted**, and then select **Continue**.
 
-1. Under name, enter **TutorialIntegrationRuntime**. Then select **Next**.
+1. Under **Name**, enter **TutorialIntegrationRuntime**. Then select **Create**.
 
-1. For Settings, select **Click here to launch the express setup for this computer**.This action installs the integration runtime on your machine and registers it with Data Factory. Alternatively, you can use the manual setup option to download the installation file, run it, and use the key to register the integration runtime.
+1. For Settings, select **Click here to launch the express setup for this computer**. This action installs the integration runtime on your machine and registers it with Data Factory. Alternatively, you can use the manual setup option to download the installation file, run it, and use the key to register the integration runtime.
+   ![Integration runtime setup](./media/tutorial-hybrid-copy-portal/intergration-runtime-setup.png)
 
-1. In the **Integration Runtime (Self-hosted) Express Setup** window, select **Close**.
+1. In the **Integration Runtime (Self-hosted) Express Setup** window, select **Close** when the process is finished.
 
    ![Integration runtime (self-hosted) express setup](./media/tutorial-hybrid-copy-portal/integration-runtime-setup-successful.png)
 
-1. In the **New Linked Service** dialog box, confirm that **TutorialIntegrationRuntime** is selected under **Connect via integration runtime**. Then, take the following steps:
+1. In the **New linked service (SQL Server)** dialog box, confirm that **TutorialIntegrationRuntime** is selected under **Connect via integration runtime**. Then, take the following steps:
 
    a. Under **Name**, enter **SqlServerLinkedService**.
 
```
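The linked service that these portal steps assemble is stored as a JSON definition. As a rough sketch (following the public Data Factory JSON format; the exact payload the portal generates may differ, and the connection string values below are placeholders, not real credentials):

```python
import json

# Approximate definition of the SqlServerLinkedService created above.
sql_server_linked_service = {
    "name": "SqlServerLinkedService",
    "properties": {
        "type": "SqlServer",
        # Route copy traffic through the self-hosted integration runtime.
        "connectVia": {
            "referenceName": "TutorialIntegrationRuntime",
            "type": "IntegrationRuntimeReference",
        },
        "typeProperties": {
            # Placeholder connection string; supply your own server,
            # database, and credentials.
            "connectionString": "Server=myserver;Database=mydb;User ID=myuser;",
        },
    },
}

print(json.dumps(sql_server_linked_service, indent=2))
```

The `connectVia` reference is what ties the linked service to the on-premises machine; without it, Data Factory would try to reach the SQL Server database directly from the cloud.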
```diff
@@ -174,13 +174,17 @@ In this step, you create a data factory and start the Data Factory UI to create
 
 f. Select **Test connection**. This step is to confirm that Data Factory can connect to your SQL Server database by using the self-hosted integration runtime you created.
 
-g. To save the linked service, select **Finish**.
+g. To save the linked service, select **Create**.
+
+   ![New linked service (SQL Server)](./media/tutorial-hybrid-copy-portal/new-sqlserver-linked-service.png)
 
-1. You should be back in the window with the source dataset opened. On the **Connection** tab of the **Properties** window, take the following steps:
+1. After the linked service is created, you're back on the **Set properties** page for **SqlServerDataset**. Take the following steps:
 
 a. In **Linked service**, confirm that you see **SqlServerLinkedService**.
 
-b. In **Table**, select **[dbo].[emp]**.
+b. Under **Table name**, select **[dbo].[emp]**.
+
+c. Select **OK**.
 
 1. Go to the tab with **SQLServerToBlobPipeline**, or select **SQLServerToBlobPipeline** in the tree view.
 
```
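The source dataset configured here corresponds to roughly this JSON definition (a sketch using the public Data Factory dataset format; property names in the portal's actual payload may differ):

```python
import json

# Approximate definition of the SqlServerDataset created above,
# pointing at the [dbo].[emp] table through SqlServerLinkedService.
sql_server_dataset = {
    "name": "SqlServerDataset",
    "properties": {
        "type": "SqlServerTable",
        "linkedServiceName": {
            "referenceName": "SqlServerLinkedService",
            "type": "LinkedServiceReference",
        },
        "typeProperties": {"tableName": "[dbo].[emp]"},
    },
}

print(json.dumps(sql_server_dataset, indent=2))
```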
```diff
@@ -194,10 +198,11 @@ In this step, you create a data factory and start the Data Factory UI to create
 
 1. In the **Set Properties** dialog box, enter **AzureBlobDataset** for Name. Next to the **Linked service** text box, select **+ New**.
 
-1. In the **New Linked Service (Azure Blob Storage)** dialog box, enter **AzureStorageLinkedService** as name, select your storage account from the **Storage account** name list. Test connection, and then select **Finish** to deploy the linked service.
-1. After the linked service is created, you're back to the **Set properties** page. Select **Continue**.
+1. In the **New Linked Service (Azure Blob Storage)** dialog box, enter **AzureStorageLinkedService** as the name, and select your storage account from the **Storage account name** list. Test the connection, and then select **Create** to deploy the linked service.
+
+1. After the linked service is created, you're back on the **Set properties** page. Select **OK**.
 
-1. You should be back in the window with the sink dataset open. On the **Connection** tab, take the following steps:
+1. Open the sink dataset. On the **Connection** tab, take the following steps:
 
 a. In **Linked service**, confirm that **AzureStorageLinkedService** is selected.
 
```
```diff
@@ -210,11 +215,13 @@ In this step, you create a data factory and start the Data Factory UI to create
 
 1. Go to the tab with the pipeline opened, or select the pipeline in the tree view. In **Sink Dataset**, confirm that **AzureBlobDataset** is selected.
 
-1. To validate the pipeline settings, select **Validate** on the toolbar for the pipeline. To close the **Pipe Validation Report**, select **Close**.
+1. To validate the pipeline settings, select **Validate** on the toolbar for the pipeline. To close the **Pipeline validation output**, select the **>>** icon.
+   ![validate pipeline](./media/tutorial-hybrid-copy-portal/validate-pipeline.png)
+
 
-1. To publish entities you created to Data Factory, select **Publish All**.
+1. To publish entities you created to Data Factory, select **Publish all**.
 
-1. Wait until you see the **Publishing succeeded** pop-up. To check the status of publishing, select the **Show Notifications** link on the top of the window. To close the notification window, select **Close**.
+1. Wait until you see the **Publishing completed** pop-up. To check the status of publishing, select the **Show notifications** link on the top of the window. To close the notification window, select **Close**.
 
 
 ## Trigger a pipeline run
```
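The pipeline validated and published above wires the source and sink datasets into a single Copy activity. A rough sketch of the resulting pipeline JSON (using the public Data Factory pipeline format; the portal's exact payload, including the source/sink type names, may differ):

```python
import json

# Approximate definition of the SQLServerToBlobPipeline published above:
# one Copy activity reading from SqlServerDataset and writing to
# AzureBlobDataset.
pipeline = {
    "name": "SQLServerToBlobPipeline",
    "properties": {
        "activities": [
            {
                "name": "CopySqlServerToAzureBlobActivity",
                "type": "Copy",
                "inputs": [
                    {"referenceName": "SqlServerDataset", "type": "DatasetReference"}
                ],
                "outputs": [
                    {"referenceName": "AzureBlobDataset", "type": "DatasetReference"}
                ],
                "typeProperties": {
                    "source": {"type": "SqlSource"},
                    "sink": {"type": "BlobSink"},
                },
            }
        ]
    },
}

print(json.dumps(pipeline, indent=2))
```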
```diff
@@ -224,8 +231,10 @@ Select **Add Trigger** on the toolbar for the pipeline, and then select **Trigge
 
 1. Go to the **Monitor** tab. You see the pipeline that you manually triggered in the previous step.
 
+1. To view activity runs associated with the pipeline run, select the **SQLServerToBlobPipeline** link under *PIPELINE NAME*.
 ![Monitor pipeline runs](./media/tutorial-hybrid-copy-portal/pipeline-runs.png)
-1. To view activity runs associated with the pipeline run, select the **View Activity Runs** link in the **Actions** column. You see only activity runs because there's only one activity in the pipeline. To see details about the copy operation, select the **Details** link (eyeglasses icon) in the **Actions** column. To go back to the Pipeline Runs view, select **Pipeline Runs** at the top.
+
+1. On the **Activity runs** page, select the **Details** (eyeglasses icon) link to see details about the copy operation. To go back to the Pipeline Runs view, select **All pipeline runs** at the top.
 
 ## Verify the output
 The pipeline automatically creates the output folder named *fromonprem* in the `adftutorial` blob container. Confirm that you see the *[pipeline().RunId].txt* file in the output folder.
```
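The naming convention described in that last paragraph can be sketched as a small, hypothetical helper (the function name and signature are illustrative, not part of Data Factory):

```python
# Hypothetical helper mirroring the sink naming described above: each
# pipeline run writes one file into the "fromonprem" folder, named
# after the run ID.
def expected_output_blob(run_id: str, folder: str = "fromonprem") -> str:
    """Return the blob path a given pipeline run is expected to produce."""
    return f"{folder}/{run_id}.txt"

print(expected_output_blob("example-run-id"))  # -> fromonprem/example-run-id.txt
```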
