
Commit 28c078a

Addressed feedbacks & updated UI changes
1 parent 41c0054 commit 28c078a

16 files changed (+31, -40 lines changed)

articles/data-factory/load-azure-sql-data-warehouse.md

Lines changed: 31 additions & 40 deletions
@@ -10,7 +10,7 @@ ms.service: data-factory
ms.workload: data-services
ms.topic: conceptual
ms.custom: seo-lt-2019
ms.date: 04/16/2020
---

# Load data into Azure SQL Data Warehouse by using Azure Data Factory
@@ -42,13 +42,9 @@ This article shows you how to use the Data Factory Copy Data tool to _load data

1. On the left menu, select **Create a resource** > **Data + Analytics** > **Data Factory**:

2. On the **New data factory** page, provide values for the following items:

* **Name**: Enter *LoadSQLDWDemo* for the name. The name for your data factory must be *globally unique*. If you receive the error "Data factory name 'LoadSQLDWDemo' is not available", enter a different name for the data factory. For example, you could use the name _**yourname**_**ADFTutorialDataFactory**. Try creating the data factory again. For the naming rules for Data Factory artifacts, see [Data Factory naming rules](naming-rules.md).
* **Subscription**: Select your Azure subscription in which to create the data factory.
* **Resource Group**: Select an existing resource group from the drop-down list, or select the **Create new** option and enter the name of a resource group. To learn about resource groups, see [Using resource groups to manage your Azure resources](../azure-resource-manager/management/overview.md).
* **Version**: Select **V2**.
@@ -57,100 +53,95 @@ This article shows you how to use the Data Factory Copy Data tool to _load data
3. Select **Create**.
4. After creation is complete, go to your data factory. You see the **Data Factory** home page as shown in the following image:

![Data factory home page](./media/doc-common-process/data-factory-home-page.png)

Select the **Author & Monitor** tile to launch the Data Integration Application in a separate tab.

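If you prefer to script this setup instead of using the portal, the data factory can also be created with the Azure Data Factory Python SDK (`azure-mgmt-datafactory`). The following is a minimal sketch, not part of the tutorial itself: the subscription ID, resource group, and region are placeholders, and it assumes your environment is already authenticated for `DefaultAzureCredential`.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import Factory

# Placeholders -- substitute your own subscription and resource group.
subscription_id = "<subscription-id>"
resource_group = "<resource-group>"
factory_name = "LoadSQLDWDemo"  # must be globally unique

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# Create (or update) the V2 data factory in your chosen region.
factory = adf_client.factories.create_or_update(
    resource_group, factory_name, Factory(location="East US")
)
print(factory.provisioning_state)
```

The later sketches in this article reuse the `adf_client`, `resource_group`, and `factory_name` variables from this block.
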
## Load data into Azure SQL Data Warehouse

1. In the **Get started** page, select the **Copy Data** tile to launch the Copy Data tool.

1. In the **Properties** page, specify **CopyFromSQLToSQLDW** for the **Task name** field, and select **Next**.

![Properties page](./media/load-azure-sql-data-warehouse/copy-data-tool-properties-page.png)

1. In the **Source data store** page, complete the following steps:
>[!TIP]
>In this tutorial, you use *SQL authentication* as the authentication type for your source data store, but you can choose other supported authentication methods: *Service Principal* and *Managed Identity*, if needed. Refer to the corresponding sections in [this article](https://docs.microsoft.com/azure/data-factory/connector-azure-sql-database#linked-service-properties) for details.
>To store secrets for data stores securely, it's also recommended to use an Azure Key Vault. Refer to [this article](https://docs.microsoft.com/azure/data-factory/store-credentials-in-key-vault) for detailed illustrations.

a. Click **+ Create new connection**.

b. Select **Azure SQL Database** from the gallery, and select **Continue**. You can type "SQL" in the search box to filter the connectors.

![Select Azure SQL DB](./media/load-azure-sql-data-warehouse/select-azure-sql-db-source.png)

c. In the **New Linked Service** page, select your server name and DB name from the dropdown list, and specify the username and password. Click **Test connection** to validate the settings, then select **Create**.

![Configure Azure SQL DB](./media/load-azure-sql-data-warehouse/configure-azure-sql-db.png)

d. Select the newly created linked service as source, then click **Next**.

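For reference, the source connection configured in steps a through d corresponds to an Azure SQL Database linked service. A minimal sketch using the Python SDK, assuming the `adf_client`, `resource_group`, and `factory_name` variables from the earlier snippet; the connection string values and the linked service name `AzureSqlDatabaseSource` are placeholders.

```python
from azure.mgmt.datafactory.models import (
    AzureSqlDatabaseLinkedService,
    LinkedServiceResource,
    SecureString,
)

# SQL authentication via a connection string; all values below are placeholders.
source_ls = AzureSqlDatabaseLinkedService(
    connection_string=SecureString(
        value="Server=tcp:<server>.database.windows.net,1433;"
              "Database=<database>;User ID=<user>;Password=<password>;"
    )
)

adf_client.linked_services.create_or_update(
    resource_group, factory_name, "AzureSqlDatabaseSource",
    LinkedServiceResource(properties=source_ls),
)
```
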
1. In the **Select tables from which to copy the data or use a custom query** page, enter **SalesLT** to filter the tables. Choose the **(Select all)** box to use all of the tables for the copy, and then select **Next**.

![Select source tables](./media/load-azure-sql-data-warehouse/select-source-tables.png)

1. In the **Apply filter** page, specify your settings or select **Next**.

1. In the **Destination data store** page, complete the following steps:
>[!TIP]
>In this tutorial, you use *SQL authentication* as the authentication type for your destination data store, but you can choose other supported authentication methods: *Service Principal* and *Managed Identity*, if needed. Refer to the corresponding sections in [this article](https://docs.microsoft.com/azure/data-factory/connector-azure-sql-data-warehouse#linked-service-properties) for details.
>To store secrets for data stores securely, it's also recommended to use an Azure Key Vault. Refer to [this article](https://docs.microsoft.com/azure/data-factory/store-credentials-in-key-vault) for detailed illustrations.

a. Click **+ Create new connection** to add a connection.

b. Select **Azure Synapse Analytics (formerly SQL DW)** from the gallery, and select **Continue**. You can type "SQL" in the search box to filter the connectors.

![Select Azure SQL DW](./media/load-azure-sql-data-warehouse/select-azure-sql-dw-sink.png)

c. In the **New Linked Service** page, select your server name and DB name from the dropdown list, and specify the username and password. Click **Test connection** to validate the settings, then select **Create**.

![Configure Azure SQL DW](./media/load-azure-sql-data-warehouse/configure-azure-sql-dw.png)

d. Select the newly created linked service as sink, then click **Next**.

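The destination connection follows the same pattern as the source; only the linked service model class changes. A sketch under the same assumptions, with a placeholder connection string and the illustrative name `AzureSqlDWSink`.

```python
from azure.mgmt.datafactory.models import (
    AzureSqlDWLinkedService,
    LinkedServiceResource,
    SecureString,
)

# Azure Synapse Analytics (formerly SQL DW) sink, again using SQL authentication.
sink_ls = AzureSqlDWLinkedService(
    connection_string=SecureString(
        value="Server=tcp:<server>.database.windows.net,1433;"
              "Database=<sql-pool>;User ID=<user>;Password=<password>;"
    )
)

adf_client.linked_services.create_or_update(
    resource_group, factory_name, "AzureSqlDWSink",
    LinkedServiceResource(properties=sink_ls),
)
```
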
1. In the **Table mapping** page, review the content, and select **Next**. An intelligent table mapping displays. The source tables are mapped to the destination tables based on the table names. If a source table doesn't exist in the destination, Azure Data Factory creates a destination table with the same name by default. You can also map a source table to an existing destination table.

> [!NOTE]
> Automatic table creation for the SQL Data Warehouse sink applies when SQL Server or Azure SQL Database is the source. If you copy data from another source data store, you need to pre-create the schema in the sink Azure SQL Data Warehouse before executing the data copy.

![Table mapping page](./media/load-azure-sql-data-warehouse/table-mapping.png)

1. In the **Column mapping** page, review the content, and select **Next**. The intelligent table mapping is based on the column name. If you let Data Factory automatically create the tables, data type conversion can occur when there are incompatibilities between the source and destination stores. If there's an unsupported data type conversion between the source and destination column, you see an error message next to the corresponding table.

![Column mapping page](./media/load-azure-sql-data-warehouse/schema-mapping.png)

1. In the **Settings** page, complete the following steps:

a. In the **Staging settings** section, click **+ New** to create a staging storage. The storage is used for staging the data before it loads into SQL Data Warehouse by using PolyBase. After the copy is complete, the interim data in Azure Blob Storage is automatically cleaned up.

b. In the **New Linked Service** page, select your storage account, and select **Create** to deploy the linked service.

c. In the **Advanced settings** section, deselect the **Use type default** option, then select **Next**.

![Configure PolyBase](./media/load-azure-sql-data-warehouse/configure-polybase.png)

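Behind the scenes, the staged PolyBase load that these settings enable corresponds roughly to a copy activity with staging turned on and PolyBase allowed on the sink. The sketch below is illustrative only: the dataset names and the staging linked service name are placeholders, and older SDK versions may not require (or accept) the explicit `type=` arguments on the reference objects.

```python
from azure.mgmt.datafactory.models import (
    CopyActivity,
    DatasetReference,
    LinkedServiceReference,
    PipelineResource,
    SqlDWSink,
    SqlSource,
    StagingSettings,
)

copy_activity = CopyActivity(
    name="CopyFromSQLToSQLDW",
    # Placeholder dataset names for the source and sink tables.
    inputs=[DatasetReference(type="DatasetReference", reference_name="SourceSqlDataset")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="SinkSqlDwDataset")],
    source=SqlSource(),
    # Load into the warehouse with PolyBase.
    sink=SqlDWSink(allow_poly_base=True),
    # Stage the data through Blob storage before the PolyBase load.
    enable_staging=True,
    staging_settings=StagingSettings(
        linked_service_name=LinkedServiceReference(
            type="LinkedServiceReference", reference_name="StagingBlobStorage"
        ),
        path="adfstaging",
    ),
)

adf_client.pipelines.create_or_update(
    resource_group, factory_name, "CopyFromSQLToSQLDW",
    PipelineResource(activities=[copy_activity]),
)
```

Staging can also be enabled without PolyBase, in which case the copy falls back to a regular bulk insert, which is typically slower for large volumes.
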

1. In the **Summary** page, review the settings, and select **Next**.

![Summary page](./media/load-azure-sql-data-warehouse/summary-page.png)

1. In the **Deployment** page, select **Monitor** to monitor the pipeline (task).

1. Notice that the **Monitor** tab on the left is automatically selected. When the pipeline run completes successfully, select the **CopyFromSQLToSQLDW** link under the **PIPELINE NAME** column to view activity run details and to rerun the pipeline.

[![Monitor pipeline runs](./media/load-azure-sql-data-warehouse/pipeline-monitoring.png)](./media/load-azure-sql-data-warehouse/pipeline-monitoring.png#lightbox)

1. To switch back to the pipeline runs view, select the **All pipeline runs** link at the top. Select **Refresh** to refresh the list.

![Monitor activity runs](./media/load-azure-sql-data-warehouse/activity-monitoring.png)

1. To monitor the execution details for each copy activity, select the **Details** link (eyeglasses icon) under **ACTIVITY NAME** in the activity runs view. You can monitor details like the volume of data copied from the source to the sink, data throughput, execution steps with corresponding duration, and used configurations.

![Monitor activity run details](./media/load-azure-sql-data-warehouse/monitor-activity-run-details1.png)

![Monitor activity run details](./media/load-azure-sql-data-warehouse/monitor-activity-run-details2.png)

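The run and activity details shown in the Monitor tab can also be queried programmatically. A sketch, assuming the `adf_client`, `resource_group`, and `factory_name` variables from the earlier snippets and the `CopyFromSQLToSQLDW` pipeline created above.

```python
from datetime import datetime, timedelta

from azure.mgmt.datafactory.models import RunFilterParameters

# Trigger the pipeline and check the overall run status.
run = adf_client.pipelines.create_run(resource_group, factory_name, "CopyFromSQLToSQLDW")
pipeline_run = adf_client.pipeline_runs.get(resource_group, factory_name, run.run_id)
print(pipeline_run.status)

# Query the per-activity runs -- the same details the eyeglasses icon opens in the UI.
filters = RunFilterParameters(
    last_updated_after=datetime.utcnow() - timedelta(days=1),
    last_updated_before=datetime.utcnow() + timedelta(days=1),
)
activity_runs = adf_client.activity_runs.query_by_pipeline_run(
    resource_group, factory_name, run.run_id, filters
)
for activity_run in activity_runs.value:
    print(activity_run.activity_name, activity_run.status, activity_run.duration_in_ms)
```
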
## Next steps

