articles/data-factory/connector-azure-sql-data-warehouse.md (7 additions, 8 deletions)
@@ -7,7 +7,7 @@ author: jianleishen
ms.subservice: data-movement
ms.custom: synapse
ms.topic: conceptual
-ms.date: 06/17/2024
+ms.date: 07/23/2024
---
# Copy and transform data in Azure Synapse Analytics by using Azure Data Factory or Synapse pipelines
@@ -663,16 +663,16 @@ Using [COPY statement](/sql/t-sql/statements/copy-into-transact-sql) is a simple
### Direct copy by using COPY statement
-Azure Synapse Analytics COPY statement directly supports Azure Blob, Azure Data Lake Storage Gen1 and Azure Data Lake Storage Gen2. If your source data meets the criteria described in this section, use COPY statement to copy directly from the source data store to Azure Synapse Analytics. Otherwise, use [Staged copy by using COPY statement](#staged-copy-by-using-copy-statement). The service checks the settings and fails the copy activity run if the criteria is not met.
+Azure Synapse Analytics COPY statement directly supports Azure Blob and Azure Data Lake Storage Gen2. If your source data meets the criteria described in this section, use COPY statement to copy directly from the source data store to Azure Synapse Analytics. Otherwise, use [Staged copy by using COPY statement](#staged-copy-by-using-copy-statement). The service checks the settings and fails the copy activity run if the criteria is not met.
1. The **source linked service and format** use one of the following types and authentication methods:
| Supported source data store type | Supported format | Supported source authentication type |
-| [Azure Data Lake Storage Gen2](connector-azure-data-lake-storage.md) | [Delimited text](format-delimited-text.md)<br/>[Parquet](format-parquet.md)<br/>[ORC](format-orc.md) | Account key authentication, service principal authentication, shared access signature authentication, system-assigned managed identity authentication |
+| [Azure Data Lake Storage Gen2](connector-azure-data-lake-storage.md) | [Delimited text](format-delimited-text.md)<br/>[Parquet](format-parquet.md)<br/>[ORC](format-orc.md) | Account key authentication, service principal authentication (using ServicePrincipalKey), shared access signature authentication, system-assigned managed identity authentication |
>[!IMPORTANT]
>- When you use managed identity authentication for your storage linked service, see the required configurations for [Azure Blob](connector-azure-blob-storage.md#managed-identity) and [Azure Data Lake Storage Gen2](connector-azure-data-lake-storage.md#managed-identity), respectively.
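For orientation, here is a minimal sketch of a copy activity sink that turns on the COPY statement path described above. It assumes Parquet source data in a store that meets the criteria; the activity and dataset names (`CopyToSynapseViaCopyStatement`, `ParquetSourceDataset`, `AzureSynapseSinkDataset`) are illustrative placeholders, not names from the article's samples.

```json
{
    "name": "CopyToSynapseViaCopyStatement",
    "type": "Copy",
    "inputs": [ { "referenceName": "ParquetSourceDataset", "type": "DatasetReference" } ],
    "outputs": [ { "referenceName": "AzureSynapseSinkDataset", "type": "DatasetReference" } ],
    "typeProperties": {
        "source": { "type": "ParquetSource" },
        "sink": {
            "type": "SqlDWSink",
            "allowCopyCommand": true
        }
    }
}
```

With `allowCopyCommand` set to true, the service generates and runs the COPY statement against Azure Synapse Analytics on your behalf; if the source doesn't meet the direct-copy criteria, the staged variant adds interim storage settings instead.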
@@ -801,7 +801,7 @@ To use this feature, create an [Azure Blob Storage linked service](connector-azu
Using [PolyBase](/sql/relational-databases/polybase/polybase-guide) is an efficient way to load a large amount of data into Azure Synapse Analytics with high throughput. You'll see a large gain in throughput by using PolyBase instead of the default BULKINSERT mechanism.
-- If your source data is in **Azure Blob, Azure Data Lake Storage Gen1 or Azure Data Lake Storage Gen2**, and the **format is PolyBase compatible**, you can use copy activity to directly invoke PolyBase to let Azure Synapse Analytics pull the data from source. For details, see **[Direct copy by using PolyBase](#direct-copy-by-using-polybase)**.
+- If your source data is in **Azure Blob or Azure Data Lake Storage Gen2**, and the **format is PolyBase compatible**, you can use copy activity to directly invoke PolyBase to let Azure Synapse Analytics pull the data from source. For details, see **[Direct copy by using PolyBase](#direct-copy-by-using-polybase)**.
- If your source data store and format aren't originally supported by PolyBase, use the **[Staged copy by using PolyBase](#staged-copy-by-using-polybase)** feature instead. The staged copy feature also provides better throughput. It automatically converts the data into a PolyBase-compatible format, stores the data in Azure Blob storage, then calls PolyBase to load the data into Azure Synapse Analytics, as sketched below.
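To make the staged option above concrete, the following sketch shows a copy activity that enables PolyBase and, because the source in this hypothetical case is a SQL Server table that PolyBase can't read directly, also turns on staging through an Azure Blob Storage linked service. All names here (`SqlServerSourceDataset`, `AzureSynapseSinkDataset`, `MyStagingBlobStorage`, the staging path) are assumptions for illustration only.

```json
{
    "name": "CopyToSynapseViaStagedPolyBase",
    "type": "Copy",
    "inputs": [ { "referenceName": "SqlServerSourceDataset", "type": "DatasetReference" } ],
    "outputs": [ { "referenceName": "AzureSynapseSinkDataset", "type": "DatasetReference" } ],
    "typeProperties": {
        "source": { "type": "SqlSource" },
        "sink": {
            "type": "SqlDWSink",
            "allowPolyBase": true
        },
        "enableStaging": true,
        "stagingSettings": {
            "linkedServiceName": { "referenceName": "MyStagingBlobStorage", "type": "LinkedServiceReference" },
            "path": "staging-container/synapse-load"
        }
    }
}
```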
> [!TIP]
@@ -818,7 +818,7 @@ The following PolyBase settings are supported under `polyBaseSettings` in copy a
### Direct copy by using PolyBase
-Azure Synapse Analytics PolyBase directly supports Azure Blob, Azure Data Lake Storage Gen1 and Azure Data Lake Storage Gen2. If your source data meets the criteria described in this section, use PolyBase to copy directly from the source data store to Azure Synapse Analytics. Otherwise, use [Staged copy by using PolyBase](#staged-copy-by-using-polybase).
+Azure Synapse Analytics PolyBase directly supports Azure Blob and Azure Data Lake Storage Gen2. If your source data meets the criteria described in this section, use PolyBase to copy directly from the source data store to Azure Synapse Analytics. Otherwise, use [Staged copy by using PolyBase](#staged-copy-by-using-polybase).
> [!TIP]
> To copy data efficiently to Azure Synapse Analytics, learn more from [Azure Data Factory makes it even easier and convenient to uncover insights from data when using Data Lake Store with Azure Synapse Analytics](/archive/blogs/azuredatalake/azure-data-factory-makes-it-even-easier-and-convenient-to-uncover-insights-from-data-when-using-data-lake-store-with-sql-data-warehouse).
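As a rough illustration of the direct-copy case, a sink that invokes PolyBase might look like the fragment below; the `polyBaseSettings` values are illustrative assumptions, not recommendations from the article.

```json
"sink": {
    "type": "SqlDWSink",
    "allowPolyBase": true,
    "polyBaseSettings": {
        "rejectType": "percentage",
        "rejectValue": 10.0,
        "rejectSampleValue": 100,
        "useTypeDefault": true
    }
}
```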
@@ -829,8 +829,7 @@ If the requirements aren't met, the service checks the settings and automaticall
| Supported source data store type | Supported source authentication type |