---
title: Copy and transform data in Microsoft Fabric Lakehouse
titleSuffix: Azure Data Factory & Azure Synapse
description: Learn how to copy and transform data in Microsoft Fabric Lakehouse using Azure Data Factory or Azure Synapse Analytics pipelines.
ms.author: jianleishen
author: jianleishen
ms.service: data-factory
ms.subservice: data-movement
ms.topic: conceptual
ms.custom: synapse
ms.date: 01/08/2024
---

# Copy and transform data in Microsoft Fabric Lakehouse using Azure Data Factory or Azure Synapse Analytics

[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]

Microsoft Fabric Lakehouse is a data architecture platform for storing, managing, and analyzing structured and unstructured data in a single location. To learn how to achieve seamless data access across all compute engines in Microsoft Fabric, see [Lakehouse and Delta Tables](/fabric/data-engineering/lakehouse-and-delta-tables).

This article outlines how to use Copy activity to copy data from and to Microsoft Fabric Lakehouse and use Data Flow to transform data in Microsoft Fabric Lakehouse. To learn more, read the introductory article for [Azure Data Factory](introduction.md) or [Azure Synapse Analytics](../synapse-analytics/overview-what-is.md).
## Supported capabilities

This Microsoft Fabric Lakehouse connector is supported for the following capabilities:

| Supported capabilities|IR |
|---------| --------|
|[Copy activity](copy-activity-overview.md) (source/sink)|① ②|
|[Mapping data flow](concepts-data-flow-overview.md) (source/sink)|① |

*① Azure integration runtime ② Self-hosted integration runtime*

The Microsoft Fabric Lakehouse connector supports the following authentication types.

To use service principal authentication, follow these steps.

1. [Register an application with the Microsoft Identity platform](../active-directory/develop/quickstart-register-app.md) and [add a client secret](../active-directory/develop/quickstart-register-app.md#add-a-client-secret). Afterwards, make note of these values, which you use to define the linked service:

   - Application (client) ID, which is the service principal ID in the linked service.
   - Client secret value, which is the service principal key in the linked service.
   - Tenant ID

2. Grant the service principal at least the **Contributor** role in the Microsoft Fabric workspace. Follow these steps:

These properties are supported for the linked service:

| Property | Description | Required |
|:--- |:--- |:--- |
| tenant | Specify the tenant information (domain name or tenant ID) under which your application resides. Retrieve it by hovering the mouse in the upper-right corner of the Azure portal. | Yes |
| servicePrincipalId | Specify the application's client ID. | Yes |
| servicePrincipalCredentialType | The credential type to use for service principal authentication. Allowed values are **ServicePrincipalKey** and **ServicePrincipalCert**. | Yes |
| servicePrincipalCredential | The service principal credential. <br/> When you use **ServicePrincipalKey** as the credential type, specify the application's client secret value. Mark this field as **SecureString** to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). <br/> When you use **ServicePrincipalCert** as the credential, reference a certificate in Azure Key Vault, and ensure the certificate content type is **PKCS #12**. | Yes |
| connectVia | The [integration runtime](concepts-integration-runtime.md) to be used to connect to the data store. You can use the Azure integration runtime or a self-hosted integration runtime if your data store is in a private network. If not specified, the default Azure integration runtime is used. | No |


**Example: using service principal key authentication**
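
As a sketch, a linked service definition using service principal key authentication could look like the JSON below. The `workspaceId` and `artifactId` type properties identify the target Fabric workspace and Lakehouse; treat the exact property names and placeholder values as illustrative, and confirm them against the JSON that the authoring UI generates:

```json
{
    "name": "MicrosoftFabricLakehouseLinkedService",
    "properties": {
        "type": "Lakehouse",
        "typeProperties": {
            "workspaceId": "<Microsoft Fabric workspace ID>",
            "artifactId": "<Lakehouse object ID>",
            "tenant": "<tenant info, e.g. microsoft.onmicrosoft.com>",
            "servicePrincipalId": "<service principal ID>",
            "servicePrincipalCredentialType": "ServicePrincipalKey",
            "servicePrincipalCredential": {
                "type": "SecureString",
                "value": "<service principal key>"
            }
        },
        "connectVia": {
            "referenceName": "<name of integration runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```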

Assuming you have the following source folder structure and want to copy the files in bold:
| Sample source structure | Content in FileListToCopy.txt | ADF configuration |
| ------------------------------------------------------------ | --------------------------------------------------------- | ------------------------------------------------------------ |
| filesystem<br/> FolderA<br/> **File1.csv**<br/> File2.json<br/> Subfolder1<br/> **File3.csv**<br/> File4.json<br/> **File5.csv**<br/> Metadata<br/> FileListToCopy.txt | File1.csv<br>Subfolder1/File3.csv<br>Subfolder1/File5.csv | **In dataset:**<br>- Folder path: `FolderA`<br><br>**In copy activity source:**<br>- File list path: `Metadata/FileListToCopy.txt` <br><br>The file list path points to a text file in the same data store that includes a list of files you want to copy, one file per line with the relative path to the path configured in the dataset. |
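
In JSON, the **File list path** setting above corresponds to the `fileListPath` property under the copy activity source's store settings. A minimal sketch of the source fragment, assuming a delimited text dataset (property names other than `fileListPath` are illustrative of the connector's typical shape):

```json
"source": {
    "type": "DelimitedTextSource",
    "storeSettings": {
        "type": "LakehouseReadSettings",
        "fileListPath": "Metadata/FileListToCopy.txt"
    },
    "formatSettings": {
        "type": "DelimitedTextReadSettings"
    }
}
```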
#### Some recursive and copyBehavior examples