Commit 24373eb
Update copy-into-transact-sql.md
1 parent 610e09d commit 24373eb

1 file changed: +18 −3 lines changed

docs/t-sql/statements/copy-into-transact-sql.md

@@ -549,7 +549,7 @@ Follow these steps to work around this issue by re-registering the workspace's m
This article explains how to use the COPY statement in [!INCLUDE [fabricdw](../../includes/fabric-dw.md)] in [!INCLUDE [fabric](../../includes/fabric.md)] for loading from external storage accounts. The COPY statement provides the most flexibility for high-throughput data ingestion into your [!INCLUDE [fabricdw](../../includes/fabric-dw.md)], and is a strategy to [Ingest data into your [!INCLUDE [fabricdw](../../includes/fabric-dw.md)]](/fabric/data-warehouse/ingest-data).

-In [!INCLUDE [fabric](../../includes/fabric.md)], the [COPY (Transact-SQL)](/sql/t-sql/statements/copy-into-transact-sql?view=fabric&preserve-view=true) statement currently supports the PARQUET and CSV file formats. For data sources, only Azure Data Lake Storage Gen2 accounts are supported.
+In [!INCLUDE [fabric](../../includes/fabric.md)], the [COPY (Transact-SQL)](/sql/t-sql/statements/copy-into-transact-sql?view=fabric&preserve-view=true) statement currently supports the PARQUET and CSV file formats. For data sources, Azure Data Lake Storage Gen2 accounts and OneLake sources are supported.

For more information on using COPY INTO on your [!INCLUDE [fabricdw](../../includes/fabric-dw.md)] in [!INCLUDE [fabric](../../includes/fabric.md)], see [Ingest data into your [!INCLUDE [fabricdw](../../includes/fabric-dw.md)] using the COPY statement](/fabric/data-warehouse/ingest-data-copy).
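
For orientation, here's a minimal sketch of the kind of COPY statement this article covers, loading PARQUET files from an ADLS Gen2 account; the table, account, and container names are placeholders, not taken from the doc:

```sql
-- Minimal sketch: load PARQUET files from ADLS Gen2 into a warehouse table.
-- dbo.MyTable, <account>, and <container> are placeholder names.
COPY INTO dbo.MyTable
FROM 'https://<account>.dfs.core.windows.net/<container>/data/*.parquet'
WITH (
    FILE_TYPE = 'PARQUET'
);
```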
@@ -631,10 +631,11 @@ When a column list isn't specified, COPY maps columns based on the source and ta
#### *External location*

-Specifies where the files containing the data is staged. Currently Azure Data Lake Storage (ADLS) Gen2 and Azure Blob Storage are supported:
+Specifies where the files containing the data are staged. Currently, Azure Data Lake Storage (ADLS) Gen2, Azure Blob Storage, and OneLake (Preview) are supported:

- *External location* for Blob Storage: `https://<account>.blob.core.windows.net/<container>/<path>`
- *External location* for ADLS Gen2: `https://<account>.dfs.core.windows.net/<container>/<path>`
+- *External location* for OneLake (Preview): `https://onelake.dfs.fabric.microsoft.com/<workspaceId>/<lakehouseId>/Files/`

Azure Data Lake Storage (ADLS) Gen2 offers better performance than Azure Blob Storage (legacy). Consider using an ADLS Gen2 account whenever possible.
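
To make the formats concrete, a sketch of one of these locations in the FROM clause; every account, container, and path segment is a placeholder:

```sql
-- Sketch: CSV load from a Blob Storage location (placeholder names throughout).
COPY INTO dbo.Sales
FROM 'https://<account>.blob.core.windows.net/<container>/<path>/*.csv'
WITH (
    FILE_TYPE = 'CSV',
    FIRSTROW = 2  -- skip a header row
);
```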
@@ -692,6 +693,9 @@ To access files on Azure Data Lake Storage (ADLS) Gen2 and Azure Blob Storage lo
- *IDENTITY: A constant with a value of 'Storage Account Key'*
- *SECRET: Storage account key*

+> [!NOTE]
+> COPY INTO using OneLake as a source supports only Microsoft Entra ID authentication.
+
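A sketch of the IDENTITY/SECRET pairing above as it appears in a full statement; the table name, storage names, and key value are placeholders:

```sql
-- Sketch: authenticate with a storage account key (all names are placeholders).
COPY INTO dbo.Staging
FROM 'https://<account>.dfs.core.windows.net/<container>/<path>'
WITH (
    FILE_TYPE = 'CSV',
    CREDENTIAL = (IDENTITY = 'Storage Account Key', SECRET = '<storage-account-key>')
);
```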
#### *ERRORFILE = Directory Location*

*ERRORFILE* only applies to CSV. Specifies the directory where the rejected rows and the corresponding error file should be written. You can specify the full path from the storage account, or the path relative to the container. If the specified path doesn't exist, one is created on your behalf. A child directory is created with the name "\_rejectedrows". The "\_" character ensures that the directory is escaped for other data processing unless explicitly named in the location parameter.
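
A sketch of *ERRORFILE* with a container-relative path; the table, account, container, and folder names are placeholders, and rejected rows would land under the `_rejectedrows` child directory described above:

```sql
-- Sketch: route rejected CSV rows to an error directory (placeholder names).
COPY INTO dbo.Events
FROM 'https://<account>.dfs.core.windows.net/<container>/input/*.csv'
WITH (
    FILE_TYPE = 'CSV',
    FIRSTROW = 2,
    ERRORFILE = '/errorsfolder',  -- relative to the container
    MAXERRORS = 10
);
```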
@@ -803,12 +807,23 @@ Parser version 1.0 is available for backward compatibility only, and should be u
## Use COPY INTO with OneLake

-You can now use `COPY INTO` to load data directly from files stored in the Fabric OneLake, specifically from the **Files folder** of a Fabric Lakehouse. This eliminates the need for external staging accounts (such as ADLS Gen2 or Blob Storage) and enables workspace-governed, SaaS-native ingestion using Fabric permissions. This functionality supports:
+You can use `COPY INTO` to load data directly from files stored in Fabric OneLake, specifically from the **Files folder** of a Fabric Lakehouse. This eliminates the need for external staging accounts (such as ADLS Gen2 or Blob Storage) and enables workspace-governed, SaaS-native ingestion using Fabric permissions. This functionality supports:

- Reading from `Files` folders in Lakehouses
- Workspace-to-warehouse loads within the same tenant
- Native identity enforcement using Microsoft Entra ID

+Example:
+
+```sql
+COPY INTO t1
+FROM 'https://onelake.dfs.fabric.microsoft.com/<workspaceId>/<lakehouseId>/Files/*.csv'
+WITH (
+    FILE_TYPE = 'CSV',
+    FIRSTROW = 2
+);
+```
+
> [!NOTE]
> This feature is currently in [preview](/fabric/fundamentals/preview).
