
Commit 6ae49a8

Merge pull request #33818 from jovanpop-msft/patch-50
BULK INSERT GA in Fabric DW
2 parents: 461d351 + c1d666a

File tree

1 file changed: +47 -8 lines


docs/t-sql/statements/bulk-insert-transact-sql.md

Lines changed: 47 additions & 8 deletions
@@ -4,7 +4,7 @@ description: Transact-SQL reference for the BULK INSERT statement.
 author: markingmyname
 ms.author: maghan
 ms.reviewer: randolphwest, wiassaf
-ms.date: 02/12/2025
+ms.date: 04/17/2025
 ms.service: sql
 ms.subservice: t-sql
 ms.topic: reference
@@ -28,7 +28,7 @@ monikerRange: "=azuresqldb-current || =azure-sqldw-latest || >=sql-server-2016 |
 ---
 # BULK INSERT (Transact-SQL)

-[!INCLUDE [SQL Server Azure SQL Database Azure SQL Managed Instance](../../includes/applies-to-version/sql-asdb-asdbmi.md)]
+[!INCLUDE [SQL Server Azure SQL Database Azure SQL Managed Instance](../../includes/applies-to-version/sql-asdb-asdbmi-fabricdw.md)]

 Imports a data file into a database table or view in a user-specified format in [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)].
@@ -91,9 +91,6 @@ The `BULK INSERT` statement has different arguments and options in different pla
 | Unsupported options | `*` wildcards in path | `*` wildcards in path | `DATA_SOURCE`, `FORMATFILE_DATA_SOURCE`, `ERRORFILE`, `ERRORFILE_DATA_SOURCE` |
 | Enabled options but without effect | | | `KEEPIDENTITY`, `FIRE_TRIGGERS`, `CHECK_CONSTRAINTS`, `TABLOCK`, `ORDER`, `ROWS_PER_BATCH`, `KILOBYTES_PER_BATCH`, and `BATCHSIZE` are not applicable. They will not throw a syntax error, but they will not have any effect |

-> [!NOTE]
-> The BULK INSERT statement is in [preview in Fabric Data Warehouse](https://blog.fabric.microsoft.com/blog/bulk-insert-statement-in-fabric-datawarehouse).
-
 #### *database_name*

 The database name in which the specified table or view resides. If not specified, *database_name* is the current database.
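The *database_name* argument described in this hunk can be sketched with a fully qualified three-part target name, which is standard BULK INSERT syntax; the database, file path, and options below are hypothetical:

```sql
-- Hypothetical example: qualify the target with database and schema so the
-- statement does not depend on the current database context.
BULK INSERT MyDatabase.dbo.bing_covid_19_data
FROM 'C:\data\bing_covid-19_data.csv'
WITH (FIRSTROW = 2, FIELDTERMINATOR = ',');
```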
@@ -135,9 +132,12 @@ Fabric Warehouse supports `*` wildcards that can match any character in the URI,

 ```sql
 BULK INSERT bing_covid_19_data
-FROM 'https://pandemicdatalake.blob.core.windows.net/public/curated/covid-19/bing_covid-19_data/latest/*.csv';
+FROM 'https://<data-lake>.blob.core.windows.net/public/curated/covid-19/bing_covid-19_data/latest/*.csv';
 ```

+> [!NOTE]
+> Replace `<data-lake>.blob.core.windows.net` with an appropriate URL.
+
 #### BATCHSIZE = *batch_size*

 Specifies the number of rows in a batch. Each batch is copied to the server as one transaction. If this fails, [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] commits or rolls back the transaction for every batch. By default, all data in the specified data file is one batch. For information about performance considerations, see [Performance considerations](#performance-considerations) later in this article.
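The BATCHSIZE behavior described above can be illustrated with a sketch in the same style as the other examples in this article; the row count is an assumption, and the URL placeholder follows the article's convention:

```sql
-- Hypothetical example: load the file in batches of 50,000 rows, so each
-- batch is committed (or rolled back) as its own transaction.
BULK INSERT bing_covid_19_data
FROM 'https://<data-lake>.blob.core.windows.net/public/curated/covid-19/bing_covid-19_data/latest/bing_covid-19_data.csv'
WITH (BATCHSIZE = 50000, FIRSTROW = 2);
```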
@@ -159,6 +159,15 @@ A situation in which you might want constraints disabled (the default behavior)

 Specifies the code page of the data in the data file. CODEPAGE is relevant only if the data contains **char**, **varchar**, or **text** columns with character values greater than **127** or less than **32**. For an example, see [Specify a code page](#d-specify-a-code-page).

+```sql
+BULK INSERT bing_covid_19_data
+FROM 'https://<data-lake>.blob.core.windows.net/public/curated/covid-19/bing_covid-19_data/latest/bing_covid-19_data.csv'
+WITH (CODEPAGE = '65001', FIRSTROW=2);
+```
+
+> [!NOTE]
+> Replace `<data-lake>.blob.core.windows.net` with an appropriate URL.
+
 CODEPAGE isn't a supported option on Linux for [!INCLUDE [sssql17-md](../../includes/sssql17-md.md)]. For [!INCLUDE[sssql19-md](../../includes/sssql19-md.md)], only the **'RAW'** option is allowed for CODEPAGE.

 You should specify a collation name for each column in a [format file](../../relational-databases/import-export/use-a-format-file-to-bulk-import-data-sql-server.md).
@@ -174,6 +183,15 @@ You should specify a collation name for each column in a [format file](../../rel

 Specifies that BULK INSERT performs the import operation using the specified data-file type value.

+```sql
+BULK INSERT bing_covid_19_data
+FROM 'https://<data-lake>.blob.core.windows.net/public/curated/covid-19/bing_covid-19_data/latest/bing_covid-19_data.csv'
+WITH (DATAFILETYPE = 'char', FIRSTROW=2);
+```
+
+> [!NOTE]
+> Replace `<data-lake>.blob.core.windows.net` with an appropriate URL.
+
 |DATAFILETYPE value|All data represented in:|
 |------------------------|------------------------------|
 |**char** (default)|Character format.<br /><br />For more information, see [Use Character Format to Import or Export Data &#40;SQL Server&#41;](../../relational-databases/import-export/use-character-format-to-import-or-export-data-sql-server.md).|
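The hunk above adds a `char` example; for Unicode data files, BULK INSERT also accepts the documented `widechar` value. A sketch following the same pattern (the URL placeholder is the article's convention):

```sql
-- Hypothetical example: import a Unicode (UTF-16) character file by
-- switching DATAFILETYPE from 'char' to 'widechar'.
BULK INSERT bing_covid_19_data
FROM 'https://<data-lake>.blob.core.windows.net/public/curated/covid-19/bing_covid-19_data/latest/bing_covid-19_data.csv'
WITH (DATAFILETYPE = 'widechar', FIRSTROW = 2);
```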
@@ -187,13 +205,16 @@ Specifies that BULK INSERT performs the import operation using the specified dat

 Specifies a named external data source pointing to the Azure Blob Storage location of the file that will be imported. The external data source must be created using the `TYPE = BLOB_STORAGE` option added in [!INCLUDE [sssql17-md](../../includes/sssql17-md.md)]. For more information, see [CREATE EXTERNAL DATA SOURCE](../../t-sql/statements/create-external-data-source-transact-sql.md). For an example, see [Import data from a file in Azure Blob Storage](#f-import-data-from-a-file-in-azure-blob-storage).

+> [!NOTE]
+> Replace `<data-lake>.blob.core.windows.net` with an appropriate URL.
+
 ```sql
 CREATE EXTERNAL DATA SOURCE pandemicdatalake
-WITH (LOCATION='https://pandemicdatalake.blob.core.windows.net/public/',TYPE=BLOB_STORAGE)
+WITH (LOCATION='https://<data-lake>.blob.core.windows.net/public/',TYPE=BLOB_STORAGE)
 GO
 BULK INSERT bing_covid_19_data
 FROM 'curated/covid-19/bing_covid-19_data/latest/bing_covid-19_data.csv'
-WITH (DATA_SOURCE='pandemicdatalake',FIRSTROW = 2,LASTROW = 100,FIELDTERMINATOR = ',');
+WITH (DATA_SOURCE='<data-lake>',FIRSTROW = 2,LASTROW = 100,FIELDTERMINATOR = ',');
 ```

 #### ERRORFILE = '*error_file_path*'
@@ -214,6 +235,15 @@ Specifies a named external data source pointing to the Azure Blob Storage locati

 Specifies the number of the first row to load. The default is the first row in the specified data file. FIRSTROW is 1-based.

+```sql
+BULK INSERT bing_covid_19_data
+FROM 'https://<data-lake>.blob.core.windows.net/public/curated/covid-19/bing_covid-19_data/latest/bing_covid-19_data.csv'
+WITH (FIRSTROW=2);
+```
+
+> [!NOTE]
+> Replace `<data-lake>.blob.core.windows.net` with an appropriate URL.
+
 The FIRSTROW attribute isn't intended to skip column headers. Skipping headers isn't supported by the BULK INSERT statement. If you choose to skip rows, the [!INCLUDE[ssDEnoversion](../../includes/ssdenoversion-md.md)] looks only at the field terminators, and doesn't validate the data in the fields of skipped rows.

 #### FIRE_TRIGGERS
@@ -305,6 +335,15 @@ Beginning with [!INCLUDE [sssql17-md](../../includes/sssql17-md.md)], and in Azu

 Specifies the field terminator to be used for **char** and **widechar** data files. The default field terminator is `\t` (tab character). For more information, see [Specify Field and Row Terminators &#40;SQL Server&#41;](../../relational-databases/import-export/specify-field-and-row-terminators-sql-server.md).

+```sql
+BULK INSERT bing_covid_19_data
+FROM 'https://<data-lake>.blob.core.windows.net/public/curated/covid-19/bing_covid-19_data/latest/bing_covid-19_data.csv'
+WITH (FIELDTERMINATOR = ',', FIRSTROW=2);
+```
+
+> [!NOTE]
+> Replace `<data-lake>.blob.core.windows.net` with an appropriate URL.
+
 #### ROWTERMINATOR = '*row_terminator*'

 Specifies the row terminator to be used for **char** and **widechar** data files. The default row terminator is `\r\n` (newline character). For more information, see [Specify Field and Row Terminators &#40;SQL Server&#41;](../../relational-databases/import-export/specify-field-and-row-terminators-sql-server.md).
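ROWTERMINATOR follows the same pattern as FIELDTERMINATOR above; a combined sketch, reusing the article's URL placeholder (the `0x0a` hex notation for a linefeed is documented BULK INSERT usage):

```sql
-- Hypothetical example: comma-separated fields with Unix-style (LF) line endings.
BULK INSERT bing_covid_19_data
FROM 'https://<data-lake>.blob.core.windows.net/public/curated/covid-19/bing_covid-19_data/latest/bing_covid-19_data.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '0x0a', FIRSTROW = 2);
```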
