[!INCLUDE [SQL Server Azure SQL Database Azure SQL Managed Instance](../../includes/applies-to-version/sql-asdb-asdbmi-fabricdw.md)]
Imports a data file into a database table or view in a user-specified format in [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)].
| Unsupported options |`*` wildcards in path |`*` wildcards in path |`DATA_SOURCE`, `FORMATFILE_DATA_SOURCE`, `ERRORFILE`, `ERRORFILE_DATA_SOURCE`|
| Enabled options but without effect |||`KEEPIDENTITY`, `FIRE_TRIGGERS`, `CHECK_CONSTRAINTS`, `TABLOCK`, `ORDER`, `ROWS_PER_BATCH`, `KILOBYTES_PER_BATCH`, and `BATCHSIZE` aren't applicable. They don't raise a syntax error, but they have no effect |
#### *database_name*
The database name in which the specified table or view resides. If not specified, *database_name* is the current database.
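As a minimal sketch with hypothetical database, schema, table, and file names, the three-part name lets the statement target a table outside the current database:

```sql
-- Hypothetical names throughout
BULK INSERT AdventureWorks.dbo.MyOrders
FROM 'C:\data\orders.dat';
```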
#### BATCHSIZE = *batch_size*
Specifies the number of rows in a batch. Each batch is copied to the server as one transaction. If this fails, [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] commits or rolls back the transaction for every batch. By default, all data in the specified data file is one batch. For information about performance considerations, see [Performance considerations](#performance-considerations) later in this article.
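As a sketch (table and file names are hypothetical), a 10,000-row batch size splits the import into multiple smaller transactions:

```sql
-- Hypothetical names; each 10,000-row batch is committed as its own transaction
BULK INSERT dbo.Sales
FROM 'C:\data\sales.dat'
WITH (BATCHSIZE = 10000);
```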
Specifies the code page of the data in the data file. CODEPAGE is relevant only if the data contains **char**, **varchar**, or **text** columns with character values greater than **127** or less than **32**. For an example, see [Specify a code page](#d-specify-a-code-page).
CODEPAGE isn't a supported option on Linux for [!INCLUDE [sssql17-md](../../includes/sssql17-md.md)]. For [!INCLUDE[sssql19-md](../../includes/sssql19-md.md)], only the **'RAW'** option is allowed for CODEPAGE.
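For instance (hypothetical names), importing a file saved in the Windows-1252 code page might look like:

```sql
-- Hypothetical names; '1252' identifies the Windows Latin-1 code page
BULK INSERT dbo.Customers
FROM 'C:\data\customers.dat'
WITH (CODEPAGE = '1252');
```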
You should specify a collation name for each column in a [format file](../../relational-databases/import-export/use-a-format-file-to-bulk-import-data-sql-server.md).
Specifies that BULK INSERT performs the import operation using the specified data-file type value.
|**char** (default)|Character format.<br /><br />For more information, see [Use Character Format to Import or Export Data (SQL Server)](../../relational-databases/import-export/use-character-format-to-import-or-export-data-sql-server.md).|
Specifies a named external data source pointing to the Azure Blob Storage location of the file that will be imported. The external data source must be created using the `TYPE = BLOB_STORAGE` option added in [!INCLUDE [sssql17-md](../../includes/sssql17-md.md)]. For more information, see [CREATE EXTERNAL DATA SOURCE](../../t-sql/statements/create-external-data-source-transact-sql.md). For an example, see [Import data from a file in Azure Blob Storage](#f-import-data-from-a-file-in-azure-blob-storage).
> [!NOTE]
> Replace `<data-lake>.blob.core.windows.net` with an appropriate URL.
```sql
CREATE EXTERNAL DATA SOURCE pandemicdatalake
WITH (LOCATION = 'https://<data-lake>.blob.core.windows.net/public/', TYPE = BLOB_STORAGE);
```
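A `BULK INSERT` statement can then reference that data source by name; the table and file path below are hypothetical:

```sql
-- Hypothetical table and blob path; DATA_SOURCE names the external data source created above
BULK INSERT dbo.PublicData
FROM 'folder/data.csv'
WITH (DATA_SOURCE = 'pandemicdatalake', FORMAT = 'CSV', FIRSTROW = 2);
```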
The FIRSTROW attribute isn't intended to skip column headers. Skipping headers isn't supported by the BULK INSERT statement. If you choose to skip rows, the [!INCLUDE[ssDEnoversion](../../includes/ssdenoversion-md.md)] looks only at the field terminators, and doesn't validate the data in the fields of skipped rows.
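As a sketch with hypothetical names, skipping a single header row in a comma-delimited file:

```sql
-- Hypothetical names; row 1 (the header) is skipped, not validated
BULK INSERT dbo.Orders
FROM 'C:\data\orders.csv'
WITH (FIRSTROW = 2, FIELDTERMINATOR = ',');
```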
#### FIRE_TRIGGERS
Specifies the field terminator to be used for **char** and **widechar** data files. The default field terminator is `\t` (tab character). For more information, see [Specify Field and Row Terminators (SQL Server)](../../relational-databases/import-export/specify-field-and-row-terminators-sql-server.md).
#### ROWTERMINATOR = '*row_terminator*'
Specifies the row terminator to be used for **char** and **widechar** data files. The default row terminator is `\r\n` (carriage return and line feed). For more information, see [Specify Field and Row Terminators (SQL Server)](../../relational-databases/import-export/specify-field-and-row-terminators-sql-server.md).
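Combining both terminators, a sketch for a pipe-delimited file with Unix-style line endings (names are hypothetical):

```sql
-- Hypothetical names; 0x0a is a line feed, common in files produced on Linux
BULK INSERT dbo.LogEntries
FROM 'C:\data\log.txt'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '0x0a');
```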