| type | The type property of the Copy activity source must be set to **SnowflakeV2Source**. | Yes |
| query | Specifies the SQL query to read data from Snowflake. If the names of the schema, table, and columns contain lowercase characters, quote the object identifier in the query, for example: `select * from "schema"."myTable"`.<br>Executing a stored procedure isn't supported. | No |
| exportSettings | Advanced settings used to retrieve data from Snowflake. You can configure the ones supported by the COPY INTO command, which the service passes through when you invoke the statement. | Yes |
|***Under `exportSettings`:***|||
| type | The type of export command, set to **SnowflakeExportCopyCommand**. | Yes |
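As a sketch of how these properties fit together, a Copy activity source using this connector might look like the following fragment (the query text is a placeholder):

```json
"source": {
    "type": "SnowflakeV2Source",
    "query": "SELECT * FROM MYTABLE",
    "exportSettings": {
        "type": "SnowflakeExportCopyCommand"
    }
}
```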
#### Direct copy from Snowflake
If your sink data store and format meet the criteria described in this section, you can use the Copy activity to copy directly from Snowflake to the sink. The service checks the settings and fails the Copy activity run if the following criteria aren't met:
- The **sink linked service** is [**Azure Blob storage**](connector-azure-blob-storage.md) with **shared access signature** authentication. If you want to directly copy data to Azure Data Lake Storage Gen2 in the following supported format, you can create an Azure Blob linked service with SAS authentication against your ADLS Gen2 account, to avoid using [staged copy from Snowflake](#staged-copy-from-snowflake).
- `compression` can be **no compression**, **gzip**, **bzip2**, or **deflate**.
- `encodingName` is left as default or set to **utf-8**.
- `filePattern` in the copy activity sink is left as default or set to **setOfObjects**.
- In the copy activity source, `additionalColumns` isn't specified.
- Column mapping isn't specified.
**Example:**
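A minimal sketch of such a Copy activity definition, assuming a SAS-authenticated Blob storage sink in delimited text format; the activity and dataset names are placeholders:

```json
"activities": [
    {
        "name": "CopyFromSnowflake",
        "type": "Copy",
        "inputs": [
            { "referenceName": "<Snowflake input dataset name>", "type": "DatasetReference" }
        ],
        "outputs": [
            { "referenceName": "<Azure Blob output dataset name>", "type": "DatasetReference" }
        ],
        "typeProperties": {
            "source": {
                "type": "SnowflakeV2Source",
                "query": "SELECT * FROM MYTABLE",
                "exportSettings": {
                    "type": "SnowflakeExportCopyCommand"
                }
            },
            "sink": {
                "type": "DelimitedTextSink"
            }
        }
    }
]
```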
#### Staged copy from Snowflake
When your sink data store or format isn't natively compatible with the Snowflake COPY command, as mentioned in the last section, enable the built-in staged copy using an interim Azure Blob storage instance. The staged copy feature also provides you with better throughput. The service exports data from Snowflake into staging storage, then copies the data to sink, and finally cleans up your temporary data from the staging storage. See [Staged copy](copy-activity-performance-features.md#staged-copy) for details about copying data by using staging.
To use this feature, create an [Azure Blob storage linked service](connector-azure-blob-storage.md#linked-service-properties) that refers to the Azure storage account as the interim staging. Then specify the `enableStaging` and `stagingSettings` properties in the Copy activity.
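A sketch of how `enableStaging` and `stagingSettings` sit in the Copy activity's `typeProperties` (the linked service name and staging path are placeholders):

```json
"typeProperties": {
    "source": {
        "type": "SnowflakeV2Source",
        "query": "SELECT * FROM MYTABLE",
        "exportSettings": {
            "type": "SnowflakeExportCopyCommand"
        }
    },
    "sink": {
        "type": "<sink type>"
    },
    "enableStaging": true,
    "stagingSettings": {
        "linkedServiceName": {
            "referenceName": "<Azure Blob storage linked service name>",
            "type": "LinkedServiceReference"
        },
        "path": "<staging container/path>"
    }
}
```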
#### Direct copy to Snowflake
If your source data store and format meet the criteria described in this section, you can use the Copy activity to copy directly from the source to Snowflake. The service checks the settings and fails the Copy activity run if the following criteria aren't met:
- The **source linked service** is [**Azure Blob storage**](connector-azure-blob-storage.md) with **shared access signature** authentication. If you want to directly copy data from Azure Data Lake Storage Gen2 in the following supported format, you can create an Azure Blob linked service with SAS authentication against your ADLS Gen2 account, to avoid using [staged copy to Snowflake](#staged-copy-to-snowflake).
- For **Parquet** format, the compression codec is **None** or **Snappy**.
- For **delimited text** format:
  - `rowDelimiter` is **\r\n** or any single character. If the row delimiter isn't "\r\n", `firstRowAsHeader` needs to be **false**, and `skipLineCount` isn't specified.
  - `compression` can be **no compression**, **gzip**, **bzip2**, or **deflate**.
  - `encodingName` is left as default or set to "UTF-8", "UTF-16", "UTF-16BE", "UTF-32", "UTF-32BE", "BIG5", "EUC-JP", "EUC-KR", "GB18030", "ISO-2022-JP", "ISO-2022-KR", "ISO-8859-1", "ISO-8859-2", "ISO-8859-5", "ISO-8859-6", "ISO-8859-7", "ISO-8859-8", "ISO-8859-9", "WINDOWS-1250", "WINDOWS-1251", "WINDOWS-1252", "WINDOWS-1253", "WINDOWS-1254", "WINDOWS-1255".
  - `quoteChar` is **double quote**, **single quote**, or **empty string** (no quote char).
- For **JSON** format, direct copy only supports the case in which the sink Snowflake table has a single column whose data type is **VARIANT**, **OBJECT**, or **ARRAY**.
  - `compression` can be **no compression**, **gzip**, **bzip2**, or **deflate**.
  - `encodingName` is left as default or set to **utf-8**.
- Column mapping isn't specified.
- In the Copy activity source:
  - `additionalColumns` isn't specified.
  - If your source is a folder, `recursive` is set to true.
  - `prefix`, `modifiedDateTimeStart`, `modifiedDateTimeEnd`, and `enablePartitionDiscovery` aren't specified.
**Example:**
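A minimal sketch of such a Copy activity definition, with placeholder names; the sink type and import settings shown here assume the sink mirrors the source naming (**SnowflakeV2Sink** with **SnowflakeImportCopyCommand**), which isn't stated in this excerpt:

```json
"activities": [
    {
        "name": "CopyToSnowflake",
        "type": "Copy",
        "inputs": [
            { "referenceName": "<source dataset name>", "type": "DatasetReference" }
        ],
        "outputs": [
            { "referenceName": "<Snowflake output dataset name>", "type": "DatasetReference" }
        ],
        "typeProperties": {
            "source": {
                "type": "DelimitedTextSource"
            },
            "sink": {
                "type": "SnowflakeV2Sink",
                "importSettings": {
                    "type": "SnowflakeImportCopyCommand"
                }
            }
        }
    }
]
```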
#### Staged copy to Snowflake
When your source data store or format isn't natively compatible with the Snowflake COPY command, as mentioned in the last section, enable the built-in staged copy using an interim Azure Blob storage instance. The staged copy feature also provides you with better throughput. The service automatically converts the data to meet the data format requirements of Snowflake. It then invokes the COPY command to load data into Snowflake. Finally, it cleans up your temporary data from the blob storage. See [Staged copy](copy-activity-performance-features.md#staged-copy) for details about copying data using staging.
To use this feature, create an [Azure Blob storage linked service](connector-azure-blob-storage.md#linked-service-properties) that refers to the Azure storage account as the interim staging. Then specify the `enableStaging` and `stagingSettings` properties in the Copy activity.
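A sketch of the staged-copy properties on the Snowflake sink side (placeholder names throughout; the **SnowflakeV2Sink**/**SnowflakeImportCopyCommand** sink types are assumed to mirror the source naming, since the sink property table isn't part of this excerpt):

```json
"typeProperties": {
    "source": {
        "type": "<source type>"
    },
    "sink": {
        "type": "SnowflakeV2Sink",
        "importSettings": {
            "type": "SnowflakeImportCopyCommand"
        }
    },
    "enableStaging": true,
    "stagingSettings": {
        "linkedServiceName": {
            "referenceName": "<Azure Blob storage linked service name>",
            "type": "LinkedServiceReference"
        },
        "path": "<staging container/path>"
    }
}
```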