
Commit b28c86b

Clare Zheng (Shanghai Wicresoft Co Ltd) authored and committed
Update according to feedbacks
1 parent 0af0fcd commit b28c86b

2 files changed (+13, -13 lines)


articles/data-factory/connector-snowflake.md

Lines changed: 12 additions & 12 deletions
@@ -282,9 +282,11 @@ If your sink data store and format meet the criteria described in this section,
 
 - When you specify `storageIntegration` in the source:
 
-The **sink linked service** is [**Azure Blob Storage**](connector-azure-blob-storage.md). If you want to directly copy data to Azure Data Lake Storage Gen2 in the following supported format, you can create an Azure Blob Storage linked service against your Azure Data Lake Storage Gen2 account, to avoid using [staged copy from Snowflake](#staged-copy-from-snowflake).
+The sink data store is the Azure Blob Storage that you refer to in the external stage in Snowflake. You need to complete the following steps before copying data:
 
-You need to grant at least **Storage Blob Data Contributor** role to the Snowflake service principal in the sink Azure Blob Storage or Azure Data Lake Storage Gen2 **Access Control (IAM)**.
+1. Create an [**Azure Blob Storage**](connector-azure-blob-storage.md) linked service for the sink Azure Blob Storage with any supported authentication type.
+
+2. Grant at least the **Storage Blob Data Contributor** role to the Snowflake service principal in the sink Azure Blob Storage **Access Control (IAM)**.
 
 - When you don't specify `storageIntegration` in the source:

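A rough sketch of the Copy activity JSON that the rewritten steps above correspond to (direct copy from Snowflake to Blob storage with `storageIntegration`). The activity, dataset, and integration names are placeholders, and the article's own **Example** sections remain the authoritative payloads:

```json
{
    "name": "CopySnowflakeToBlob",
    "type": "Copy",
    "inputs": [ { "referenceName": "SnowflakeSourceDataset", "type": "DatasetReference" } ],
    "outputs": [ { "referenceName": "DelimitedTextSinkDataset", "type": "DatasetReference" } ],
    "typeProperties": {
        "source": {
            "type": "SnowflakeSource",
            "query": "SELECT * FROM MYTABLE",
            "exportSettings": {
                "type": "SnowflakeExportCopyCommand",
                "storageIntegration": "<your Snowflake storage integration>"
            }
        },
        "sink": {
            "type": "DelimitedTextSink"
        }
    }
}
```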
@@ -354,11 +356,9 @@ When your sink data store or format isn't natively compatible with the Snowflake
 
 To use this feature, create an [Azure Blob storage linked service](connector-azure-blob-storage.md#linked-service-properties) that refers to the Azure storage account as the interim staging. Then specify the `enableStaging` and `stagingSettings` properties in the Copy activity.
 
-> [!NOTE]
-> When you specify `storageIntegration` in the source, ensure that you grant at least **Storage Blob Data Contributor** role to the Snowflake service principal in the staging Azure Blob Storage **Access Control (IAM)**.
+- When you specify `storageIntegration` in the source, the interim staging Azure Blob Storage should be the one that you refer to in the external stage in Snowflake. Ensure that you create an [Azure Blob Storage](connector-azure-blob-storage.md) linked service for it with any supported authentication type, and grant at least the **Storage Blob Data Contributor** role to the Snowflake service principal in the staging Azure Blob Storage **Access Control (IAM)**.
 
-> [!NOTE]
-> When you don't specify `storageIntegration` in the source, the staging Azure Blob Storage linked service must use shared access signature authentication, as required by the Snowflake COPY command. Make sure you grant proper access permission to Snowflake in the staging Azure Blob Storage. To learn more about this, see this [article](https://docs.snowflake.com/en/user-guide/data-load-azure-config.html#option-2-generating-a-sas-token).
+- When you don't specify `storageIntegration` in the source, the staging Azure Blob Storage linked service must use shared access signature authentication, as required by the Snowflake COPY command. Make sure you grant proper access permission to Snowflake in the staging Azure Blob Storage. To learn more, see this [article](https://docs.snowflake.com/en/user-guide/data-load-azure-config.html#option-2-generating-a-sas-token).
 
 **Example:**

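As an illustration of the staged-copy configuration described in this hunk, a Copy activity fragment with `enableStaging` and `stagingSettings` might look roughly like the following. The sink type, linked service name, and path are assumed placeholders; when `storageIntegration` is used, the path should point into the Blob storage referenced by the Snowflake external stage:

```json
{
    "typeProperties": {
        "source": {
            "type": "SnowflakeSource",
            "query": "SELECT * FROM MYTABLE",
            "exportSettings": {
                "type": "SnowflakeExportCopyCommand",
                "storageIntegration": "<your Snowflake storage integration>"
            }
        },
        "sink": {
            "type": "AzureSqlSink"
        },
        "enableStaging": true,
        "stagingSettings": {
            "linkedServiceName": {
                "referenceName": "MyStagingBlobLinkedService",
                "type": "LinkedServiceReference"
            },
            "path": "mystagingcontainer/stagingpath"
        }
    }
}
```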
@@ -438,9 +438,11 @@ If your source data store and format meet the criteria described in this section
 
 - When you specify `storageIntegration` in the sink:
 
-The **source linked service** is [**Azure Blob storage**](connector-azure-blob-storage.md). If you want to directly copy data from Azure Data Lake Storage Gen2 in the following supported format, you can create an Azure Blob Storage linked service against your Azure Data Lake Storage Gen2 account, to avoid using [staged copy to Snowflake](#staged-copy-to-snowflake).
+The source data store is the Azure Blob Storage that you refer to in the external stage in Snowflake. You need to complete the following steps before copying data:
+
+1. Create an [**Azure Blob Storage**](connector-azure-blob-storage.md) linked service for the source Azure Blob Storage with any supported authentication type.
 
-You need to grant at least **Storage Blob Data Reader** role to the Snowflake service principal in the source Azure Blob Storage or Azure Data Lake Storage Gen2 **Access Control (IAM)**.
+2. Grant at least the **Storage Blob Data Reader** role to the Snowflake service principal in the source Azure Blob Storage **Access Control (IAM)**.
 
 - When you don't specify `storageIntegration` in the sink:

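A rough sketch, under the same placeholder assumptions, of the direct copy to Snowflake that these steps set up, with `storageIntegration` specified on the sink:

```json
{
    "typeProperties": {
        "source": {
            "type": "DelimitedTextSource",
            "storeSettings": { "type": "AzureBlobStorageReadSettings" },
            "formatSettings": { "type": "DelimitedTextReadSettings" }
        },
        "sink": {
            "type": "SnowflakeSink",
            "importSettings": {
                "type": "SnowflakeImportCopyCommand",
                "storageIntegration": "<your Snowflake storage integration>"
            }
        }
    }
}
```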
@@ -514,11 +516,9 @@ When your source data store or format isn't natively compatible with the Snowfla
 
 To use this feature, create an [Azure Blob storage linked service](connector-azure-blob-storage.md#linked-service-properties) that refers to the Azure storage account as the interim staging. Then specify the `enableStaging` and `stagingSettings` properties in the Copy activity.
 
-> [!NOTE]
-> When you specify `storageIntegration` in the sink, ensure that you grant at least **Storage Blob Data Reader** role to the Snowflake service principal in the staging Azure Blob Storage **Access Control (IAM)**.
+- When you specify `storageIntegration` in the sink, the interim staging Azure Blob Storage should be the one that you refer to in the external stage in Snowflake. Ensure that you create an [Azure Blob Storage](connector-azure-blob-storage.md) linked service for it with any supported authentication type, and grant at least the **Storage Blob Data Reader** role to the Snowflake service principal in the staging Azure Blob Storage **Access Control (IAM)**.
 
-> [!NOTE]
-> When you don't specify `storageIntegration` in the sink, the staging Azure Blob Storage linked service need to use shared access signature authentication as required by the Snowflake COPY command.
+- When you don't specify `storageIntegration` in the sink, the staging Azure Blob Storage linked service needs to use shared access signature authentication, as required by the Snowflake COPY command.
 
 **Example:**

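For the branches above where `storageIntegration` isn't specified and the staging linked service must use shared access signature authentication, a SAS-based Azure Blob Storage linked service could be defined roughly as follows. The name and URI are placeholders, and Snowflake still needs the permissions described in the linked Snowflake article:

```json
{
    "name": "StagingBlobStorageLinkedService",
    "properties": {
        "type": "AzureBlobStorage",
        "typeProperties": {
            "sasUri": {
                "type": "SecureString",
                "value": "<SAS URI of the staging Blob container>"
            }
        }
    }
}
```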
articles/data-factory/copy-activity-performance-features.md

Lines changed: 1 addition & 1 deletion
@@ -153,7 +153,7 @@ Configure the **enableStaging** setting in the copy activity to specify whether
 | --- | --- | --- | --- |
 | enableStaging |Specify whether you want to copy data via an interim staging store. |False |No |
 | linkedServiceName |Specify the name of an [Azure Blob storage](connector-azure-blob-storage.md#linked-service-properties) or [Azure Data Lake Storage Gen2](connector-azure-data-lake-storage.md#linked-service-properties) linked service, which refers to the instance of Storage that you use as an interim staging store. |N/A |Yes, when **enableStaging** is set to TRUE |
-| path |Specify the path that you want to contain the staged data. If you don't provide a path, the service creates a container to store temporary data. When `storageIntegration` in Snowflake connector is specified, the path is required. |N/A |No |
+| path |Specify the path that you want to contain the staged data. If you don't provide a path, the service creates a container to store temporary data. |N/A |No (Yes when `storageIntegration` in the Snowflake connector is specified) |
 | enableCompression |Specifies whether data should be compressed before it's copied to the destination. This setting reduces the volume of data being transferred. |False |No |
 
 >[!NOTE]

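To make the updated `path` requirement concrete, a staging block inside a Copy activity's `typeProperties` might be configured like this sketch; the linked service name and path are placeholders, and `path` must be supplied when the Snowflake connector's `storageIntegration` is specified:

```json
{
    "enableStaging": true,
    "stagingSettings": {
        "linkedServiceName": {
            "referenceName": "MyStagingStorageLinkedService",
            "type": "LinkedServiceReference"
        },
        "path": "stagingcontainer/path",
        "enableCompression": false
    }
}
```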