Commit 9126f57

Merge pull request #108810 from linda33wj/master
Update ADF connector articles
2 parents b6f0e36 + 37e830f commit 9126f57

File tree: 4 files changed, +22 -21 lines changed

articles/data-factory/connector-azure-data-lake-storage.md

Lines changed: 5 additions & 5 deletions
@@ -10,7 +10,7 @@ ms.service: data-factory
 ms.workload: data-services
 ms.topic: conceptual
 ms.custom: seo-lt-2019
-ms.date: 02/17/2020
+ms.date: 03/24/2020
 ---
 
 # Copy and transform data in Azure Data Lake Storage Gen2 using Azure Data Factory
@@ -34,7 +34,7 @@ For Copy activity, with this connector you can:
 - Copy data from/to Azure Data Lake Storage Gen2 by using account key, service principal, or managed identities for Azure resources authentications.
 - Copy files as-is or parse or generate files with [supported file formats and compression codecs](supported-file-formats-and-compression-codecs.md).
 - [Preserve file metadata during copy](#preserve-metadata-during-copy).
-- [Preserve ACLs](#preserve-metadata-during-copy) when copying from Azure Data Lake Storage Gen1.
+- [Preserve ACLs](#preserve-acls) when copying from Azure Data Lake Storage Gen1/Gen2.
 
 >[!IMPORTANT]
 >If you enable the **Allow trusted Microsoft services to access this storage account** option on Azure Storage firewall settings and want to use Azure integration runtime to connect to your Data Lake Storage Gen2, you must use [managed identity authentication](#managed-identity) for ADLS Gen2.
@@ -375,13 +375,13 @@ This section describes the resulting behavior of the copy operation for different
 
 When you copy files from Amazon S3/Azure Blob/Azure Data Lake Storage Gen2 to Azure Data Lake Storage Gen2/Azure Blob, you can choose to preserve the file metadata along with data. Learn more from [Preserve metadata](copy-activity-preserve-metadata.md#preserve-metadata).
 
-## Preserve ACLs from Data Lake Storage Gen1
+## <a name="preserve-acls"></a> Preserve ACLs from Data Lake Storage Gen1/Gen2
+
+When you copy files from Azure Data Lake Storage Gen1/Gen2 to Gen2, you can choose to preserve the POSIX access control lists (ACLs) along with data. Learn more from [Preserve ACLs from Data Lake Storage Gen1/Gen2 to Gen2](copy-activity-preserve-metadata.md#preserve-acls).
 
 >[!TIP]
 >To copy data from Azure Data Lake Storage Gen1 into Gen2 in general, see [Copy data from Azure Data Lake Storage Gen1 to Gen2 with Azure Data Factory](load-azure-data-lake-storage-gen2-from-gen1.md) for a walk-through and best practices.
 
-When you copy files from Azure Data Lake Storage Gen1 to Gen2, you can choose to preserve the POSIX access control lists (ACLs) along with data. Learn more from [Preserve ACLs from Data Lake Storage Gen1 to Gen2](copy-activity-preserve-metadata.md#preserve-acls).
-
 ## Mapping data flow properties
 
 When transforming data in mapping data flow, you can read and write files from Azure Data Lake Storage Gen2 in JSON, Avro, Delimited Text, or Parquet format. For more information, see [source transformation](data-flow-source.md) and [sink transformation](data-flow-sink.md) in the mapping data flow feature.
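The new `#preserve-acls` anchor above points at behavior driven by the copy activity's `preserve` setting. A minimal sketch of such an activity follows; the dataset and store-settings names are illustrative, not taken from this diff, and `ACL` is one documented `preserve` value (the full list of types isn't shown here):

```json
{
    "name": "CopyAndPreserveACLs",
    "type": "Copy",
    "inputs": [ { "referenceName": "SourceBinaryDataset", "type": "DatasetReference" } ],
    "outputs": [ { "referenceName": "SinkBinaryDataset", "type": "DatasetReference" } ],
    "typeProperties": {
        "source": {
            "type": "BinarySource",
            "storeSettings": { "type": "AzureBlobFSReadSettings", "recursive": true }
        },
        "sink": {
            "type": "BinarySink",
            "storeSettings": { "type": "AzureBlobFSWriteSettings" }
        },
        "preserve": [ "ACL" ]
    }
}
```

Binary format on both source and sink is what the articles describe as the prerequisite for the **Preserve** option to appear.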

articles/data-factory/connector-salesforce-service-cloud.md

Lines changed: 4 additions & 3 deletions
@@ -10,7 +10,7 @@ ms.service: data-factory
 ms.workload: data-services
 ms.topic: conceptual
 ms.custom: seo-lt-2019
-ms.date: 08/06/2019
+ms.date: 03/24/2020
 ---
 
 # Copy data from and to Salesforce Service Cloud by using Azure Data Factory
@@ -31,7 +31,7 @@ Specifically, this Salesforce Service Cloud connector supports:
 - Salesforce Developer, Professional, Enterprise, or Unlimited editions.
 - Copying data from and to Salesforce production, sandbox, and custom domain.
 
-The Salesforce Service Cloud connector is built on top of the Salesforce REST/Bulk API, with [v45](https://developer.salesforce.com/docs/atlas.en-us.218.0.api_rest.meta/api_rest/dome_versions.htm) for copy data from and [v40](https://developer.salesforce.com/docs/atlas.en-us.208.0.api_asynch.meta/api_asynch/asynch_api_intro.htm) for copy data to.
+The Salesforce connector is built on top of the Salesforce REST/Bulk API. By default, the connector uses [v45](https://developer.salesforce.com/docs/atlas.en-us.218.0.api_rest.meta/api_rest/dome_versions.htm) to copy data from Salesforce and [v40](https://developer.salesforce.com/docs/atlas.en-us.208.0.api_asynch.meta/api_asynch/asynch_api_intro.htm) to copy data to Salesforce. You can also explicitly set the API version used to read or write data via the [`apiVersion` property](#linked-service-properties) in the linked service.
 
 ## Prerequisites
 
@@ -62,7 +62,8 @@ The following properties are supported for the Salesforce linked service.
 | environmentUrl | Specify the URL of the Salesforce Service Cloud instance. <br> - Default is `"https://login.salesforce.com"`. <br> - To copy data from sandbox, specify `"https://test.salesforce.com"`. <br> - To copy data from custom domain, specify, for example, `"https://[domain].my.salesforce.com"`. |No |
 | username |Specify a user name for the user account. |Yes |
 | password |Specify a password for the user account.<br/><br/>Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). |Yes |
-| securityToken |Specify a security token for the user account. For instructions on how to reset and get a security token, see [Get a security token](https://help.salesforce.com/apex/HTViewHelpDoc?id=user_security_token.htm). To learn about security tokens in general, see [Security and the API](https://developer.salesforce.com/docs/atlas.en-us.api.meta/api/sforce_api_concepts_security.htm).<br/><br/>Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). |Yes |
+| securityToken |Specify a security token for the user account.<br/><br/>To learn about security tokens in general, see [Security and the API](https://developer.salesforce.com/docs/atlas.en-us.api.meta/api/sforce_api_concepts_security.htm). The security token can be skipped only if you add the Integration Runtime's IP to the [trusted IP address list](https://developer.salesforce.com/docs/atlas.en-us.securityImplGuide.meta/securityImplGuide/security_networkaccess.htm) on Salesforce. When using Azure IR, refer to [Azure Integration Runtime IP addresses](azure-integration-runtime-ip-addresses.md).<br/><br/>For instructions on how to get and reset a security token, see [Get a security token](https://help.salesforce.com/apex/HTViewHelpDoc?id=user_security_token.htm). Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). |No |
+| apiVersion | Specify the Salesforce REST/Bulk API version to use, for example `48.0`. By default, the connector uses [v45](https://developer.salesforce.com/docs/atlas.en-us.218.0.api_rest.meta/api_rest/dome_versions.htm) to copy data from Salesforce and [v40](https://developer.salesforce.com/docs/atlas.en-us.208.0.api_asynch.meta/api_asynch/asynch_api_intro.htm) to copy data to Salesforce. | No |
 | connectVia | The [integration runtime](concepts-integration-runtime.md) to be used to connect to the data store. If not specified, it uses the default Azure Integration Runtime. | No for source, Yes for sink if the source linked service doesn't have integration runtime |
 
 >[!IMPORTANT]
articles/data-factory/connector-salesforce.md

Lines changed: 4 additions & 4 deletions
@@ -10,7 +10,7 @@ ms.service: data-factory
 ms.workload: data-services
 ms.topic: conceptual
 ms.custom: seo-lt-2019
-ms.date: 08/01/2019
+ms.date: 03/24/2020
 ---
 
 # Copy data from and to Salesforce by using Azure Data Factory
@@ -28,15 +28,14 @@ This Salesforce connector is supported for the following activities:
 - [Copy activity](copy-activity-overview.md) with [supported source/sink matrix](copy-activity-overview.md)
 - [Lookup activity](control-flow-lookup-activity.md)
 
-
 You can copy data from Salesforce to any supported sink data store. You also can copy data from any supported source data store to Salesforce. For a list of data stores that are supported as sources or sinks by the Copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
 
 Specifically, this Salesforce connector supports:
 
 - Salesforce Developer, Professional, Enterprise, or Unlimited editions.
 - Copying data from and to Salesforce production, sandbox, and custom domain.
 
-The Salesforce connector is built on top of the Salesforce REST/Bulk API, with [v45](https://developer.salesforce.com/docs/atlas.en-us.218.0.api_rest.meta/api_rest/dome_versions.htm) for copy data from and [v40](https://developer.salesforce.com/docs/atlas.en-us.208.0.api_asynch.meta/api_asynch/asynch_api_intro.htm) for copy data to.
+The Salesforce connector is built on top of the Salesforce REST/Bulk API. By default, the connector uses [v45](https://developer.salesforce.com/docs/atlas.en-us.218.0.api_rest.meta/api_rest/dome_versions.htm) to copy data from Salesforce and [v40](https://developer.salesforce.com/docs/atlas.en-us.208.0.api_asynch.meta/api_asynch/asynch_api_intro.htm) to copy data to Salesforce. You can also explicitly set the API version used to read or write data via the [`apiVersion` property](#linked-service-properties) in the linked service.
 
 ## Prerequisites
 
@@ -67,7 +66,8 @@ The following properties are supported for the Salesforce linked service.
 | environmentUrl | Specify the URL of the Salesforce instance. <br> - Default is `"https://login.salesforce.com"`. <br> - To copy data from sandbox, specify `"https://test.salesforce.com"`. <br> - To copy data from custom domain, specify, for example, `"https://[domain].my.salesforce.com"`. |No |
 | username |Specify a user name for the user account. |Yes |
 | password |Specify a password for the user account.<br/><br/>Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). |Yes |
-| securityToken |Specify a security token for the user account. For instructions on how to reset and get a security token, see [Get a security token](https://help.salesforce.com/apex/HTViewHelpDoc?id=user_security_token.htm). To learn about security tokens in general, see [Security and the API](https://developer.salesforce.com/docs/atlas.en-us.api.meta/api/sforce_api_concepts_security.htm).<br/><br/>Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). |Yes |
+| securityToken |Specify a security token for the user account.<br/><br/>To learn about security tokens in general, see [Security and the API](https://developer.salesforce.com/docs/atlas.en-us.api.meta/api/sforce_api_concepts_security.htm). The security token can be skipped only if you add the Integration Runtime's IP to the [trusted IP address list](https://developer.salesforce.com/docs/atlas.en-us.securityImplGuide.meta/securityImplGuide/security_networkaccess.htm) on Salesforce. When using Azure IR, refer to [Azure Integration Runtime IP addresses](azure-integration-runtime-ip-addresses.md).<br/><br/>For instructions on how to get and reset a security token, see [Get a security token](https://help.salesforce.com/apex/HTViewHelpDoc?id=user_security_token.htm). Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). |No |
+| apiVersion | Specify the Salesforce REST/Bulk API version to use, for example `48.0`. By default, the connector uses [v45](https://developer.salesforce.com/docs/atlas.en-us.218.0.api_rest.meta/api_rest/dome_versions.htm) to copy data from Salesforce and [v40](https://developer.salesforce.com/docs/atlas.en-us.208.0.api_asynch.meta/api_asynch/asynch_api_intro.htm) to copy data to Salesforce. | No |
 | connectVia | The [integration runtime](concepts-integration-runtime.md) to be used to connect to the data store. If not specified, it uses the default Azure Integration Runtime. | No for source, Yes for sink if the source linked service doesn't have integration runtime |
 
 >[!IMPORTANT]

articles/data-factory/copy-activity-preserve-metadata.md

Lines changed: 9 additions & 9 deletions
@@ -10,7 +10,7 @@ ms.reviewer: douglasl
 ms.service: data-factory
 ms.workload: data-services
 ms.topic: conceptual
-ms.date: 12/12/2019
+ms.date: 03/24/2020
 ms.author: jingwang
 
 ---
@@ -36,7 +36,7 @@ Here's an example of copy activity JSON configuration (see `preserve`):
 ```json
 "activities":[
     {
-        "name": "CopyFromGen1ToGen2",
+        "name": "CopyAndPreserveMetadata",
         "type": "Copy",
         "typeProperties": {
             "source": {
@@ -72,9 +72,9 @@ Here's an example of copy activity JSON configuration (see `preserve`):
 ]
 ```
 
-## <a name="preserve-acls"></a> Preserve ACLs from Data Lake Storage Gen1 to Gen2
+## <a name="preserve-acls"></a> Preserve ACLs from Data Lake Storage Gen1/Gen2 to Gen2
 
-When you upgrade from Azure Data Lake Storage Gen1 to Gen2, you can choose to preserve the POSIX access control lists (ACLs) along with data files. For more information on access control, see [Access control in Azure Data Lake Storage Gen1](../data-lake-store/data-lake-store-access-control.md) and [Access control in Azure Data Lake Storage Gen2](../storage/blobs/data-lake-storage-access-control.md).
+When you upgrade from Azure Data Lake Storage Gen1 to Gen2 or copy data between ADLS Gen2 accounts, you can choose to preserve the POSIX access control lists (ACLs) along with data files. For more information on access control, see [Access control in Azure Data Lake Storage Gen1](../data-lake-store/data-lake-store-access-control.md) and [Access control in Azure Data Lake Storage Gen2](../storage/blobs/data-lake-storage-access-control.md).
 
 Copy activity supports preserving the following types of ACLs during data copy. You can select one or more types:
 
@@ -85,21 +85,21 @@ Copy activity supports preserving the following types of ACLs during data copy.
 If you specify to copy from a folder, Data Factory replicates the ACLs for that given folder and the files and directories under it, if `recursive` is set to true. If you specify to copy from a single file, the ACLs on that file are copied.
 
 >[!NOTE]
->When you use ADF to preserve ACLs from Data Lake Storage Gen1 to Gen2, the existing ACLs on Gen2's corresponding folder/files will be overwritten.
+>When you use ADF to preserve ACLs from Data Lake Storage Gen1/Gen2 to Gen2, the existing ACLs on the sink Gen2's corresponding folder/files will be overwritten.
 
 >[!IMPORTANT]
 >When you choose to preserve ACLs, make sure you grant high enough permissions for Data Factory to operate against your sink Data Lake Storage Gen2 account. For example, use account key authentication or assign the Storage Blob Data Owner role to the service principal or managed identity.
 
-When you configure source as Data Lake Storage Gen1 with binary format or the binary copy option, and sink as Data Lake Storage Gen2 with binary format or the binary copy option, you can find the **Preserve** option on the **Settings** page in Copy Data Tool or on the **Copy Activity** > **Settings** tab for activity authoring.
+When you configure the source as Data Lake Storage Gen1/Gen2 with binary format or the binary copy option, and the sink as Data Lake Storage Gen2 with binary format or the binary copy option, you can find the **Preserve** option on the **Settings** page in Copy Data Tool or on the **Copy Activity** > **Settings** tab for activity authoring.
 
-![Data Lake Storage Gen1 to Gen2 Preserve ACL](./media/connector-azure-data-lake-storage/adls-gen2-preserve-acl.png)
+![Data Lake Storage Gen1/Gen2 to Gen2 Preserve ACL](./media/connector-azure-data-lake-storage/adls-gen2-preserve-acl.png)
 
 Here's an example of copy activity JSON configuration (see `preserve`):
 
 ```json
 "activities":[
     {
-        "name": "CopyFromGen1ToGen2",
+        "name": "CopyAndPreserveACLs",
         "type": "Copy",
         "typeProperties": {
             "source": {
@@ -123,7 +123,7 @@ Here's an example of copy activity JSON configuration (see `preserve`):
         },
         "inputs": [
             {
-                "referenceName": "<Binary dataset name for Azure Data Lake Storage Gen1 source>",
+                "referenceName": "<Binary dataset name for Azure Data Lake Storage Gen1/Gen2 source>",
                 "type": "DatasetReference"
            }
         ],
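The renamed `CopyAndPreserveMetadata` example is only partially visible in this diff. A condensed sketch of what such an activity could look like end to end; the source/sink types and dataset names are illustrative, and `Attributes` is the `preserve` value this article uses for file metadata:

```json
{
    "name": "CopyAndPreserveMetadata",
    "type": "Copy",
    "inputs": [ { "referenceName": "S3SourceDataset", "type": "DatasetReference" } ],
    "outputs": [ { "referenceName": "Gen2SinkDataset", "type": "DatasetReference" } ],
    "typeProperties": {
        "source": {
            "type": "BinarySource",
            "storeSettings": { "type": "AmazonS3ReadSettings", "recursive": true }
        },
        "sink": {
            "type": "BinarySink",
            "storeSettings": { "type": "AzureBlobFSWriteSettings" }
        },
        "preserve": [ "Attributes" ]
    }
}
```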
