articles/data-factory/connector-azure-data-lake-storage.md (5 additions, 5 deletions)

@@ -10,7 +10,7 @@ ms.service: data-factory
ms.workload: data-services
ms.topic: conceptual
ms.custom: seo-lt-2019
-ms.date: 02/17/2020
+ms.date: 03/24/2020
---

# Copy and transform data in Azure Data Lake Storage Gen2 using Azure Data Factory
@@ -34,7 +34,7 @@ For Copy activity, with this connector you can:
- Copy data from/to Azure Data Lake Storage Gen2 by using account key, service principal, or managed identity authentication.
- Copy files as-is, or parse or generate files with [supported file formats and compression codecs](supported-file-formats-and-compression-codecs.md).
- [Preserve file metadata during copy](#preserve-metadata-during-copy).
-- [Preserve ACLs](#preserve-metadata-during-copy) when copying from Azure Data Lake Storage Gen1.
+- [Preserve ACLs](#preserve-acls) when copying from Azure Data Lake Storage Gen1/Gen2.
>[!IMPORTANT]
>If you enable the **Allow trusted Microsoft services to access this storage account** option on Azure Storage firewall settings and want to use Azure integration runtime to connect to your Data Lake Storage Gen2, you must use [managed identity authentication](#managed-identity) for ADLS Gen2.
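
By way of illustration, a minimal linked-service sketch using managed identity might look like the following. The `AzureBlobFS` type and `url` property follow the ADLS Gen2 connector reference; the placeholder values are assumptions, and omitting explicit credential properties is what makes the connector fall back to the factory's managed identity:

```json
{
    "name": "AzureDataLakeStorageGen2LinkedService",
    "properties": {
        "type": "AzureBlobFS",
        "typeProperties": {
            "url": "https://<accountname>.dfs.core.windows.net"
        },
        "connectVia": {
            "referenceName": "<name of integration runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```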
@@ -375,13 +375,13 @@ This section describes the resulting behavior of the copy operation for differen
When you copy files from Amazon S3/Azure Blob/Azure Data Lake Storage Gen2 to Azure Data Lake Storage Gen2/Azure Blob, you can choose to preserve the file metadata along with data. Learn more from [Preserve metadata](copy-activity-preserve-metadata.md#preserve-metadata).
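
As a sketch of what that option looks like inside a copy activity's type properties — the `"Attributes"` value follows the linked preserve-metadata article, while the binary source/sink types here are illustrative placeholders:

```json
"typeProperties": {
    "source": {
        "type": "BinarySource"
    },
    "sink": {
        "type": "BinarySink"
    },
    "preserve": [
        "Attributes"
    ]
}
```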

-## Preserve ACLs from Data Lake Storage Gen1
+## <a name="preserve-acls"></a> Preserve ACLs from Data Lake Storage Gen1/Gen2
+
+When you copy files from Azure Data Lake Storage Gen1/Gen2 to Gen2, you can choose to preserve the POSIX access control lists (ACLs) along with data. Learn more from [Preserve ACLs from Data Lake Storage Gen1/Gen2 to Gen2](copy-activity-preserve-metadata.md#preserve-acls).

>[!TIP]
>To copy data from Azure Data Lake Storage Gen1 into Gen2 in general, see [Copy data from Azure Data Lake Storage Gen1 to Gen2 with Azure Data Factory](load-azure-data-lake-storage-gen2-from-gen1.md) for a walk-through and best practices.

-When you copy files from Azure Data Lake Storage Gen1 to Gen2, you can choose to preserve the POSIX access control lists (ACLs) along with data. Learn more from [Preserve ACLs from Data Lake Storage Gen1 to Gen2](copy-activity-preserve-metadata.md#preserve-acls).
-
## Mapping data flow properties
When transforming data in mapping data flow, you can read and write files from Azure Data Lake Storage Gen2 in JSON, Avro, Delimited Text, or Parquet format. For more information, see [source transformation](data-flow-source.md) and [sink transformation](data-flow-sink.md) in the mapping data flow feature.
articles/data-factory/connector-salesforce-service-cloud.md (4 additions, 3 deletions)

@@ -10,7 +10,7 @@ ms.service: data-factory
ms.workload: data-services
ms.topic: conceptual
ms.custom: seo-lt-2019
-ms.date: 08/06/2019
+ms.date: 03/24/2020
---

# Copy data from and to Salesforce Service Cloud by using Azure Data Factory
@@ -31,7 +31,7 @@ Specifically, this Salesforce Service Cloud connector supports:
- Salesforce Developer, Professional, Enterprise, or Unlimited editions.
- Copying data from and to Salesforce production, sandbox, and custom domain.

-The Salesforce Service Cloud connector is built on top of the Salesforce REST/Bulk API, with [v45](https://developer.salesforce.com/docs/atlas.en-us.218.0.api_rest.meta/api_rest/dome_versions.htm) for copy data from and [v40](https://developer.salesforce.com/docs/atlas.en-us.208.0.api_asynch.meta/api_asynch/asynch_api_intro.htm) for copy data to.
+The Salesforce Service Cloud connector is built on top of the Salesforce REST/Bulk API. By default, the connector uses [v45](https://developer.salesforce.com/docs/atlas.en-us.218.0.api_rest.meta/api_rest/dome_versions.htm) to copy data from Salesforce, and uses [v40](https://developer.salesforce.com/docs/atlas.en-us.208.0.api_asynch.meta/api_asynch/asynch_api_intro.htm) to copy data to Salesforce. You can also explicitly set the API version used to read/write data via the [`apiVersion` property](#linked-service-properties) in the linked service.
## Prerequisites
@@ -62,7 +62,8 @@ The following properties are supported for the Salesforce linked service.
| environmentUrl | Specify the URL of the Salesforce Service Cloud instance. <br> - Default is `"https://login.salesforce.com"`. <br> - To copy data from sandbox, specify `"https://test.salesforce.com"`. <br> - To copy data from custom domain, specify, for example, `"https://[domain].my.salesforce.com"`. |No |
| username |Specify a user name for the user account. |Yes |
| password |Specify a password for the user account.<br/><br/>Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). |Yes |
-| securityToken |Specify a security token for the user account. For instructions on how to reset and get a security token, see [Get a security token](https://help.salesforce.com/apex/HTViewHelpDoc?id=user_security_token.htm). To learn about security tokens in general, see [Security and the API](https://developer.salesforce.com/docs/atlas.en-us.api.meta/api/sforce_api_concepts_security.htm).<br/><br/>Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). |Yes |
+| securityToken |Specify a security token for the user account. <br/><br/>To learn about security tokens in general, see [Security and the API](https://developer.salesforce.com/docs/atlas.en-us.api.meta/api/sforce_api_concepts_security.htm). The security token can be skipped only if you add the Integration Runtime's IP to the [trusted IP address list](https://developer.salesforce.com/docs/atlas.en-us.securityImplGuide.meta/securityImplGuide/security_networkaccess.htm) on Salesforce. When using Azure IR, refer to [Azure Integration Runtime IP addresses](azure-integration-runtime-ip-addresses.md).<br/><br/>For instructions on how to get and reset a security token, see [Get a security token](https://help.salesforce.com/apex/HTViewHelpDoc?id=user_security_token.htm). Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). |No |
+| apiVersion | Specify the Salesforce REST/Bulk API version to use, e.g. `48.0`. By default, the connector uses [v45](https://developer.salesforce.com/docs/atlas.en-us.218.0.api_rest.meta/api_rest/dome_versions.htm) to copy data from Salesforce, and uses [v40](https://developer.salesforce.com/docs/atlas.en-us.208.0.api_asynch.meta/api_asynch/asynch_api_intro.htm) to copy data to Salesforce. | No |
| connectVia | The [integration runtime](concepts-integration-runtime.md) to be used to connect to the data store. If not specified, it uses the default Azure Integration Runtime. | No for source, Yes for sink if the source linked service doesn't have integration runtime |
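
Putting those properties together, a linked service definition would plausibly look like this sketch — every value is a placeholder, and `apiVersion` is the optional override described in the table above:

```json
{
    "name": "SalesforceServiceCloudLinkedService",
    "properties": {
        "type": "SalesforceServiceCloud",
        "typeProperties": {
            "environmentUrl": "https://login.salesforce.com",
            "username": "<username>",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            },
            "securityToken": {
                "type": "SecureString",
                "value": "<security token>"
            },
            "apiVersion": "48.0"
        },
        "connectVia": {
            "referenceName": "<name of integration runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```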
articles/data-factory/connector-salesforce.md

You can copy data from Salesforce to any supported sink data store. You also can copy data from any supported source data store to Salesforce. For a list of data stores that are supported as sources or sinks by the Copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.

Specifically, this Salesforce connector supports:
- Salesforce Developer, Professional, Enterprise, or Unlimited editions.
- Copying data from and to Salesforce production, sandbox, and custom domain.

-The Salesforce connector is built on top of the Salesforce REST/Bulk API, with [v45](https://developer.salesforce.com/docs/atlas.en-us.218.0.api_rest.meta/api_rest/dome_versions.htm) for copy data from and [v40](https://developer.salesforce.com/docs/atlas.en-us.208.0.api_asynch.meta/api_asynch/asynch_api_intro.htm) for copy data to.
+The Salesforce connector is built on top of the Salesforce REST/Bulk API. By default, the connector uses [v45](https://developer.salesforce.com/docs/atlas.en-us.218.0.api_rest.meta/api_rest/dome_versions.htm) to copy data from Salesforce, and uses [v40](https://developer.salesforce.com/docs/atlas.en-us.208.0.api_asynch.meta/api_asynch/asynch_api_intro.htm) to copy data to Salesforce. You can also explicitly set the API version used to read/write data via the [`apiVersion` property](#linked-service-properties) in the linked service.
## Prerequisites
@@ -67,7 +66,8 @@ The following properties are supported for the Salesforce linked service.
| environmentUrl | Specify the URL of the Salesforce instance. <br> - Default is `"https://login.salesforce.com"`. <br> - To copy data from sandbox, specify `"https://test.salesforce.com"`. <br> - To copy data from custom domain, specify, for example, `"https://[domain].my.salesforce.com"`. |No |
| username |Specify a user name for the user account. |Yes |
| password |Specify a password for the user account.<br/><br/>Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). |Yes |
-| securityToken |Specify a security token for the user account. For instructions on how to reset and get a security token, see [Get a security token](https://help.salesforce.com/apex/HTViewHelpDoc?id=user_security_token.htm). To learn about security tokens in general, see [Security and the API](https://developer.salesforce.com/docs/atlas.en-us.api.meta/api/sforce_api_concepts_security.htm).<br/><br/>Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). |Yes |
+| securityToken |Specify a security token for the user account. <br/><br/>To learn about security tokens in general, see [Security and the API](https://developer.salesforce.com/docs/atlas.en-us.api.meta/api/sforce_api_concepts_security.htm). The security token can be skipped only if you add the Integration Runtime's IP to the [trusted IP address list](https://developer.salesforce.com/docs/atlas.en-us.securityImplGuide.meta/securityImplGuide/security_networkaccess.htm) on Salesforce. When using Azure IR, refer to [Azure Integration Runtime IP addresses](azure-integration-runtime-ip-addresses.md).<br/><br/>For instructions on how to get and reset a security token, see [Get a security token](https://help.salesforce.com/apex/HTViewHelpDoc?id=user_security_token.htm). Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). |No |
+| apiVersion | Specify the Salesforce REST/Bulk API version to use, e.g. `48.0`. By default, the connector uses [v45](https://developer.salesforce.com/docs/atlas.en-us.218.0.api_rest.meta/api_rest/dome_versions.htm) to copy data from Salesforce, and uses [v40](https://developer.salesforce.com/docs/atlas.en-us.208.0.api_asynch.meta/api_asynch/asynch_api_intro.htm) to copy data to Salesforce. | No |
| connectVia | The [integration runtime](concepts-integration-runtime.md) to be used to connect to the data store. If not specified, it uses the default Azure Integration Runtime. | No for source, Yes for sink if the source linked service doesn't have integration runtime |
articles/data-factory/copy-activity-preserve-metadata.md (9 additions, 9 deletions)

@@ -10,7 +10,7 @@ ms.reviewer: douglasl
ms.service: data-factory
ms.workload: data-services
ms.topic: conceptual
-ms.date: 12/12/2019
+ms.date: 03/24/2020
ms.author: jingwang
---
@@ -36,7 +36,7 @@ Here's an example of copy activity JSON configuration (see `preserve`):
```json
"activities":[
    {
-        "name": "CopyFromGen1ToGen2",
+        "name": "CopyAndPreserveMetadata",
        "type": "Copy",
        "typeProperties": {
            "source": {
@@ -72,9 +72,9 @@ Here's an example of copy activity JSON configuration (see `preserve`):
    ]
```

-## <a name="preserve-acls"></a> Preserve ACLs from Data Lake Storage Gen1 to Gen2
+## <a name="preserve-acls"></a> Preserve ACLs from Data Lake Storage Gen1/Gen2 to Gen2

-When you upgrade from Azure Data Lake Storage Gen1 to Gen2, you can choose to preserve the POSIX access control lists (ACLs) along with data files. For more information on access control, see [Access control in Azure Data Lake Storage Gen1](../data-lake-store/data-lake-store-access-control.md) and [Access control in Azure Data Lake Storage Gen2](../storage/blobs/data-lake-storage-access-control.md).
+When you upgrade from Azure Data Lake Storage Gen1 to Gen2, or copy data between ADLS Gen2 accounts, you can choose to preserve the POSIX access control lists (ACLs) along with data files. For more information on access control, see [Access control in Azure Data Lake Storage Gen1](../data-lake-store/data-lake-store-access-control.md) and [Access control in Azure Data Lake Storage Gen2](../storage/blobs/data-lake-storage-access-control.md).

Copy activity supports preserving the following types of ACLs during data copy. You can select one or more types:
@@ -85,21 +85,21 @@ Copy activity supports preserving the following types of ACLs during data copy.
If you specify to copy from a folder, Data Factory replicates the ACLs for that given folder and the files and directories under it, if `recursive` is set to true. If you specify to copy from a single file, the ACLs on that file are copied.

>[!NOTE]
->When you use ADF to preserve ACLs from Data Lake Storage Gen1 to Gen2, the existing ACLs on Gen2's corresponding folder/files will be overwritten.
+>When you use ADF to preserve ACLs from Data Lake Storage Gen1/Gen2 to Gen2, the existing ACLs on the sink Gen2's corresponding folders/files will be overwritten.

>[!IMPORTANT]
>When you choose to preserve ACLs, make sure you grant high enough permissions for Data Factory to operate against your sink Data Lake Storage Gen2 account. For example, use account key authentication or assign the Storage Blob Data Owner role to the service principal or managed identity.

-When you configure source as Data Lake Storage Gen1 with binary format or the binary copy option, and sink as Data Lake Storage Gen2 with binary format or the binary copy option, you can find the **Preserve** option on the **Settings** page in Copy Data Tool or on the **Copy Activity** > **Settings** tab for activity authoring.
+When you configure the source as Data Lake Storage Gen1/Gen2 with binary format or the binary copy option, and the sink as Data Lake Storage Gen2 with binary format or the binary copy option, you can find the **Preserve** option on the **Settings** page in Copy Data Tool or on the **Copy Activity** > **Settings** tab for activity authoring.

-
+

Here's an example of copy activity JSON configuration (see `preserve`):

```json
"activities":[
    {
-        "name": "CopyFromGen1ToGen2",
+        "name": "CopyAndPreserveACLs",
        "type": "Copy",
        "typeProperties": {
            "source": {
@@ -123,7 +123,7 @@ Here's an example of copy activity JSON configuration (see `preserve`):
            },
            "inputs": [
                {
-                    "referenceName": "<Binary dataset name for Azure Data Lake Storage Gen1 source>",
+                    "referenceName": "<Binary dataset name for Azure Data Lake Storage Gen1/Gen2 source>",