articles/data-factory/connector-salesforce-legacy.md (2 additions, 2 deletions)

@@ -8,7 +8,7 @@ ms.service: data-factory
 ms.subservice: data-movement
 ms.topic: conceptual
 ms.custom: synapse
-ms.date: 01/08/2024
+ms.date: 01/26/2024
 ---

 # Copy data from and to Salesforce using Azure Data Factory or Azure Synapse Analytics (legacy)
@@ -19,7 +19,7 @@ ms.date: 01/08/2024
 This article outlines how to use Copy Activity in Azure Data Factory and Azure Synapse pipelines to copy data from and to Salesforce. It builds on the [Copy Activity overview](copy-activity-overview.md) article that presents a general overview of the copy activity.

 >[!IMPORTANT]
->The service has released a new Salesforce connector which provides better native Salesforce support comparing to this ODBC-based implementation, refer to [Salesforce connector](connector-salesforce.md) article on details. This legacy Salesforce connector is kept supported as-is for backward compatibility, while for any new workload, please use the new connector.
+>The service has released a new Salesforce connector that provides better native Salesforce support. For details, see the [Salesforce connector](connector-salesforce.md) article. This legacy Salesforce connector remains supported as-is for backward compatibility; use the new connector for any new workload.
articles/data-factory/connector-salesforce-service-cloud-legacy.md (2 additions, 2 deletions)

@@ -8,7 +8,7 @@ ms.service: data-factory
 ms.subservice: data-movement
 ms.topic: conceptual
 ms.custom: synapse
-ms.date: 01/15/2024
+ms.date: 01/26/2024
 ---

 # Copy data from and to Salesforce Service Cloud using Azure Data Factory or Synapse Analytics (legacy)
@@ -18,7 +18,7 @@ ms.date: 01/15/2024
 This article outlines how to use Copy Activity in Azure Data Factory and Synapse Analytics pipelines to copy data from and to Salesforce Service Cloud. It builds on the [Copy Activity overview](copy-activity-overview.md) article that presents a general overview of the copy activity.

 >[!IMPORTANT]
->The service has released a new Salesforce Service Cloud connector which provides better native Salesforce Service Cloud support comparing to this ODBC-based implementation, refer to [Salesforce Service Cloud connector](connector-salesforce-service-cloud.md) article on details. This legacy Salesforce Service Cloud connector is kept supported as-is for backward compatibility, while for any new workload, please use the new connector.
+>The service has released a new Salesforce Service Cloud connector that provides better native Salesforce Service Cloud support. For details, see the [Salesforce Service Cloud connector](connector-salesforce-service-cloud.md) article. This legacy Salesforce Service Cloud connector remains supported as-is for backward compatibility; use the new connector for any new workload.
articles/data-factory/connector-salesforce-service-cloud.md (10 additions, 8 deletions)

@@ -8,7 +8,7 @@ ms.service: data-factory
 ms.subservice: data-movement
 ms.topic: conceptual
 ms.custom: synapse
-ms.date: 01/15/2024
+ms.date: 01/26/2024
 ---

 # Copy data from and to Salesforce Service Cloud using Azure Data Factory or Azure Synapse Analytics
@@ -37,7 +37,7 @@ For a list of data stores that are supported as sources or sinks, see the [Suppo
 Specifically, this Salesforce Service Cloud connector supports:

 - Salesforce Developer, Professional, Enterprise, or Unlimited editions.
-- Copying data from and to custom domain.
+- Copying data from and to a custom domain. (A custom domain can be configured in both production and sandbox environments.)

 You can explicitly set the API version used to read/write data via [`apiVersion` property](#linked-service-properties) in linked service. When copying data to Salesforce Service Cloud, the connector uses BULK API 2.0.
@@ -51,14 +51,13 @@ You can explicitly set the API version used to read/write data via [`apiVersion`
 > - The execution user must have the API Only permission.
 > - Access Token expire time could be changed through session policies instead of the refresh token.

-## Salesforce request limits
+## Salesforce Bulk API 2.0 limits

-Salesforce has limits for both total API requests and concurrent API requests. Note the following points:
+The connector uses Salesforce Bulk API 2.0 to query and ingest data. In Bulk API 2.0, batches are created for you automatically. You can submit up to **15,000** batches per rolling 24-hour period. If batches exceed the limit, you see failures.

-- If the number of concurrent requests exceeds the limit, throttling occurs and you see random failures.
-- If the total number of requests exceeds the limit, the Salesforce Service Cloud account is blocked for 24 hours.
+In Bulk API 2.0, only ingest jobs consume batches. Query jobs don't. For details, see [How Requests Are Processed in the Bulk API 2.0 Developer Guide](https://developer.salesforce.com/docs/atlas.en-us.api_asynch.meta/api_asynch/how_requests_are_processed.htm).

-You might also receive the "REQUEST_LIMIT_EXCEEDED" error message in both scenarios. For more information, see the "API request limits" section in [Salesforce developer limits](https://developer.salesforce.com/docs/atlas.en-us.218.0.salesforce_app_limits_cheatsheet.meta/salesforce_app_limits_cheatsheet/salesforce_app_limits_platform_api.htm).
+For more information, see the "General Limits" section in [Salesforce developer limits](https://developer.salesforce.com/docs/atlas.en-us.salesforce_app_limits_cheatsheet.meta/salesforce_app_limits_cheatsheet/salesforce_app_limits_platform_bulkapi.htm).
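The 15,000-batch rolling allowance above can be sanity-checked before a large ingest with a quick estimate. This is a planning sketch only: the 10,000-records-per-batch figure is an assumption about Salesforce's internal batching for Bulk API 2.0 ingest jobs, and the helper names are illustrative, not part of the connector.

```python
import math

# Assumptions (hedged): Bulk API 2.0 ingest allowance per rolling 24 hours,
# and an assumed internal batch size; Salesforce batches data automatically,
# so treat the result as a rough upper-bound estimate.
BATCH_LIMIT_PER_24H = 15_000
RECORDS_PER_BATCH = 10_000

def estimated_batches(total_records: int) -> int:
    """Estimate how many Bulk API 2.0 batches an ingest job consumes."""
    return math.ceil(total_records / RECORDS_PER_BATCH)

def fits_in_daily_limit(total_records: int, batches_already_used: int = 0) -> bool:
    """Check whether an ingest fits in the remaining rolling 24-hour allowance."""
    return batches_already_used + estimated_batches(total_records) <= BATCH_LIMIT_PER_24H

print(estimated_batches(2_500_000))      # 250 batches
print(fits_in_daily_limit(200_000_000))  # False: ~20,000 batches exceed 15,000
```

Remember that only ingest jobs count against this allowance; query jobs don't consume batches.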

 ## Get started

@@ -98,9 +97,10 @@ The following properties are supported for the Salesforce Service Cloud linked s
 |:--- |:--- |:--- |
 | type |The type property must be set to **SalesforceServiceCloudV2**. |Yes |
 | environmentUrl | Specify the URL of the Salesforce Service Cloud instance. <br>For example, specify `"https://<domainName>.my.salesforce.com"` to copy data from the custom domain. Learn how to configure or view your custom domain referring to this [article](https://help.salesforce.com/s/articleView?id=sf.domain_name_setting_login_policy.htm&type=5). |Yes |
+| authenticationType | Type of authentication used to connect to Salesforce Service Cloud. <br/>The allowed value is **OAuth2ClientCredentials**. | Yes |
 | clientId |Specify the client ID of the Salesforce OAuth 2.0 Connected App. For more information, go to this [article](https://help.salesforce.com/s/articleView?id=sf.connected_app_client_credentials_setup.htm&type=5)|Yes |
 | clientSecret |Specify the client secret of the Salesforce OAuth 2.0 Connected App. For more information, go to this [article](https://help.salesforce.com/s/articleView?id=sf.connected_app_client_credentials_setup.htm&type=5)|Yes |
-| apiVersion | Specify the Salesforce Bulk API 2.0 version to use, e.g. `52.0`. The Bulk API 2.0 only support API version >= 47.0. To learn about Bulk API 2.0 version, see [article](https://developer.salesforce.com/docs/atlas.en-us.api_asynch.meta/api_asynch/bulk_common_diff_two_versions.htm). If you use a lower API version, it will result in a failure. | Yes |
+| apiVersion | Specify the Salesforce Bulk API 2.0 version to use, for example `52.0`. Bulk API 2.0 only supports API versions >= 47.0. To learn about Bulk API 2.0 versions, see this [article](https://developer.salesforce.com/docs/atlas.en-us.api_asynch.meta/api_asynch/bulk_common_diff_two_versions.htm). Using a lower API version results in a failure. | Yes |
 | connectVia | The [integration runtime](concepts-integration-runtime.md) to be used to connect to the data store. If not specified, it uses the default Azure Integration Runtime. | No |

**Example: Store credentials**
@@ -112,6 +112,7 @@ The following properties are supported for the Salesforce Service Cloud linked s
     "type": "SalesforceServiceCloudV2",
     "typeProperties": {
         "environmentUrl": "<environment URL>",
+        "authenticationType": "OAuth2ClientCredentials",
         "clientId": "<client ID>",
         "clientSecret": {
             "type": "SecureString",
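Assembled from the property table and the fragment above, a complete linked-service definition might look like the following sketch. The linked-service name and all angle-bracket values are placeholders I've added for illustration; they aren't part of the article's diff.

```json
{
    "name": "SalesforceServiceCloudLinkedService",
    "properties": {
        "type": "SalesforceServiceCloudV2",
        "typeProperties": {
            "environmentUrl": "https://<domainName>.my.salesforce.com",
            "authenticationType": "OAuth2ClientCredentials",
            "clientId": "<client ID>",
            "clientSecret": {
                "type": "SecureString",
                "value": "<client secret>"
            },
            "apiVersion": "52.0"
        },
        "connectVia": {
            "referenceName": "<name of integration runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```

`connectVia` is optional; omit it to use the default Azure Integration Runtime.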
@@ -136,6 +137,7 @@ The following properties are supported for the Salesforce Service Cloud linked s
articles/data-factory/connector-salesforce.md (11 additions, 8 deletions)

@@ -8,7 +8,7 @@ ms.service: data-factory
 ms.subservice: data-movement
 ms.topic: conceptual
 ms.custom: synapse
-ms.date: 01/08/2024
+ms.date: 01/26/2024
 ---

 # Copy data from and to Salesforce using Azure Data Factory or Azure Synapse Analytics
@@ -37,7 +37,7 @@ For a list of data stores that are supported as sources or sinks, see the [Suppo
 Specifically, this Salesforce connector supports:

 - Salesforce Developer, Professional, Enterprise, or Unlimited editions.
-- Copying data from and to custom domain.
+- Copying data from and to a custom domain. (A custom domain can be configured in both production and sandbox environments.)

 You can explicitly set the API version used to read/write data via [`apiVersion` property](#linked-service-properties) in linked service. When copying data to Salesforce, the connector uses BULK API 2.0.
@@ -51,14 +51,13 @@ You can explicitly set the API version used to read/write data via [`apiVersion`
 > - The execution user must have the API Only permission.
 > - Access Token expire time could be changed through session policies instead of the refresh token.

-## Salesforce request limits
+## Salesforce Bulk API 2.0 limits

-Salesforce has limits for both total API requests and concurrent API requests. Note the following points:
+The connector uses Salesforce Bulk API 2.0 to query and ingest data. In Bulk API 2.0, batches are created for you automatically. You can submit up to **15,000** batches per rolling 24-hour period. If batches exceed the limit, you see failures.

-- If the number of concurrent requests exceeds the limit, throttling occurs and you see random failures.
-- If the total number of requests exceeds the limit, the Salesforce account is blocked for 24 hours.
+In Bulk API 2.0, only ingest jobs consume batches. Query jobs don't. For details, see [How Requests Are Processed in the Bulk API 2.0 Developer Guide](https://developer.salesforce.com/docs/atlas.en-us.api_asynch.meta/api_asynch/how_requests_are_processed.htm).

-You might also receive the "REQUEST_LIMIT_EXCEEDED" error message in both scenarios. For more information, see the "API request limits" section in [Salesforce developer limits](https://developer.salesforce.com/docs/atlas.en-us.218.0.salesforce_app_limits_cheatsheet.meta/salesforce_app_limits_cheatsheet/salesforce_app_limits_platform_api.htm).
+For more information, see the "General Limits" section in [Salesforce developer limits](https://developer.salesforce.com/docs/atlas.en-us.salesforce_app_limits_cheatsheet.meta/salesforce_app_limits_cheatsheet/salesforce_app_limits_platform_bulkapi.htm).

 ## Get started

@@ -98,9 +97,10 @@ The following properties are supported for the Salesforce linked service.
 |:--- |:--- |:--- |
 | type |The type property must be set to **SalesforceV2**. |Yes |
 | environmentUrl | Specify the URL of the Salesforce instance. <br>For example, specify `"https://<domainName>.my.salesforce.com"` to copy data from the custom domain. Learn how to configure or view your custom domain referring to this [article](https://help.salesforce.com/s/articleView?id=sf.domain_name_setting_login_policy.htm&type=5). |Yes |
+| authenticationType | Type of authentication used to connect to Salesforce. <br/>The allowed value is **OAuth2ClientCredentials**. | Yes |
 | clientId |Specify the client ID of the Salesforce OAuth 2.0 Connected App. For more information, go to this [article](https://help.salesforce.com/s/articleView?id=sf.connected_app_client_credentials_setup.htm&type=5)|Yes |
 | clientSecret |Specify the client secret of the Salesforce OAuth 2.0 Connected App. For more information, go to this [article](https://help.salesforce.com/s/articleView?id=sf.connected_app_client_credentials_setup.htm&type=5)|Yes |
-| apiVersion | Specify the Salesforce Bulk API 2.0 version to use, e.g. `52.0`. The Bulk API 2.0 only support API version >= 47.0. To learn about Bulk API 2.0 version, see [article](https://developer.salesforce.com/docs/atlas.en-us.api_asynch.meta/api_asynch/bulk_common_diff_two_versions.htm). If you use a lower API version, it will result in a failure. | Yes |
+| apiVersion | Specify the Salesforce Bulk API 2.0 version to use, for example `52.0`. Bulk API 2.0 only supports API versions >= 47.0. To learn about Bulk API 2.0 versions, see this [article](https://developer.salesforce.com/docs/atlas.en-us.api_asynch.meta/api_asynch/bulk_common_diff_two_versions.htm). Using a lower API version results in a failure. | Yes |
 | connectVia | The [integration runtime](concepts-integration-runtime.md) to be used to connect to the data store. If not specified, it uses the default Azure Integration Runtime. | No |

**Example: Store credentials**
@@ -112,6 +112,7 @@ The following properties are supported for the Salesforce linked service.
     "type": "SalesforceV2",
     "typeProperties": {
         "environmentUrl": "<environment URL>",
+        "authenticationType": "OAuth2ClientCredentials",
         "clientId": "<client ID>",
         "clientSecret": {
             "type": "SecureString",
@@ -136,6 +137,7 @@ The following properties are supported for the Salesforce linked service.
     "type": "SalesforceV2",
     "typeProperties": {
         "environmentUrl": "<environment URL>",
+        "authenticationType": "OAuth2ClientCredentials",
         "clientId": "<client ID>",
         "clientSecret": {
             "type": "AzureKeyVaultSecret",
@@ -173,6 +175,7 @@ Note that by doing so, you will no longer be able to use the UI to edit settings
         "type": "LinkedServiceReference"
     },
     },
+    "authenticationType": "OAuth2ClientCredentials",
     "clientId": {
         "type": "AzureKeyVaultSecret",
         "secretName": "<secret name of client ID in AKV>",
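For the Azure Key Vault pattern in the fragment above, a fully assembled linked-service sketch might look like the following. The AKV linked-service reference name and secret names are placeholders I've assumed; the overall `AzureKeyVaultSecret` shape (`store` plus `secretName`) follows the standard Data Factory convention rather than being quoted from the article.

```json
{
    "name": "SalesforceLinkedService",
    "properties": {
        "type": "SalesforceV2",
        "typeProperties": {
            "environmentUrl": "<environment URL>",
            "authenticationType": "OAuth2ClientCredentials",
            "clientId": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "<AKV linked service name>",
                    "type": "LinkedServiceReference"
                },
                "secretName": "<secret name of client ID in AKV>"
            },
            "clientSecret": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "<AKV linked service name>",
                    "type": "LinkedServiceReference"
                },
                "secretName": "<secret name of client secret in AKV>"
            },
            "apiVersion": "52.0"
        }
    }
}
```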