>When you use the Copy Data Tool to author a copy pipeline, or use the ADF UI to test a connection or browse folders during authoring, the service principal or MSI must be granted permission at the root level. Copy activity execution, by contrast, works as long as permission is granted on the data to be copied. You can skip those authoring operations if you need to limit the permission.
The following sections provide details about properties that are used to define Data Factory entities specific to Azure Data Lake Store.
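As an illustration of the kind of entity definition those sections describe, a linked service for Azure Data Lake Store with service principal authentication might look like the following sketch. All placeholder values are illustrative; check the exact property set against the connector reference:

```json
{
    "name": "AzureDataLakeStoreLinkedService",
    "properties": {
        "type": "AzureDataLakeStore",
        "typeProperties": {
            "dataLakeStoreUri": "https://<accountname>.azuredatalakestore.net/webhdfs/v1",
            "servicePrincipalId": "<service principal id>",
            "servicePrincipalKey": {
                "type": "SecureString",
                "value": "<service principal key>"
            },
            "tenant": "<tenant info>"
        }
    }
}
```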
articles/data-factory/connector-salesforce.md (+15 −5)
@@ -12,7 +12,7 @@ ms.workload: data-services
ms.tgt_pltfrm: na
ms.devlang: na
ms.topic: conceptual
-ms.date: 07/18/2018
+ms.date: 08/21/2018
ms.author: jingwang
---
@@ -180,7 +180,7 @@ To copy data from Salesforce, set the source type in the copy activity to **Sale
| Property | Description | Required |
|:--- |:--- |:--- |
| type | The type property of the copy activity source must be set to **SalesforceSource**. | Yes |
-| query |Use the custom query to read data. You can use a SQL-92 query or [Salesforce Object Query Language (SOQL)](https://developer.salesforce.com/docs/atlas.en-us.soql_sosl.meta/soql_sosl/sforce_api_calls_soql.htm) query. An example is `select * from MyTable__c`. | No (if "tableName" in the dataset is specified) |
+| query | Use a custom query to read data. You can use a [Salesforce Object Query Language (SOQL)](https://developer.salesforce.com/docs/atlas.en-us.soql_sosl.meta/soql_sosl/sforce_api_calls_soql.htm) query or a SQL-92 query. For more tips, see the [query tips](#query-tips) section. | No (if "tableName" in the dataset is specified) |
| readBehavior | Indicates whether to query the existing records, or query all records including the deleted ones. If not specified, the default behavior is the former. <br>Allowed values: **query** (default), **queryAll**. | No |
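Putting the properties above together, a copy activity source might look like the following sketch (the dataset references and the Blob sink are hypothetical examples, not required values):

```json
{
    "name": "CopyFromSalesforce",
    "type": "Copy",
    "inputs": [ { "referenceName": "SalesforceDataset", "type": "DatasetReference" } ],
    "outputs": [ { "referenceName": "SinkDataset", "type": "DatasetReference" } ],
    "typeProperties": {
        "source": {
            "type": "SalesforceSource",
            "query": "SELECT Id, Name FROM MyTable__c",
            "readBehavior": "query"
        },
        "sink": { "type": "BlobSink" }
    }
}
```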
> [!IMPORTANT]
@@ -278,10 +278,20 @@ You can retrieve data from Salesforce reports by specifying a query as `{call "<
### Retrieve deleted records from the Salesforce Recycle Bin
-To query the soft deleted records from the Salesforce Recycle Bin, you can specify **"IsDeleted = 1"** in your query. For example:
+To query the soft deleted records from the Salesforce Recycle Bin, you can specify `readBehavior` as `queryAll`.
-* To query only the deleted records, specify "select * from MyTable__c **where IsDeleted= 1**."
-* To query all the records, including the existing and the deleted, specify "select * from MyTable__c **where IsDeleted = 0 or IsDeleted = 1**."
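Under the updated guidance, the same intent combines `readBehavior` set to `queryAll` with an `IsDeleted` filter written in SOQL syntax. A sketch (the object name `MyTable__c` is a hypothetical custom object):

```json
"source": {
    "type": "SalesforceSource",
    "query": "SELECT Id, Name FROM MyTable__c WHERE IsDeleted = True",
    "readBehavior": "queryAll"
}
```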
+### Difference between SOQL and SQL query syntax
+When copying data from Salesforce, you can use either a SOQL query or a SQL query. Note that the two have different syntax and functionality support; don't mix them. We suggest using a SOQL query, which is natively supported by Salesforce. The following table lists the main differences:
+| Syntax | SOQL Mode | SQL Mode |
+|:--- |:--- |:--- |
+| Column selection | You need to enumerate the fields to be copied in the query, e.g. `SELECT field1, field2 FROM objectname`. | `SELECT *` is supported in addition to column selection. |
+| Quotation marks | Field/object names cannot be quoted. | Field/object names can be quoted, e.g. `SELECT "id" FROM "Account"`. |
+| Datetime format | Refer to details [here](https://developer.salesforce.com/docs/atlas.en-us.soql_sosl.meta/soql_sosl/sforce_api_calls_soql_select_dateformats.htm) and samples in the next section. | Refer to details [here](https://docs.microsoft.com/sql/odbc/reference/develop-app/date-time-and-timestamp-literals?view=sql-server-2017) and samples in the next section. |
+| Boolean values | Represented as `False` and `True`, e.g. `SELECT … WHERE IsDeleted=True`. | Represented as 0 or 1, e.g. `SELECT … WHERE IsDeleted=1`. |
+| Column renaming | Not supported. | Supported, e.g. `SELECT a AS b FROM …`. |
+| Relationship | Supported, e.g. `Account_vod__r.nvs_Country__c`. | Not supported. |
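To make the table concrete, here are two query strings expressing the same filter in each mode, shown side by side. The wrapper object and its key names are labels for this example only, not ADF properties, and `MyTable__c` is a hypothetical custom object:

```json
{
    "soqlModeQuery": "SELECT Id, Name FROM MyTable__c WHERE IsDeleted = True",
    "sqlModeQuery": "SELECT * FROM \"MyTable__c\" WHERE IsDeleted = 1"
}
```

Either string would be supplied as the `query` value of a `SalesforceSource`; pick one mode per query and stay in it.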
### Retrieve data by using a where clause on the DateTime column
articles/data-factory/supported-file-formats-and-compression-codecs.md (+2 −2)
@@ -8,14 +8,14 @@ ms.reviewer: douglasl
ms.service: data-factory
ms.workload: data-services
ms.topic: conceptual
-ms.date: 05/09/2018
+ms.date: 08/21/2018
ms.author: jingwang
---
# Supported file formats and compression codecs in Azure Data Factory
-*This topic applies to the following connectors: [Amazon S3](connector-amazon-simple-storage-service.md), [Azure Blob](connector-azure-blob-storage.md), [Azure Data Lake Store](connector-azure-data-lake-store.md), [Azure File Storage](connector-azure-file-storage.md), [File System](connector-file-system.md), [FTP](connector-ftp.md), [HDFS](connector-hdfs.md), [HTTP](connector-http.md), and [SFTP](connector-sftp.md).*
+*This topic applies to the following connectors: [Amazon S3](connector-amazon-simple-storage-service.md), [Azure Blob](connector-azure-blob-storage.md), [Azure Data Lake Storage Gen1](connector-azure-data-lake-store.md), [Azure Data Lake Storage Gen2](connector-azure-data-lake-storage.md), [Azure File Storage](connector-azure-file-storage.md), [File System](connector-file-system.md), [FTP](connector-ftp.md), [HDFS](connector-hdfs.md), [HTTP](connector-http.md), and [SFTP](connector-sftp.md).*
If you want to **copy files as-is** between file-based stores (binary copy), skip the format section in both input and output dataset definitions. If you want to **parse or generate files with a specific format**, Azure Data Factory supports the following file format types:
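For example, to parse delimited text rather than copy bytes as-is, the dataset's `typeProperties` would include a format section. A minimal sketch for text format (property values illustrative):

```json
"format": {
    "type": "TextFormat",
    "columnDelimiter": ",",
    "rowDelimiter": "\n",
    "firstRowAsHeader": true
}
```

Omitting this `format` object entirely is what triggers the as-is binary copy described above.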