
Commit 0d48b55

Merge pull request #49535 from linda33wj/master
Update ADF copy content
2 parents 18bc92a + 2daef5c commit 0d48b55

File tree

4 files changed (+21 -8 lines)


articles/data-factory/connector-azure-data-lake-store.md

Lines changed: 3 additions & 0 deletions
@@ -41,6 +41,9 @@ Specifically, this Azure Data Lake Store connector supports:
 
 [!INCLUDE [data-factory-v2-connector-get-started](../../includes/data-factory-v2-connector-get-started.md)]
 
+>[!NOTE]
+>When you use the Copy Data Tool to author a copy pipeline, or use the ADF UI to test a connection or navigate folders during authoring, the service principal or MSI needs permission granted at the root level. By contrast, copy activity execution works as long as permission is granted on the data to be copied. You can skip those authoring operations if you need to limit the permissions.
+
 The following sections provide details about properties that are used to define Data Factory entities specific to Azure Data Lake Store.
 
 ## Linked service properties

articles/data-factory/connector-dynamics-crm-office-365.md

Lines changed: 1 addition & 1 deletion
@@ -329,7 +329,7 @@ Configure the corresponding Data Factory data type in a dataset structure based
 | AttributeType.Double | Double |||
 | AttributeType.EntityName | String |||
 | AttributeType.Integer | Int32 |||
-| AttributeType.Lookup | Guid || ✓ (with single type associated) |
+| AttributeType.Lookup | Guid || ✓ (with single target associated) |
 | AttributeType.ManagedProperty | Boolean || |
 | AttributeType.Memo | String |||
 | AttributeType.Money | Decimal |||

articles/data-factory/connector-salesforce.md

Lines changed: 15 additions & 5 deletions
@@ -12,7 +12,7 @@ ms.workload: data-services
 ms.tgt_pltfrm: na
 ms.devlang: na
 ms.topic: conceptual
-ms.date: 07/18/2018
+ms.date: 08/21/2018
 ms.author: jingwang
 
 ---
@@ -180,7 +180,7 @@ To copy data from Salesforce, set the source type in the copy activity to **Sale
 | Property | Description | Required |
 |:--- |:--- |:--- |
 | type | The type property of the copy activity source must be set to **SalesforceSource**. | Yes |
-| query |Use the custom query to read data. You can use a SQL-92 query or [Salesforce Object Query Language (SOQL)](https://developer.salesforce.com/docs/atlas.en-us.soql_sosl.meta/soql_sosl/sforce_api_calls_soql.htm) query. An example is `select * from MyTable__c`. | No (if "tableName" in the dataset is specified) |
+| query | Use the custom query to read data. You can use a [Salesforce Object Query Language (SOQL)](https://developer.salesforce.com/docs/atlas.en-us.soql_sosl.meta/soql_sosl/sforce_api_calls_soql.htm) query or a SQL-92 query. See more tips in the [query tips](#query-tips) section. | No (if "tableName" in the dataset is specified) |
 | readBehavior | Indicates whether to query only the existing records, or to query all records including the deleted ones. If not specified, the default behavior is the former. <br>Allowed values: **query** (default), **queryAll**. | No |
 
 > [!IMPORTANT]
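For illustration, a copy activity source using these properties might look like the following minimal sketch (the object name `MyTable__c` and the field names are hypothetical; the surrounding pipeline and dataset definitions are omitted):

```json
{
    "source": {
        "type": "SalesforceSource",
        "query": "SELECT Id, Name FROM MyTable__c",
        "readBehavior": "query"
    }
}
```

Setting `readBehavior` to `queryAll` instead would also return soft-deleted records.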
@@ -278,10 +278,20 @@ You can retrieve data from Salesforce reports by specifying a query as `{call "<
 
 ### Retrieve deleted records from the Salesforce Recycle Bin
 
-To query the soft deleted records from the Salesforce Recycle Bin, you can specify **"IsDeleted = 1"** in your query. For example:
+To query the soft-deleted records from the Salesforce Recycle Bin, you can specify `readBehavior` as `queryAll`.
 
-* To query only the deleted records, specify "select * from MyTable__c **where IsDeleted= 1**."
-* To query all the records, including the existing and the deleted, specify "select * from MyTable__c **where IsDeleted = 0 or IsDeleted = 1**."
+### Difference between SOQL and SQL query syntax
+
+When copying data from Salesforce, you can use either a SOQL query or a SQL query. Note that the two have different syntax and functionality support, so do not mix them. We suggest using a SOQL query, which is natively supported by Salesforce. The following table lists the main differences:
+
+| Syntax | SOQL Mode | SQL Mode |
+|:--- |:--- |:--- |
+| Column selection | You need to enumerate the fields to be copied in the query, for example, `SELECT field1, field2 FROM objectname`. | `SELECT *` is supported in addition to column selection. |
+| Quotation marks | Field/object names cannot be quoted. | Field/object names can be quoted, for example, `SELECT "id" FROM "Account"`. |
+| Datetime format | Refer to the details [here](https://developer.salesforce.com/docs/atlas.en-us.soql_sosl.meta/soql_sosl/sforce_api_calls_soql_select_dateformats.htm) and the samples in the next section. | Refer to the details [here](https://docs.microsoft.com/sql/odbc/reference/develop-app/date-time-and-timestamp-literals?view=sql-server-2017) and the samples in the next section. |
+| Boolean values | Represented as `False` and `True`, for example, `SELECT … WHERE IsDeleted=True`. | Represented as 0 or 1, for example, `SELECT … WHERE IsDeleted=1`. |
+| Column renaming | Not supported. | Supported, for example, `SELECT a AS b FROM …`. |
+| Relationship | Supported, for example, `Account_vod__r.nvs_Country__c`. | Not supported. |
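To make the contrast concrete, here are two hypothetical source definitions that express roughly the same read, one per mode (the object `MyTable__c` and its fields are examples only):

```json
{
    "soqlModeSource": {
        "type": "SalesforceSource",
        "query": "SELECT Id, Name FROM MyTable__c WHERE IsDeleted=False"
    },
    "sqlModeSource": {
        "type": "SalesforceSource",
        "query": "SELECT * FROM \"MyTable__c\" WHERE IsDeleted=0"
    }
}
```

Note how the SOQL-mode query enumerates fields and uses `False`, while the SQL-mode query uses `SELECT *`, quoted names, and `0`.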
 
 ### Retrieve data by using a where clause on the DateTime column

articles/data-factory/supported-file-formats-and-compression-codecs.md

Lines changed: 2 additions & 2 deletions
@@ -8,14 +8,14 @@ ms.reviewer: douglasl
 ms.service: data-factory
 ms.workload: data-services
 ms.topic: conceptual
-ms.date: 05/09/2018
+ms.date: 08/21/2018
 ms.author: jingwang
 
 ---
 
 # Supported file formats and compression codecs in Azure Data Factory
 
-*This topic applies to the following connectors: [Amazon S3](connector-amazon-simple-storage-service.md), [Azure Blob](connector-azure-blob-storage.md), [Azure Data Lake Store](connector-azure-data-lake-store.md), [Azure File Storage](connector-azure-file-storage.md), [File System](connector-file-system.md), [FTP](connector-ftp.md), [HDFS](connector-hdfs.md), [HTTP](connector-http.md), and [SFTP](connector-sftp.md).*
+*This topic applies to the following connectors: [Amazon S3](connector-amazon-simple-storage-service.md), [Azure Blob](connector-azure-blob-storage.md), [Azure Data Lake Storage Gen1](connector-azure-data-lake-store.md), [Azure Data Lake Storage Gen2](connector-azure-data-lake-storage.md), [Azure File Storage](connector-azure-file-storage.md), [File System](connector-file-system.md), [FTP](connector-ftp.md), [HDFS](connector-hdfs.md), [HTTP](connector-http.md), and [SFTP](connector-sftp.md).*
 
 If you want to **copy files as-is** between file-based stores (binary copy), skip the format section in both input and output dataset definitions. If you want to **parse or generate files with a specific format**, Azure Data Factory supports the following file format types:
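As an illustration, a delimited-text dataset typically carries a format section like this minimal sketch (the delimiter and header values are examples only):

```json
{
    "format": {
        "type": "TextFormat",
        "columnDelimiter": ",",
        "rowDelimiter": "\n",
        "firstRowAsHeader": true
    }
}
```

Omitting the `format` section entirely triggers the binary, as-is copy described above.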
