articles/data-factory/connector-azure-cosmos-db-mongodb-api.md: 2 additions & 2 deletions
@@ -10,7 +10,7 @@ ms.service: multiple
ms.workload: data-services
ms.tgt_pltfrm: na
ms.topic: conceptual
- ms.date: 08/01/2019
+ ms.date: 11/20/2019
ms.author: jingwang

---
@@ -166,7 +166,7 @@ The following properties are supported in the Copy Activity **sink** section:
| Property | Description | Required |
|:--- |:--- |:--- |
| type | The **type** property of the Copy Activity sink must be set to **CosmosDbMongoDbApiSink**. |Yes |
- | writeBehavior |Describes how to write data to Azure Cosmos DB. Allowed values: **insert** and **upsert**.<br/><br/>The behavior of **upsert** is to replace the document if a document with the same ID already exists; otherwise, insert the document.<br /><br />**Note**: Data Factory automatically generates an ID for a document if an ID isn't specified either in the original document or by column mapping. This means that you must ensure that, for **upsert** to work as expected, your document has an ID. |No<br />(the default is **insert**) |
+ | writeBehavior |Describes how to write data to Azure Cosmos DB. Allowed values: **insert** and **upsert**.<br/><br/>The behavior of **upsert** is to replace the document if a document with the same `_id` already exists; otherwise, insert the document.<br /><br />**Note**: Data Factory automatically generates an `_id` for a document if an `_id` isn't specified either in the original document or by column mapping. This means that you must ensure that, for **upsert** to work as expected, your document has an `_id`. |No<br />(the default is **insert**) |
| writeBatchSize | The **writeBatchSize** property controls the size of documents to write in each batch. You can try increasing the value for **writeBatchSize** to improve performance, or decreasing the value if your document size is large. |No<br />(the default is **10,000**) |
| writeBatchTimeout | The wait time for the batch insert operation to finish before it times out. The allowed value is a timespan. | No<br/>(the default is **00:30:00** - 30 minutes) |
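For illustration, a minimal sketch of a copy activity that uses the sink properties above; the activity and dataset names, and the source section, are placeholders rather than part of this change:

```json
{
    "name": "CopyToCosmosDbMongoDbApi",
    "type": "Copy",
    "inputs": [ { "referenceName": "<source dataset>", "type": "DatasetReference" } ],
    "outputs": [ { "referenceName": "<Cosmos DB MongoDB API dataset>", "type": "DatasetReference" } ],
    "typeProperties": {
        "source": { "type": "<source type>" },
        "sink": {
            "type": "CosmosDbMongoDbApiSink",
            "writeBehavior": "upsert",
            "writeBatchSize": 10000,
            "writeBatchTimeout": "00:30:00"
        }
    }
}
```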
articles/data-factory/connector-db2.md: 3 additions & 1 deletion
@@ -12,7 +12,7 @@ ms.workload: data-services
ms.tgt_pltfrm: na

ms.topic: conceptual
- ms.date: 09/04/2019
+ ms.date: 11/20/2019

ms.author: jingwang
@@ -74,6 +74,8 @@ The following properties are supported for DB2 linked service:
| authenticationType |Type of authentication used to connect to the DB2 database.<br/>Allowed value is: **Basic**. |Yes |
| username |Specify the user name to connect to the DB2 database. |Yes |
| password |Specify password for the user account you specified for the username. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). |Yes |
+ | packageCollection | Specify the package collection under which the needed packages are auto-created by ADF when querying the database. | No |
+ | certificateCommonName | When you use Secure Sockets Layer (SSL) or Transport Layer Security (TLS) encryption, you must enter a value for the certificate common name. | No |
| connectVia | The [Integration Runtime](concepts-integration-runtime.md) to be used to connect to the data store. Learn more from the [Prerequisites](#prerequisites) section. If not specified, it uses the default Azure Integration Runtime. |No |
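To show where the new optional properties fit, a minimal sketch of a DB2 linked service follows; the `server` and `database` connection properties and all values are placeholders based on the connector's general pattern, not part of this change:

```json
{
    "name": "Db2LinkedService",
    "properties": {
        "type": "Db2",
        "typeProperties": {
            "server": "<server>:<port>",
            "database": "<database>",
            "authenticationType": "Basic",
            "username": "<username>",
            "password": { "type": "SecureString", "value": "<password>" },
            "packageCollection": "<package collection>",
            "certificateCommonName": "<certificate common name>"
        },
        "connectVia": { "referenceName": "<integration runtime name>", "type": "IntegrationRuntimeReference" }
    }
}
```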
|Common Data Service <br> Dynamics 365 online <br> Dynamics CRM Online |AAD service principal <br> Office365 |[Dynamics online + AAD service principal or Office365 auth](#dynamics-365-and-dynamics-crm-online)|
| Dynamics 365 on-premises with IFD <br> Dynamics CRM 2016 on-premises with IFD <br> Dynamics CRM 2015 on-premises with IFD | IFD |[Dynamics on-premises with IFD + IFD auth](#dynamics-365-and-dynamics-crm-on-premises-with-ifd)|

For Dynamics 365 specifically, the following application types are supported:
@@ -73,22 +73,76 @@ The following properties are supported for the Dynamics linked service.
| type | The type property must be set to **Dynamics**, **DynamicsCrm**, or **CommonDataServiceForApps**. | Yes |
| deploymentType | The deployment type of the Dynamics instance. It must be **"Online"** for Dynamics online. | Yes |
| serviceUri | The service URL of your Dynamics instance, for example `https://adfdynamics.crm.dynamics.com`. | Yes |
- | authenticationType | The authentication type to connect to a Dynamics server. Specify **"Office365"** for Dynamics online. | Yes |
- | username | Specify the user name to connect to Dynamics. | Yes |
- | password | Specify the password for the user account you specified for username. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
+ | authenticationType | The authentication type to connect to a Dynamics server. Allowed values are **AADServicePrincipal** and **Office365**. | Yes |
+ | servicePrincipalId | Specify the Azure Active Directory application's client ID. | Yes when using `AADServicePrincipal` authentication |
+ | servicePrincipalCredentialType | Specify the credential type to use for service principal authentication. Allowed values are **ServicePrincipalKey** and **ServicePrincipalCert**. | Yes when using `AADServicePrincipal` authentication |
+ | servicePrincipalCredential | Specify the service principal credential. <br>When you use `ServicePrincipalKey` as the credential type, `servicePrincipalCredential` can be a string (Data Factory encrypts it upon linked service deployment) or a reference to a secret in Azure Key Vault. <br>When you use `ServicePrincipalCert` as the credential type, `servicePrincipalCredential` should be a reference to a certificate in Azure Key Vault. | Yes when using `AADServicePrincipal` authentication |
+ | username | Specify the user name to connect to Dynamics. | Yes when using `Office365` authentication |
+ | password | Specify the password for the user account you specified for username. Mark this field as a SecureString to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes when using `Office365` authentication |
| connectVia | The [integration runtime](concepts-integration-runtime.md) to be used to connect to the data store. If not specified, it uses the default Azure Integration Runtime. | No for source, Yes for sink if the source linked service doesn't have an integration runtime |

>[!NOTE]
>The Dynamics connector used to use the optional "organizationName" property to identify your Dynamics CRM/365 Online instance. While that property still works, we suggest that you specify the new "serviceUri" property instead to gain better performance for instance discovery.

+ **Example: Dynamics online using AAD service principal + key authentication**
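The body of this example isn't reproduced in the excerpt; the following is an illustrative sketch (not the original sample), built only from the properties in the table above, with placeholder values:

```json
{
    "name": "DynamicsLinkedService",
    "properties": {
        "type": "Dynamics",
        "typeProperties": {
            "deploymentType": "Online",
            "serviceUri": "https://<organization-name>.crm.dynamics.com",
            "authenticationType": "AADServicePrincipal",
            "servicePrincipalId": "<service principal id>",
            "servicePrincipalCredentialType": "ServicePrincipalKey",
            "servicePrincipalCredential": { "type": "SecureString", "value": "<service principal key>" }
        },
        "connectVia": { "referenceName": "<integration runtime name>", "type": "IntegrationRuntimeReference" }
    }
}
```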
| type | The type property under `location` in dataset must be set to **HttpServerLocation**. | Yes |
- | relativeUrl | A relative URL to the resource that contains the data. | No |
+ | relativeUrl | A relative URL to the resource that contains the data. The HTTP connector copies data from the combined URL: `[URL specified in linked service]/[relative URL specified in dataset]`. | No |

> [!NOTE]
> The supported HTTP request payload size is around 500 KB. If the payload size you want to pass to your web endpoint is larger than 500 KB, consider batching the payload in smaller chunks.
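A minimal sketch of a format-based dataset that combines `HttpServerLocation` and `relativeUrl`; the `DelimitedText` format and all names are assumptions for illustration only:

```json
{
    "name": "HttpDelimitedTextDataset",
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": { "referenceName": "<HTTP linked service name>", "type": "LinkedServiceReference" },
        "typeProperties": {
            "location": {
                "type": "HttpServerLocation",
                "relativeUrl": "<relative url>"
            }
        }
    }
}
```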
@@ -281,7 +281,7 @@ The following properties are supported for HTTP under `storeSettings` settings i
| requestMethod | The HTTP method. <br>Allowed values are **Get** (default) and **Post**. | No |
| additionalHeaders | Additional HTTP request headers. | No |
| requestBody | The body for the HTTP request. | No |
- |requestTimeout| The timeout (the **TimeSpan** value) for the HTTP request to get a response. This value is the timeout to get a response, not the timeout to read response data. The default value is **00:01:40**. | No |
+ |httpRequestTimeout| The timeout (the **TimeSpan** value) for the HTTP request to get a response. This value is the timeout to get a response, not the timeout to read response data. The default value is **00:01:40**. | No |
| maxConcurrentConnections | The number of connections that can connect to the data store concurrently. Specify a value only when you want to limit concurrent connections to the data store. | No |
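For orientation, a minimal sketch of a copy activity source using these `storeSettings` properties; the `DelimitedTextSource` and `HttpReadSettings` type names are assumptions based on the connector's format-based model and are not part of this change:

```json
"source": {
    "type": "DelimitedTextSource",
    "storeSettings": {
        "type": "HttpReadSettings",
        "requestMethod": "Post",
        "requestBody": "<body for POST request>",
        "additionalHeaders": "<headers>",
        "httpRequestTimeout": "00:01:40"
    }
}
```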
articles/data-factory/connector-rest.md: 5 additions & 2 deletions
@@ -10,7 +10,7 @@ ms.service: data-factory
ms.workload: data-services
ms.tgt_pltfrm: na
ms.topic: conceptual
- ms.date: 09/04/2019
+ ms.date: 11/20/2019
ms.author: jingwang
---
# Copy data from a REST endpoint by using Azure Data Factory
@@ -168,7 +168,7 @@ To copy data from REST, the following properties are supported:
| Property | Description | Required |
|:--- |:--- |:--- |
| type | The **type** property of the dataset must be set to **RestResource**. | Yes |
- | relativeUrl | A relative URL to the resource that contains the data. When this property isn't specified, only the URL that's specified in the linked service definition is used. | No |
+ | relativeUrl | A relative URL to the resource that contains the data. When this property isn't specified, only the URL that's specified in the linked service definition is used. The HTTP connector copies data from the combined URL: `[URL specified in linked service]/[relative URL specified in dataset]`. | No |

If you set `requestMethod`, `additionalHeaders`, `requestBody`, and `paginationRules` in the dataset, they are still supported as-is, but you are encouraged to use the new model in the activity source going forward.
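A minimal sketch of a REST dataset using `relativeUrl`; names and values are placeholders:

```json
{
    "name": "RestDataset",
    "properties": {
        "type": "RestResource",
        "linkedServiceName": { "referenceName": "<REST linked service name>", "type": "LinkedServiceReference" },
        "typeProperties": {
            "relativeUrl": "<relative url>"
        }
    }
}
```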
@@ -211,6 +211,9 @@ The following properties are supported in the copy activity **source** section:
| httpRequestTimeout | The timeout (the **TimeSpan** value) for the HTTP request to get a response. This value is the timeout to get a response, not the timeout to read response data. The default value is **00:01:40**. | No |
| requestInterval | The time to wait before sending the request for the next page. The default value is **00:00:01**. | No |
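For illustration, a minimal sketch of how these properties sit in the copy activity source; the `RestSource` type name is assumed from the connector's model and is not shown in this excerpt:

```json
"source": {
    "type": "RestSource",
    "httpRequestTimeout": "00:01:40",
    "requestInterval": "00:00:01"
}
```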

+ >[!NOTE]
+ >The REST connector ignores any "Accept" header specified in `additionalHeaders`. Because the REST connector only supports responses in JSON, it will automatically generate a header of `Accept: application/json`.
+
**Example 1: Using the Get method with pagination**
articles/data-factory/control-flow-get-metadata-activity.md: 2 additions & 1 deletion
@@ -13,7 +13,7 @@ ms.workload: data-services
ms.tgt_pltfrm: na

ms.topic: conceptual
- ms.date: 08/12/2019
+ ms.date: 11/20/2019
ms.author: jingwang

---
@@ -54,6 +54,7 @@ The Get Metadata activity takes a dataset as an input and returns metadata infor
- For Amazon S3 and Google Cloud Storage, `lastModified` applies to the bucket and the key but not to the virtual folder, and `exists` applies to the bucket and the key but not to the prefix or virtual folder.
- For Azure Blob storage, `lastModified` applies to the container and the blob but not to the virtual folder.
+ - A wildcard filter on folders/files is not supported for the Get Metadata activity.
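To make the scope of these limitations concrete, a minimal sketch of a Get Metadata activity requesting the `exists` and `lastModified` arguments; the activity and dataset names are placeholders:

```json
{
    "name": "GetBlobMetadata",
    "type": "GetMetadata",
    "typeProperties": {
        "dataset": { "referenceName": "<dataset name>", "type": "DatasetReference" },
        "fieldList": [ "exists", "lastModified" ]
    }
}
```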