Commit dfb9350

Merge pull request #104711 from linda33wj/master
Update ADF connector articles
2 parents c4544f4 + 8ea618b commit dfb9350

File tree

3 files changed (+10 −5 lines changed)


articles/data-factory/connector-azure-cosmos-db.md

Lines changed: 1 addition & 1 deletion
@@ -313,7 +313,7 @@ To achieve schema-agnostic copy:
 
 ## Migrate from relational database to Cosmos DB
 
-When migrating from a relational database e.g. SQL Server to Azure Cosmos DB, copy activity can easily map tabular data from source to flatten JSON documents in Cosmos DB. In some cases, you may want to redesign the data model to optimize it for the NoSQL use-cases according to [Data modeling in Azure Cosmos DB](../cosmos-db/modeling-data.md), for example, to denormalize the data by embedding all of the related sub-items within one JSON document. For such case, refer to [this blog post](https://medium.com/@ArsenVlad/denormalizing-via-embedding-when-copying-data-from-sql-to-cosmos-db-649a649ae0fb) with a walkthrough on how to achieve it using Azure Data Factory copy activity.
+When migrating from a relational database, such as SQL Server, to Azure Cosmos DB, the copy activity can easily map tabular data from the source to flattened JSON documents in Cosmos DB. In some cases, you may want to redesign the data model to optimize it for NoSQL use cases according to [Data modeling in Azure Cosmos DB](../cosmos-db/modeling-data.md), for example, to denormalize the data by embedding all related sub-items within one JSON document. In such cases, refer to [this article](../cosmos-db/migrate-relational-to-cosmos-db-sql-api.md) for a walkthrough on how to achieve it by using the Azure Data Factory copy activity.
 
 ## Next steps

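The migration guidance above suggests denormalizing by embedding all related sub-items within one JSON document. As a minimal sketch of that reshaping (this is illustration only, not what Data Factory does internally; the order/line-item table and column names are hypothetical):

```python
# Embed related child rows (e.g. order line items) inside one parent
# JSON document, instead of copying each table to flat documents.

def denormalize(orders, order_items):
    """Group child rows under their parent and emit one document per order."""
    docs = []
    for order in orders:
        items = [i for i in order_items if i["orderId"] == order["id"]]
        docs.append({
            "id": str(order["id"]),
            "customer": order["customer"],
            # Embed the related sub-items within the parent document.
            "items": [{"sku": i["sku"], "qty": i["qty"]} for i in items],
        })
    return docs

orders = [{"id": 1, "customer": "Contoso"}]
order_items = [
    {"orderId": 1, "sku": "A-100", "qty": 2},
    {"orderId": 1, "sku": "B-200", "qty": 1},
]
result = denormalize(orders, order_items)
print(result)
```

Each order row becomes one document carrying its own `items` array, which is the embedded shape the Cosmos DB data-modeling article recommends for tightly related data.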
articles/data-factory/connector-azure-data-explorer.md

Lines changed: 4 additions & 2 deletions
@@ -11,7 +11,7 @@ ms.workload: data-services
 ms.devlang: na
 ms.topic: conceptual
 ms.custom: seo-lt-2019
-ms.date: 11/26/2019
+ms.date: 02/18/2020
 ---
 
 # Copy data to or from Azure Data Explorer by using Azure Data Factory
@@ -189,6 +189,7 @@ To copy data to Azure Data Explorer, set the type property in the copy activity
 |:--- |:--- |:--- |
 | type | The **type** property of the copy activity sink must be set to: **AzureDataExplorerSink**. | Yes |
 | ingestionMappingName | Name of a pre-created [mapping](/azure/kusto/management/mappings#csv-mapping) on a Kusto table. To map the columns from source to Azure Data Explorer (which applies to [all supported source stores and formats](copy-activity-overview.md#supported-data-stores-and-formats), including CSV/JSON/Avro formats), you can use the copy activity [column mapping](copy-activity-schema-and-type-mapping.md) (implicitly by name or explicitly as configured) and/or Azure Data Explorer mappings. | No |
+| additionalProperties | A property bag that can be used to specify any ingestion properties that aren't already set by the Azure Data Explorer sink. Specifically, it can be useful for specifying ingestion tags. Learn more from the [Azure Data Explorer data ingestion documentation](https://kusto.azurewebsites.net/docs/management/data-ingestion/index.html). | No |
 
 **Example:**
 
@@ -203,7 +204,8 @@ To copy data to Azure Data Explorer, set the type property in the copy activity
     },
     "sink": {
         "type": "AzureDataExplorerSink",
-        "ingestionMappingName": "<optional Azure Data Explorer mapping name>"
+        "ingestionMappingName": "<optional Azure Data Explorer mapping name>",
+        "additionalProperties": {<additional settings for data ingestion>}
     }
 },
 "inputs": [

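One use the new `additionalProperties` property calls out is ingestion tags. A sketch of what that could look like in the sink, assuming the Azure Data Explorer `tags` ingestion property; the tag value and exact serialization here are made up for illustration, so check the ingestion-properties documentation before relying on them:

```json
"sink": {
    "type": "AzureDataExplorerSink",
    "ingestionMappingName": "<optional Azure Data Explorer mapping name>",
    "additionalProperties": {
        "tags": "[\"sample-tag\"]"
    }
}
```

Tags applied this way travel with the ingested extents, which can later be used to find or drop the data that a given copy run produced.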
articles/data-factory/connector-azure-table-storage.md

Lines changed: 5 additions & 2 deletions
@@ -234,13 +234,16 @@ To copy data from Azure Table, set the source type in the copy activity to **Azu
 
 ### azureTableSourceQuery examples
 
-If the Azure Table column is of the datetime type:
+>[!NOTE]
+>The Azure Table query operation times out after 30 seconds, as [enforced by the Azure Table service](https://docs.microsoft.com/rest/api/storageservices/setting-timeouts-for-table-service-operations). Learn how to optimize the query from the [Design for querying](../storage/tables/table-storage-design-for-query.md) article.
+
+In Azure Data Factory, if you want to filter the data against a datetime type column, refer to this example:
 
 ```json
 "azureTableSourceQuery": "LastModifiedTime gt datetime'2017-10-01T00:00:00' and LastModifiedTime le datetime'2017-10-02T00:00:00'"
 ```
 
-If the Azure Table column is of the string type:
+If you want to filter the data against a string type column, refer to this example:
 
 ```json
 "azureTableSourceQuery": "LastModifiedTime ge '201710010000_0000' and LastModifiedTime le '201710010000_9999'"
 ```

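Given the 30-second limit the note above enforces, queries that also filter on `PartitionKey` avoid full-table scans and are far more likely to complete in time. A hypothetical example (the partition key value is made up):

```json
"azureTableSourceQuery": "PartitionKey eq 'Device001' and LastModifiedTime gt datetime'2017-10-01T00:00:00'"
```

Filtering on the partition key lets the Table service route the query to a single partition, which is the core recommendation of the Design for querying article.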
0 commit comments
