Commit f64e0c7

Update connector-azure-cosmos-analytical-store.md
1 parent 0b5bfbd commit f64e0c7

File tree

1 file changed: +13 −1 lines


articles/data-factory/connector-azure-cosmos-analytical-store.md

Lines changed: 13 additions & 1 deletion
```diff
@@ -11,7 +11,7 @@ ms.custom:
 ms.date: 03/31/2023
 ---
 
-# Copy and transform data in Azure Cosmos DB for NoSQL by using Azure Data Factory
+# Copy and transform data in Azure Cosmos DB analytical store by using Azure Data Factory
 
 > [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
 > * [Current version](connector-azure-cosmos-analytical-store.md)
```
```diff
@@ -80,5 +80,17 @@ Settings specific to Azure Cosmos DB are available in the **Settings** tab of th
 
 **Write throughput budget:** An integer that represents the RUs you want to allocate for this Data Flow write operation, out of the total throughput allocated to the collection.
 
+## Azure Cosmos DB change feed
+
+Azure Data Factory can get data from the [Azure Cosmos DB change feed](../cosmos-db/change-feed.md) by enabling it in the mapping data flow source transformation. With this connector option, you can read change feeds and apply transformations before loading the transformed data into destination datasets of your choice. You don't have to use Azure Functions to read the change feed and then write custom transformations. You can use this option to move data from one container to another, prepare fit-for-purpose materialized views driven by the change feed, automate container backup or recovery based on the change feed, and enable many other use cases by using the visual drag-and-drop capability of Azure Data Factory.
+
+Keep the pipeline and activity names unchanged, so that Azure Data Factory can record the checkpoint and automatically pick up changed data from where the last run left off. If you change the pipeline name or activity name, the checkpoint is reset, and the next run either starts from the beginning or captures only changes from that point forward.
+
+This feature works the same way when you debug the pipeline. Be aware that the checkpoint is reset if you refresh your browser during the debug run. After you're satisfied with the results of the debug run, you can publish and trigger the pipeline. The first time you trigger the published pipeline, it automatically restarts from the beginning or captures changes from that point on.
+
+In the monitoring section, you can always rerun a pipeline. When you do, the changed data is always captured from the previous checkpoint of the selected pipeline run.
+
+In addition, Azure Cosmos DB analytical store now supports Change Data Capture (CDC) for Azure Cosmos DB for NoSQL and Azure Cosmos DB for MongoDB (public preview). Azure Cosmos DB analytical store lets you efficiently consume a continuous and incremental feed of changed (inserted, updated, and deleted) data from the analytical store.
+
 ## Next steps
 Get started with [change data capture in Azure Cosmos DB analytical store](../cosmos-db/get-started-change-data-capture.md).
```
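The checkpoint rule described in the added section (a checkpoint is tied to the pipeline and activity names, and renaming either one resets it) can be sketched conceptually. This is only an illustration of the keying behavior under that assumption, not ADF's actual implementation; the store and names below are hypothetical:

```python
# Conceptual sketch: a change feed checkpoint keyed by (pipeline, activity).
# Renaming the pipeline or activity produces a new key, so the checkpoint is
# effectively reset and the next run starts from the beginning of the feed.
# Hypothetical in-memory store -- NOT how Azure Data Factory stores checkpoints.
checkpoints: dict[tuple[str, str], int] = {}  # (pipeline, activity) -> feed position


def run_change_feed_activity(pipeline: str, activity: str, feed: list[str]) -> list[str]:
    """Return the feed entries this pipeline/activity pair hasn't seen yet."""
    key = (pipeline, activity)
    start = checkpoints.get(key, 0)   # unknown key -> start from the beginning
    changes = feed[start:]
    checkpoints[key] = len(feed)      # record how far this key has read
    return changes


feed = ["insert doc1", "update doc1", "insert doc2"]

# First run: no checkpoint yet, so all changes are returned.
first = run_change_feed_activity("CopyPipeline", "Source1", feed)

# A new change arrives; the second run returns only the delta.
feed.append("delete doc1")
second = run_change_feed_activity("CopyPipeline", "Source1", feed)

# Renaming the activity changes the key: the checkpoint is "lost"
# and the run starts from the beginning again.
renamed = run_change_feed_activity("CopyPipeline", "SourceRenamed", feed)
```

The same reset happens in the sketch if the pipeline name changes, which mirrors the guidance above to keep both names stable between runs.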
