Commit 8e60bb0

Update analytical-store-change-data-capture.md
1 parent de27a43 commit 8e60bb0

File tree

1 file changed: +6 −5 lines


articles/cosmos-db/analytical-store-change-data-capture.md

Lines changed: 6 additions & 5 deletions
@@ -27,10 +27,10 @@ In addition to providing incremental data feed from analytical store to diverse
 - Supports capturing deletes and intermediate updates
 - Ability to filter the change feed for a specific type of operation (**Insert** | **Update** | **Delete** | **TTL**)
 - Supports applying filters, projections, and transformations on the change feed via source query
+- Multiple change feeds on the same container can be consumed simultaneously
 - Each change in the container appears exactly once in the change data capture feed, and the checkpoints are managed internally for you
 - Changes can be synchronized "from the beginning", "from a given timestamp", or "from now"
 - There's no limitation around the fixed data retention period for which changes are available
-- Multiple change feeds on the same container can be consumed simultaneously

 > [!IMPORTANT]
 > "From the beginning" means that all data and all transactions since the container's creation are available for change data capture, including deletes and updates. To ingest and process deletes and updates, you have to use specific settings in your CDC processes in Azure Synapse or Azure Data Factory. These settings are turned off by default. For more information, see [get started with change data capture](get-started-change-data-capture.md).
@@ -60,6 +60,11 @@ WHERE Category = 'Urban'
 > [!NOTE]
 > If you would like to enable source-query based change data capture on Azure Data Factory data flows during preview, please email [[email protected]](mailto:[email protected]) and share your **subscription ID** and **region**. This isn't necessary to enable source-query based change data capture on an Azure Synapse data flow.

+### Multiple CDC processes
+
+You can create multiple processes to consume CDC in analytical store. This approach brings flexibility to support different scenarios and requirements: one process may have no data transformations and multiple sinks, while another has data flattening and a single sink, and they can run in parallel.
+
+
 ### Throughput isolation, lower latency and lower TCO

 Operations on the Azure Cosmos DB analytical store don't consume provisioned RUs, so they don't affect your transactional workloads. Change data capture with analytical store also has lower latency and a lower TCO. The lower latency is attributed to analytical store enabling better parallelism for data processing, and the reduced overall TCO helps you drive cost efficiencies in rapidly shifting economic conditions.
@@ -84,10 +89,6 @@ You can use analytical store change data capture, if you're currently using or p

 Change data capture capability enables an end-to-end analytical solution, providing you with the flexibility to use Azure Cosmos DB data with any of the supported sink types. For more information on supported sink types, see [data flow supported sink types](../data-factory/data-flow-sink.md#supported-sinks). Change data capture also enables you to bring Azure Cosmos DB data into a centralized data lake and join it with data from other diverse sources. You can flatten the data, partition it, and apply more transformations in either Azure Synapse Analytics or Azure Data Factory.

-### Multiple CDC processes
-
-You can create multiple processes to consume CDC in analytical store. This approach brings flexibility to support different scenarios and requirements: one process may have no data transformations and multiple sinks, while another has data flattening and a single sink, and they can run in parallel.
-
 ## Change data capture on Azure Cosmos DB for MongoDB containers

 The linked service interface for the API for MongoDB isn't available within Azure Data Factory data flows yet. You can use your API for MongoDB account's endpoint with the **Azure Cosmos DB for NoSQL** linked service interface as a workaround until the Mongo linked service is directly supported.
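
The source-query capability referenced in the second hunk's header (`WHERE Category = 'Urban'`) can be sketched as a minimal data flow source query; the container alias `c` and the field names here are illustrative assumptions, not from the article:

```sql
-- Hypothetical source query for an ADF/Synapse data flow source over analytical store.
-- Applies a projection (three assumed fields) and a filter, as the feature list describes.
SELECT c.ProductId, c.Category, c.Price
FROM c
WHERE c.Category = 'Urban'
```

A query like this runs against the analytical store copy of the container, so the filtering and projection happen before the change feed reaches the data flow sink.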
