articles/cosmos-db/analytical-store-change-data-capture.md
In addition to providing incremental data feed from analytical store to diverse targets, change data capture supports the following capabilities:

- Supports capturing deletes and intermediate updates
- Ability to filter the change feed for a specific type of operation (**Insert** | **Update** | **Delete** | **TTL**)
- Supports applying filters, projections, and transformations on the change feed via source query
- Multiple change feeds on the same container can be consumed simultaneously
- Each change in the container appears exactly once in the change data capture feed, and the checkpoints are managed internally for you
- Changes can be synchronized "from the beginning", "from a given timestamp", or "from now"
- There's no limitation around the fixed data retention period for which changes are available
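As an illustration of a source query, the sketch below filters and projects the change feed. The filter on `Category = 'Urban'` appears elsewhere in this article; the projected fields and the `c` alias are hypothetical placeholders for your own schema:

```sql
-- Hypothetical source query: keep only one category of documents
-- and project a subset of fields from the change feed
SELECT c.id, c.Category, c.City
FROM c
WHERE c.Category = 'Urban'
```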
> [!IMPORTANT]
> "From the beginning" means that all data and all transactions since the container's creation are available for CDC, including deletes and updates. To ingest and process deletes and updates, you have to use specific settings in your CDC processes in Azure Synapse or Azure Data Factory. These settings are turned off by default. For more information, see [get started with change data capture](get-started-change-data-capture.md).
> [!NOTE]
> If you would like to enable source-query based change data capture on Azure Data Factory data flows during preview, email [[email protected]](mailto:[email protected]) and share your **subscription ID** and **region**. This step isn't necessary to enable source-query based change data capture on an Azure Synapse data flow.
### Multiple CDC processes

You can create multiple processes to consume CDC in analytical store. This approach brings the flexibility to support different scenarios and requirements. For example, one process may apply no data transformations and write to multiple sinks, while another flattens the data and writes to a single sink. They can run in parallel.
### Throughput isolation, lower latency, and lower TCO

Operations on the Azure Cosmos DB analytical store don't consume provisioned RUs, so they don't affect your transactional workloads. Change data capture with analytical store also has lower latency and lower total cost of ownership (TCO). The lower latency is attributed to the analytical store enabling better parallelism for data processing, which also reduces the overall TCO and enables you to drive cost efficiencies in these rapidly shifting economic conditions.
Change data capture capability enables an end-to-end analytical solution, providing you with the flexibility to use Azure Cosmos DB data with any of the supported sink types. For more information on supported sink types, see [data flow supported sink types](../data-factory/data-flow-sink.md#supported-sinks). Change data capture also enables you to bring Azure Cosmos DB data into a centralized data lake and join it with data from other diverse sources. You can flatten the data, partition it, and apply more transformations in either Azure Synapse Analytics or Azure Data Factory.
## Change data capture on Azure Cosmos DB for MongoDB containers

The linked service interface for the API for MongoDB isn't available within Azure Data Factory data flows yet. As a workaround, you can use your API for MongoDB account's endpoint with the **Azure Cosmos DB for NoSQL** linked service interface until the MongoDB linked service is directly supported.
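As a sketch of that workaround, a data factory linked service of type `CosmosDb` (the NoSQL linked service) could point at the API for MongoDB account's endpoint. This is an illustrative fragment, not the documented procedure: the linked service name and the placeholder values in angle brackets are hypothetical, and the exact property names should be confirmed in the linked service authoring UI:

```json
{
  "name": "CosmosDbNoSqlOverMongoAccount",
  "properties": {
    "type": "CosmosDb",
    "typeProperties": {
      "connectionString": "AccountEndpoint=https://<your-mongo-api-account>.documents.azure.com:443/;AccountKey=<account-key>;Database=<database-name>"
    }
  }
}
```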