
Commit b5717be

Update concepts-change-data-capture.md
1 parent 35e170c commit b5717be

File tree

1 file changed: +1 −3 lines changed


articles/data-factory/concepts-change-data-capture.md

Lines changed: 1 addition & 3 deletions
@@ -26,7 +26,7 @@ When you perform data integration and ETL processes in the cloud, your jobs can
 ### Change Data Capture factory resource

-The easiest and quickest way to get started in data factory with CDC is through the factory level Change Data Capture resource. From the main pipeline designer, click on New under Factory Resources to create a new Change Data Capture. The CDC factory resource will provide a configuration walk-through experience where you will point to your sources and destinations, apply optional transformations, and then click start to begin your data capture. With the CDC resource, you will not need to design pipelines or data flow activities and the only billing will be 4 cores of General Purpose data flows while your data in being processed. You set a latency which ADF will use to wake-up and look for changed data. That is the only time you will be billed. The top-level CDC resource is also the ADF method of running your processes continuously. Pipelines in ADF are batch only. But the CDC resource can run continuously.
+The easiest and quickest way to get started in data factory with CDC is through the factory-level Change Data Capture resource. From the main pipeline designer, select **New** under Factory Resources to create a new Change Data Capture. The CDC factory resource provides a configuration walk-through experience where you point to your sources and destinations, apply optional transformations, and then select **Start** to begin your data capture. With the CDC resource, you do not need to design pipelines or data flow activities, and the only billing is 4 cores of General Purpose data flows while your data is being processed. You set a latency, which ADF uses to wake up and look for changed data; that is the only time you are billed. The top-level CDC resource is also the ADF method of running your processes continuously. Pipelines in ADF are batch only, but the CDC resource can run continuously.

 ### Native change data capture in mapping data flow
@@ -42,8 +42,6 @@ The changed data including inserted, updated and deleted rows can be automatical
 - [Azure Cosmos DB (SQL API)](connector-azure-cosmos-db.md)
 - [Azure Cosmos DB analytical store](../cosmos-db/analytical-store-introduction.md)

-Azure Cosmos DB API for NoSQL
-
 ### Auto incremental extraction in mapping data flow

 Newly updated rows or updated files can be automatically detected and extracted by ADF mapping data flow from the source stores. When you want to get delta data from a database, an incremental column is required to identify the changes. When you want to load only new or updated files from a storage store, ADF mapping data flow uses the files' last modified time.
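Conceptually, this incremental-column approach is a watermark pattern: each run extracts only rows whose incremental column exceeds the last stored watermark, then advances the watermark. The sketch below is a minimal Python illustration of that pattern only; the row shape and the `last_modified` column name are assumptions for the example, not ADF's actual implementation.

```python
# Illustrative source rows; `last_modified` plays the role of the incremental column.
rows = [
    {"id": 1, "last_modified": 100},
    {"id": 2, "last_modified": 150},
    {"id": 3, "last_modified": 200},
]

def extract_incremental(source_rows, watermark):
    """Return rows changed since the stored watermark, plus the advanced watermark."""
    changed = [r for r in source_rows if r["last_modified"] > watermark]
    new_watermark = max((r["last_modified"] for r in changed), default=watermark)
    return changed, new_watermark

# First run: a watermark of 0 extracts everything and records the high-water mark.
batch, wm = extract_incremental(rows, 0)

# A new row arrives between runs.
rows.append({"id": 4, "last_modified": 250})

# Second run: only rows modified after the stored watermark are extracted.
delta, wm = extract_incremental(rows, wm)
```

The same idea applies to file stores, with the file's last modified time standing in for the incremental column.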
