
Commit 1ed2116

Merge pull request #216821 from kromerm/cdcupdates (Cdcupdates)
2 parents fcc2112 + 6f1a929

File tree

3 files changed: +11 −4 lines changed

articles/data-factory/concepts-change-data-capture.md
Lines changed: 2 additions & 2 deletions

@@ -9,7 +9,7 @@ ms.service: data-factory
 ms.subservice: data-movement
 ms.custom: synapse
 ms.topic: conceptual
-ms.date: 10/18/2022
+ms.date: 11/01/2022
 ---

 # Change data capture in Azure Data Factory and Azure Synapse Analytics

@@ -26,7 +26,7 @@ When you perform data integration and ETL processes in the cloud, your jobs can

 ### Native change data capture in mapping data flow
-The changed data including inserted, updated and deleted rows can be automatically detected and extracted by ADF mapping data flow from the source databases. No timestamp or ID columns are required to identify the changes since it uses the native change data capture technology in the databases. By simply chaining a source transform and a sink transform reference to a database dataset in a mapping data flow, you will see the changes happened on the source database to be automatically applied to the target database, so that you can easily synchronize data between two tables. You can also add any transformations in between for any business logic to process the delta data.
+The changed data including inserted, updated and deleted rows can be automatically detected and extracted by ADF mapping data flow from the source databases. No timestamp or ID columns are required to identify the changes since it uses the native change data capture technology in the databases. By simply chaining a source transform and a sink transform reference to a database dataset in a mapping data flow, you will see the changes happened on the source database to be automatically applied to the target database, so that you can easily synchronize data between two tables. You can also add any transformations in between for any business logic to process the delta data. When defining your sink data destination, you can set insert, update, upsert, and delete operations in your sink without the need for an Alter Row transformation, because ADF is able to automatically detect the row markers.

 **Supported connectors**
 - [SAP CDC](connector-sap-change-data-capture.md)
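The behavior the new doc text describes can be sketched as follows. This is a minimal, conceptual Python sketch of what applying CDC row markers to a target table looks like, not ADF's actual implementation; the `apply_changes` function and the change-record shape are illustrative assumptions.

```python
# Conceptual sketch (not ADF's implementation): applying CDC row markers
# (insert / update / delete) from a source change feed to a target table,
# the way a native-CDC mapping data flow does automatically.

def apply_changes(target: dict, changes: list) -> dict:
    """target maps primary key -> row; each change carries a row marker."""
    for change in changes:
        marker, key = change["marker"], change["key"]
        row = change.get("row")
        if marker == "insert":
            target[key] = row
        elif marker == "update":
            if key in target:  # update only touches existing rows
                target[key] = row
        elif marker == "delete":
            target.pop(key, None)  # deleting a missing row is a no-op
    return target

target = {1: {"name": "old"}}
changes = [
    {"marker": "update", "key": 1, "row": {"name": "new"}},
    {"marker": "insert", "key": 2, "row": {"name": "added"}},
    {"marker": "delete", "key": 1},
]
print(apply_changes(target, changes))  # {2: {'name': 'added'}}
```

Because the change feed already labels each row, no timestamp or high-watermark column is needed to decide what changed, which is the point the paragraph above makes.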

articles/data-factory/data-flow-alter-row.md
Lines changed: 4 additions & 1 deletion

@@ -9,7 +9,7 @@ ms.service: data-factory
 ms.subservice: data-flows
 ms.topic: conceptual
 ms.custom: synapse, ignite-2022
-ms.date: 08/03/2022
+ms.date: 11/01/2022
 ---

 # Alter row transformation in mapping data flow
@@ -26,6 +26,9 @@ Alter Row transformations only operate on database, REST, or Azure Cosmos DB sin

 > [!VIDEO https://www.microsoft.com/en-us/videoplayer/embed/RE4vJYc]

+> [!NOTE]
+> An Alter Row transformation is not needed for Change Data Capture data flows that use native CDC sources like SQL Server or SAP. In those instances, ADF will automatically detect the row marker, so Alter Row policies are unnecessary.
+
 ## Specify a default row policy

 Create an Alter Row transformation and specify a row policy with a condition of `true()`. Each row that doesn't match any of the previously defined expressions will be marked for the specified row policy. By default, each row that doesn't match any conditional expression will be marked for `Insert`.
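The policy-matching behavior described above can be sketched in a few lines. This is an illustrative Python model, not ADF's data flow engine: `mark_rows` and the `(condition, marker)` pairs are assumptions standing in for Alter Row's ordered conditions, with a final `True` condition playing the role of a `true()` default policy.

```python
# Sketch of Alter Row policy evaluation: each row is tested against the
# policies in order, and the first matching condition assigns the row marker.
# Rows that match no policy default to "insert", mirroring the documented
# default behavior.

def mark_rows(rows, policies):
    """policies: ordered list of (condition, marker) pairs."""
    marked = []
    for row in rows:
        marker = "insert"  # default when no conditional expression matches
        for condition, policy_marker in policies:
            if condition(row):
                marker = policy_marker
                break  # first matching policy wins
        marked.append((marker, row))
    return marked

policies = [
    (lambda r: r["deleted"], "delete"),
    (lambda r: True, "upsert"),  # catch-all policy: condition true()
]
rows = [{"id": 1, "deleted": False}, {"id": 2, "deleted": True}]
print(mark_rows(rows, policies))
# [('upsert', {'id': 1, 'deleted': False}), ('delete', {'id': 2, 'deleted': True})]
```

With the `true()` catch-all in place, every row receives an explicit marker, which is why the default policy is described as catching rows that match none of the earlier expressions.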

articles/data-factory/data-flow-sink.md
Lines changed: 5 additions & 1 deletion

@@ -9,7 +9,7 @@ ms.service: data-factory
 ms.subservice: data-flows
 ms.topic: conceptual
 ms.custom: seo-lt-2019, ignite-2022
-ms.date: 10/26/2022
+ms.date: 11/01/2022
 ---

 # Sink transformation in mapping data flow
@@ -106,6 +106,10 @@ For example, if I specify a single key column of `column1` in a cache sink calle

 **Write to activity output** The cached sink can optionally write your output data to the input of the next pipeline activity. This will allow you to quickly and easily pass data out of your data flow activity without needing to persist the data in a data store.

+## Update method
+
+For database sink types, the Settings tab will include an "Update method" property. The default is insert, but there are also checkbox options for update, upsert, and delete. To utilize those additional options, you will need to add an [Alter Row transformation](data-flow-alter-row.md) before the sink. The Alter Row will allow you to define the conditions for each of the database actions. If your source is a native CDC-enabled source, then you can set the update methods without an Alter Row, as ADF is already aware of the row markers for insert, update, upsert, and delete.
+
 ## Field mapping

 Similar to a select transformation, on the **Mapping** tab of the sink, you can decide which incoming columns will get written. By default, all input columns, including drifted columns, are mapped. This behavior is known as *automapping*.
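The interplay between row markers and the sink's enabled update methods can be sketched conceptually. This is a hedged Python model, not ADF's sink implementation; `write_to_sink` and the `allowed` set are illustrative stand-ins for the Update method checkboxes.

```python
# Sketch of a sink's "Update method" settings: only rows whose marker
# corresponds to a method enabled on the sink are acted on; all other
# marked rows are skipped.

def write_to_sink(target, marked_rows, allowed):
    """marked_rows: (marker, key, row) tuples; allowed: enabled methods."""
    for marker, key, row in marked_rows:
        if marker not in allowed:
            continue  # this update method is unchecked on the sink
        if marker in ("insert", "upsert"):
            target[key] = row
        elif marker == "update" and key in target:
            target[key] = row
        elif marker == "delete":
            target.pop(key, None)
    return target

target = {1: "a"}
rows = [("update", 1, "b"), ("delete", 1, "b"), ("insert", 2, "c")]
# delete is not enabled on this sink, so the delete row is skipped
print(write_to_sink(target, rows, allowed={"insert", "update"}))  # {1: 'b', 2: 'c'}
```

This is why an upstream Alter Row transformation (or a native CDC source that supplies markers automatically) is a prerequisite for the non-default update methods: without a marker, the sink has no basis for choosing an action other than insert.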

0 commit comments