
Commit b356c11

Update connector-azure-data-lake-storage.md
1 parent b5db9f3 commit b356c11

File tree

1 file changed (+6, −7 lines)


articles/data-factory/connector-azure-data-lake-storage.md

Lines changed: 6 additions & 7 deletions
@@ -8,7 +8,6 @@ ms.service: data-factory
 ms.subservice: data-movement
 ms.topic: conceptual
 ms.custom: synapse
-ms.date: 02/17/2022
 ---
 
 # Copy and transform data in Azure Data Lake Storage Gen2 using Azure Data Factory or Azure Synapse Analytics
@@ -526,7 +525,7 @@ When you copy files from Azure Data Lake Storage Gen1/Gen2 to Gen2, you can choo
 
 When you're transforming data in mapping data flows, you can read and write files from Azure Data Lake Storage Gen2 in the following formats:
 * [Avro](format-avro.md#mapping-data-flow-properties)
-* [Common Data Model (preview)](format-common-data-model.md#mapping-data-flow-properties)
+* [Common Data Model](format-common-data-model.md#mapping-data-flow-properties)
 * [Delimited text](format-delimited-text.md#mapping-data-flow-properties)
 * [Delta](format-delta.md#mapping-data-flow-properties)
 * [Excel](format-excel.md#mapping-data-flow-properties)
@@ -594,9 +593,9 @@ In this case, all files that were sourced under /data/sales are moved to /backup
 
 **Filter by last modified:** You can filter which files you process by specifying a date range of when they were last modified. All date-times are in UTC.
 
-**Enable change data capture (Preview):** If true, you will get new or changed files only from the last run. Initial load of full snapshot data will always be gotten in the first run, followed by capturing new or changed files only in next runs. For more details, see [Change data capture (preview)](#change-data-capture-preview).
+**Enable change data capture:** If true, you will get new or changed files only from the last run. Initial load of full snapshot data will always be gotten in the first run, followed by capturing new or changed files only in next runs. For more details, see [Change data capture](#change-data-capture).
 
-:::image type="content" source="media/data-flow/enable-change-data-capture-preview.png" alt-text="Screenshot showing Enable change data capture (Preview).":::
+:::image type="content" source="media/data-flow/enable-change-data-capture-preview.png" alt-text="Screenshot showing Enable change data capture.":::
 
 ### Sink properties
 

@@ -784,13 +783,13 @@ To learn details about the properties, check [Delete activity](delete-activity.m
 }
 ]
 ```
-## Change data capture (preview)
+## Change data capture
 
-Azure Data Factory can get new or changed files only from Azure Data Lake Storage Gen2 by enabling **Enable change data capture (Preview)** in the mapping data flow source transformation. With this connector option, you can read new or updated files only and apply transformations before loading transformed data into destination datasets of your choice.
+Azure Data Factory can get new or changed files only from Azure Data Lake Storage Gen2 by enabling **Enable change data capture** in the mapping data flow source transformation. With this connector option, you can read new or updated files only and apply transformations before loading transformed data into destination datasets of your choice.
 
 Make sure you keep the pipeline and activity name unchanged, so that the checkpoint can always be recorded from the last run to get changes from there. If you change your pipeline name or activity name, the checkpoint will be reset, and you will start from the beginning in the next run.
 
-When you debug the pipeline, the **Enable change data capture (Preview)** works as well. Be aware that the checkpoint will be reset when you refresh your browser during the debug run. After you are satisfied with the result from debug run, you can publish and trigger the pipeline. It will always start from the beginning regardless of the previous checkpoint recorded by debug run.
+When you debug the pipeline, the **Enable change data capture** works as well. Be aware that the checkpoint will be reset when you refresh your browser during the debug run. After you are satisfied with the result from debug run, you can publish and trigger the pipeline. It will always start from the beginning regardless of the previous checkpoint recorded by debug run.
 
 In the monitoring section, you always have the chance to rerun a pipeline. When you are doing so, the changes are always gotten from the checkpoint record in your selected pipeline run.
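The checkpoint behavior described in this hunk (keyed by pipeline and activity name; full snapshot on first run; renaming resets the key) can be sketched conceptually. This is an illustrative model only, not ADF's actual implementation; the store and function names are hypothetical:

```python
from datetime import datetime, timezone

# Hypothetical checkpoint store, keyed by (pipeline, activity) name.
# ADF persists a checkpoint per run; renaming either name resets it.
_checkpoints: dict[tuple[str, str], datetime] = {}

def get_changed_files(pipeline: str, activity: str,
                      files: dict[str, datetime]) -> list[str]:
    """Return files new or modified since the last run's checkpoint.

    `files` maps a file path to its last-modified time (UTC).
    The first run (no checkpoint yet) returns the full snapshot.
    """
    key = (pipeline, activity)
    last_run = _checkpoints.get(key)  # None on first run or after a rename
    picked = [path for path, modified in files.items()
              if last_run is None or modified > last_run]
    _checkpoints[key] = datetime.now(timezone.utc)  # record checkpoint for next run
    return picked
```

Under this model, rerunning with the same pipeline and activity names picks up only files modified after the recorded checkpoint, while renaming either one starts over from the full snapshot, matching the reset behavior the doc change describes.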
