Commit b480ad5

Commit message: edits
1 parent cd20af1 commit b480ad5

11 files changed (+39, -37 lines)

articles/data-factory/TOC.yml

Lines changed: 3 additions & 3 deletions
@@ -1268,10 +1268,10 @@ items:
   href: sap-change-data-capture-prepare-linked-service-source-dataset.md
 - name: Debug the Data Factory copy activity
   href: sap-change-data-capture-debug-shir-logs.md
-- name: Use the SAP ODP (preview) data partitioning template
+- name: Use the SAP data partitioning template
   href: sap-change-data-capture-data-partitioning-template.md
-- name: Use the SAP ODP (preview) data replication template
+- name: Use the SAP data replication template
   href: sap-change-data-capture-data-replication-template.md
-- name: Manage your SAP CDC solution
+- name: Manage your solution
   href: sap-change-data-capture-management.md
 displayName: SAP, change data capture, CDC

articles/data-factory/sap-change-data-capture-data-partitioning-template.md

Lines changed: 2 additions & 2 deletions
@@ -1,7 +1,7 @@
 ---
 title: Auto-generate a pipeline by using the SAP data partitioning template
 titleSuffix: Azure Data Factory
-description: Learn how to use the SAP data partitioning template for SAP change data capture (CDC) extraction in Azure Data Factory.
+description: Learn how to use the SAP data partitioning template for SAP change data capture (CDC) (preview) extraction in Azure Data Factory.
 author: ukchrist
 ms.service: data-factory
 ms.subservice: data-movement
@@ -54,4 +54,4 @@ To auto-generate an Azure Data Factory pipeline by using the SAP data partitioni

 ## Next steps

-[Auto-generate a pipeline from the SAP data replication template](sap-change-data-capture-data-replication-template.md)
+[Auto-generate a pipeline by using the SAP data replication template](sap-change-data-capture-data-replication-template.md)

articles/data-factory/sap-change-data-capture-data-replication-template.md

Lines changed: 3 additions & 3 deletions
@@ -1,7 +1,7 @@
 ---
 title: Auto-generate a pipeline by using the SAP data replication template
 titleSuffix: Azure Data Factory
-description: Learn how to use the SAP data replication template for SAP change data capture (CDC) extraction in Azure Data Factory.
+description: Learn how to use the SAP data replication template for SAP change data capture (CDC) (preview) extraction in Azure Data Factory.
 author: ukchrist
 ms.service: data-factory
 ms.subservice: data-movement
@@ -18,7 +18,7 @@ Learn how to use the SAP data replication template to auto-generate a pipeline a

 ## Create a data replication pipeline from a template

-To auto-generate an Azure Data Factory pipeline by using the SAP data partitioning template:
+To auto-generate an Azure Data Factory pipeline by using the SAP data replication template:

 1. In Azure Data Factory Studio, go to the Author hub of your data factory. In **Factory Resources**, under **Pipelines** > **Pipelines Actions**, select **Pipeline from template**.

@@ -58,7 +58,7 @@ To auto-generate an Azure Data Factory pipeline by using the SAP data partitioni

 If you want to replicate SAP data to Data Lake Storage Gen2 in delta format, complete the steps that are detailed in the preceding section, but instead use the **Replicate SAP data to Azure Data Lake Store Gen2 in Delta format and persist raw data in CSV format** template.

-Like in the data replication template, in a data delta pipeline, the Data Factory copy activity runs on the self-hosted integration runtime to extract raw data (full and deltas) from the SAP system. The copy activity loads the raw data into Data Lake Storage Gen2 as a persisted CSV file Historical changes are archived and preserved. The files are stored in the *sapcdc* container under the *deltachange/\<your pipeline name\>\<your pipeline run timestamp\>* folder path. The **Extraction mode** property of the copy activity is set to **Delta**. The **Subscriber process** property of copy activity is parameterized.
+Like in the data replication template, in a data delta pipeline, the Data Factory copy activity runs on the self-hosted integration runtime to extract raw data (full and deltas) from the SAP system. The copy activity loads the raw data into Data Lake Storage Gen2 as a persisted CSV file. Historical changes are archived and preserved. The files are stored in the *sapcdc* container in the *deltachange/\<your pipeline name\>\<your pipeline run timestamp\>* folder path. The **Extraction mode** property of the copy activity is set to **Delta**. The **Subscriber process** property of copy activity is parameterized.

 The Data Factory data flow activity runs on the Azure integration runtime to transform the raw data and merge all changes into Data Lake Storage Gen2 as an open source Delta Lake or Lakehouse table. The process replicates the SAP data.
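The folder layout that the delta pipeline uses for its persisted CSV files can be sketched in Python. This is a minimal illustration of the *sapcdc*/*deltachange* path pattern described in the changed paragraph; the pipeline name and the timestamp format are hypothetical placeholders, not values prescribed by the template:

```python
from datetime import datetime, timezone

def delta_change_path(pipeline_name: str, run_time: datetime) -> str:
    """Build the folder path where the delta pipeline persists raw CSV files:
    the sapcdc container, under deltachange/<pipeline name><run timestamp>.
    The timestamp format here is an assumption for illustration only."""
    timestamp = run_time.strftime("%Y%m%dT%H%M%S")
    return f"sapcdc/deltachange/{pipeline_name}{timestamp}"

# Example with a hypothetical pipeline name:
path = delta_change_path(
    "SapToDeltaPipeline",
    datetime(2022, 9, 1, 12, 30, 0, tzinfo=timezone.utc),
)
```

Because each run's timestamp is part of the path, every extraction lands in its own folder, which is what lets historical changes be archived and preserved rather than overwritten.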

articles/data-factory/sap-change-data-capture-introduction-architecture.md

Lines changed: 4 additions & 4 deletions
@@ -37,13 +37,13 @@ This article provides a high-level architecture of the SAP CDC solution in Azure
 - [Set up a linked service and source dataset](sap-change-data-capture-prepare-linked-service-source-dataset.md)
 - [Use the SAP data extraction template](sap-change-data-capture-data-replication-template.md)
 - [Use the SAP data partition template](sap-change-data-capture-data-partitioning-template.md)
-- [Manage the solution](sap-change-data-capture-management.md)
+- [Manage your solution](sap-change-data-capture-management.md)

 ## How to use the SAP CDC solution

-The SAP CDC solution is a connector that you access through an SAP ODP (preview) linked service, an SAP ODP source dataset, and the SAP data replication template or the SAP data partitioning template. Choose your template when you set up a new pipeline in Azure Data Factory Studio. To access preview templates, you must [enable the preview experience in Azure Data Factory Studio](how-to-manage-studio-preview-exp.md#how-to-enabledisable-preview-experience).
+The SAP CDC solution is a connector that you access through an SAP ODP (preview) linked service, an SAP ODP (preview) source dataset, and the SAP data replication template or the SAP data partitioning template. Choose your template when you set up a new pipeline in Azure Data Factory Studio. To access preview templates, you must [enable the preview experience in Azure Data Factory Studio](how-to-manage-studio-preview-exp.md#how-to-enabledisable-preview-experience).

-The SAP CDC solution connects to all SAP systems that support ODP, including SAP R/3, SAP ECC, SAP S/4HANA, SAP BW, and SAP BW/4HANA. The solution works either directly at the application layer or indirectly via an SAP Landscape Transformation Replication Server (SLT) as a proxy. Without relying on watermarking, it can extract SAP data either fully or incrementally. The data the SAP CDC solution extracts includes not only physical tables but also logical objects that are created by using the tables. An example of a table-based object is an SAP Advanced Business Application Programming (ABAP) Core Data Services (CDS) view.
+The SAP CDC solution connects to all SAP systems that support ODP, including SAP R/3, SAP ECC, SAP S/4HANA, SAP BW, and SAP BW/4HANA. The solution works either directly at the application layer or indirectly via an SAP Landscape Transformation Replication Server (SLT) as a proxy. The solution doesn't rely on watermarking to extract SAP data either fully or incrementally. The data the SAP CDC solution extracts includes not only physical tables but also logical objects that are created by using the tables. An example of a table-based object is an SAP Advanced Business Application Programming (ABAP) Core Data Services (CDS) view.

 Use the SAP CDC solution with Data Factory features like copy activities and data flow activities, pipeline templates, and tumbling window triggers for a low-latency SAP CDC replication solution in a self-managed pipeline.

@@ -53,7 +53,7 @@ The SAP CDC solution in Azure Data Factory is a connector between SAP and Azure.

 The Azure side includes the Data Factory copy activity that loads the raw SAP data into a storage destination like Azure Blob Storage or Azure Data Lake Storage Gen2. The data is saved in CSV or Parquet format, essentially archiving or preserving all historical changes.

-The Azure side also might include a Data Factory data flow activity that transforms the raw SAP data, merges all changes, and loads the results in a destination like Azure SQL Database or Azure Synapse Analytics, essentially replicating the SAP data. The Data Factory data flow activity also can load the results in Data Lake Storage Gen2 in delta format. You can use time travel capabilities to produce snapshots of SAP data at any specific period in the past.
+The Azure side also might include a Data Factory data flow activity that transforms the raw SAP data, merges all changes, and loads the results in a destination like Azure SQL Database or Azure Synapse Analytics, essentially replicating the SAP data. The Data Factory data flow activity also can load the results in Data Lake Storage Gen2 in delta format. You can use the open source Delta Lake Time Travel feature to produce snapshots of SAP data for a specific period.

 In Azure Data Factory Studio, the SAP template that you use to auto-generate a Data Factory pipeline connects SAP with Azure. You can run the pipeline frequently by using a Data Factory tumbling window trigger to replicate SAP data in Azure with low latency and without using watermarking.
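The tumbling window trigger mentioned in the last paragraph slices time into contiguous, non-overlapping, fixed-size windows, one pipeline run per window. As a rough sketch of only that windowing behavior (not the Data Factory trigger API; the 15-minute interval is an arbitrary example):

```python
from datetime import datetime, timedelta

def tumbling_windows(start: datetime, interval: timedelta, count: int):
    """Yield (window_start, window_end) pairs: back-to-back, fixed-size,
    non-overlapping windows, the way a tumbling window trigger fires them."""
    for i in range(count):
        window_start = start + i * interval
        yield window_start, window_start + interval

# Four 15-minute windows starting at a hypothetical trigger start time.
windows = list(tumbling_windows(datetime(2022, 9, 1), timedelta(minutes=15), 4))
# Each window ends exactly where the next begins, so every change is picked
# up in exactly one replication cycle without watermark bookkeeping.
```

This non-overlapping property is what lets the pipeline replicate SAP changes with low latency while the ODP framework, rather than a watermark column, tracks what was already delivered.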

articles/data-factory/sap-change-data-capture-management.md

Lines changed: 5 additions & 5 deletions
@@ -76,19 +76,19 @@ In scenarios in which data movement works (copy activities finish without errors

 To analyze what data the SAP system has provided for your scenario, start transaction ODQMON in your SAP back-end system. If you're using SAP Landscape Transformation Replication Server (SLT) with a standalone server, start the transaction there.

-To find the ODQs that correspond to your copy activities or copy activity runs, use the filter options. In **Queue**, you can use wildcards to narrow the search. For example, you can search by the table name *EKKO*.
+To find the ODQs that correspond to your copy activities or copy activity runs, use the filter options. In **Queue**, you can use wildcards to narrow the search. For example, you can search by the table name **EKKO**.

 Select the **Calculate Data Volume** checkbox to see details about the number of rows and data volume (in bytes) contained in the ODQs.

-:::image type="content" source="media/sap-change-data-capture-solution/sap-cdc-odqmon-troubleshooting-1.png" alt-text="Screenshot of the SAP ODQMON tool, with delta queues shown.":::
+:::image type="content" source="media/sap-change-data-capture-solution/sap-cdc-odqmon-troubleshoot-queues.png" alt-text="Screenshot of the SAP ODQMON tool, with delta queues shown.":::

 To view the ODQ subscriptions, double-click the queue. An ODQ can have multiple subscribers, so check for the subscriber name that you entered in the Data Factory linked service. Choose the subscription that has a timestamp that most closely matches the time your copy activity ran. For delta subscriptions, the first run of the copy activity for the subscription is recorded on the SAP side.

-:::image type="content" source="media/sap-change-data-capture-solution/sap-cdc-odqmon-troubleshooting-2.png" alt-text="Screenshot of the SAP ODQMON tool, with delta queue subscriptions shown.":::
+:::image type="content" source="media/sap-change-data-capture-solution/sap-cdc-odqmon-troubleshoot-subscriptions.png" alt-text="Screenshot of the SAP ODQMON tool, with delta queue subscriptions shown.":::

-In the subscription, a list of requests correspond to copy activity runs in Data Factory. In the following figure, you see the result of four copy activity runs:
+In the subscription, a list of requests corresponds to copy activity runs in Data Factory. In the following figure, you see the result of four copy activity runs:

-:::image type="content" source="media/sap-change-data-capture-solution/sap-cdc-odqmon-troubleshooting-3.png" alt-text="Screenshot of the SAP ODQMON tool with delta queue requests shown.":::
+:::image type="content" source="media/sap-change-data-capture-solution/sap-cdc-odqmon-troubleshoot-requests.png" alt-text="Screenshot of the SAP ODQMON tool with delta queue requests shown.":::

 Based on the timestamp in the first row, find the line that corresponds to the copy activity run you want to analyze. If the number of rows shown equals the number of rows read by the copy activity, you've verified that Data Factory has read and transferred the data as provided by the SAP system. In this scenario, we recommend that you consult with the team that's responsible for your SAP system.
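The verification step in the last context line above boils down to comparing two row counts. A minimal Python sketch of that decision (the function name and return strings are illustrative, not part of any tool):

```python
def diagnose_row_counts(odq_rows: int, copy_activity_rows: int) -> str:
    """Compare the row count an ODQMON request reports with the rows the
    Data Factory copy activity read, and suggest where to investigate."""
    if odq_rows == copy_activity_rows:
        # Data Factory transferred exactly what SAP provided, so if the data
        # looks wrong, the issue is upstream: consult the SAP team.
        return "consult the SAP team"
    # A mismatch points at the extraction or transfer on the Azure side.
    return "investigate the Data Factory pipeline"
```

In other words, matching counts rule out the Data Factory side of the pipeline as the source of a data discrepancy.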

articles/data-factory/sap-change-data-capture-prepare-linked-service-source-dataset.md

Lines changed: 5 additions & 5 deletions
@@ -1,7 +1,7 @@
 ---
 title: Set up a linked service and dataset for the SAP CDC solution (preview)
 titleSuffix: Azure Data Factory
-description: Learn how to set up a linked service and source dataset to use with the SAP CDC solution (preview) in Azure Data Factory.
+description: Learn how to set up a linked service and source dataset to use with the SAP change data capture (CDC) solution (preview) in Azure Data Factory.
 author: ukchrist
 ms.service: data-factory
 ms.subservice: data-movement
@@ -18,7 +18,7 @@ Learn how to set up the linked service and source dataset for your SAP change da

 ## Set up a linked service

-To set up a linked service for your SAP CDC solution:
+To set up an SAP ODP (preview) linked service for your SAP CDC solution:

 1. In Azure Data Factory Studio, go to the Manage hub of your data factory. In the menu under **Connections**, select **Linked services**. Select **New** to create a new linked service.

@@ -43,7 +43,7 @@ To set up a linked service for your SAP CDC solution:

 ## Create a copy activity

-To create a Data Factory copy activity that uses an SAP ODP data source, complete the steps in the following sections.
+To create a Data Factory copy activity that uses an SAP ODP (preview) data source, complete the steps in the following sections.

 ### Set up the source dataset

@@ -57,7 +57,7 @@ To create a Data Factory copy activity that uses an SAP ODP data source, complet

 1. In **New dataset**, search for **SAP**. Select **SAP ODP (Preview)**, and then select **Continue**.

-:::image type="content" source="media/sap-change-data-capture-solution/sap-cdc-source-dataset-selection.png" alt-text="Screenshot of the SAP ODP (Preview) dataset type on the New dataset dialog.":::
+:::image type="content" source="media/sap-change-data-capture-solution/sap-cdc-source-dataset-selection.png" alt-text="Screenshot of the SAP ODP (Preview) dataset type in the New dataset dialog.":::

 1. In **Set properties**, enter a name for the SAP ODP linked service data source. In **Linked service**, select the dropdown and select **New**.

@@ -84,7 +84,7 @@ To create a Data Factory copy activity that uses an SAP ODP data source, complet
 1. In the Data Factory copy activity, in **Extraction mode**, select one of the following options:

    - **Full**: Always extracts the current snapshot of the selected data source object. This option doesn't register the Data Factory copy activity as its delta subscriber that consumes data changes produced in the ODQ by your SAP system.
-   - **Delta**: Initially extracts the current snapshot of the selected data source object. This option registers the Data Factory copy activity as its delta subscriber and subsequently extracts new data changes produced in the ODQ by your SAP system since the last extraction.
+   - **Delta**: Initially extracts the current snapshot of the selected data source object. This option registers the Data Factory copy activity as its delta subscriber and then extracts new data changes produced in the ODQ by your SAP system since the last extraction.
    - **Recovery**: Repeats the last extraction that was part of a failed pipeline run.

 1. In **Subscriber process**, enter a unique name to register and identify this Data Factory copy activity as a delta subscriber of the selected data source object. Your SAP system manages its subscription state to keep track of data changes that are produced in the ODQ and consumed in consecutive extractions. You don't need to manually watermark data changes. For example, you might name the subscriber process `<your pipeline name>_<your copy activity name>`.
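To make the subscriber-naming convention and the three extraction modes from the hunk above concrete, here is a small Python sketch. The helper functions and example names are illustrative assumptions, not part of the Data Factory API; only the `<your pipeline name>_<your copy activity name>` pattern and the mode names come from the doc:

```python
# The three extraction modes the copy activity supports, per the doc.
EXTRACTION_MODES = {"Full", "Delta", "Recovery"}

def subscriber_process_name(pipeline_name: str, copy_activity_name: str) -> str:
    """Build a unique subscriber name so the SAP system can track which ODQ
    changes this copy activity has already consumed (no manual watermarking)."""
    return f"{pipeline_name}_{copy_activity_name}"

def check_extraction_mode(mode: str) -> str:
    # Full: current snapshot only; no delta subscription is registered.
    # Delta: snapshot first, then only new ODQ changes since the last run.
    # Recovery: repeats the last extraction after a failed pipeline run.
    if mode not in EXTRACTION_MODES:
        raise ValueError(f"Unknown extraction mode: {mode}")
    return mode

# Hypothetical pipeline and activity names:
name = subscriber_process_name("SapCdcPipeline", "CopySapData")
# name == "SapCdcPipeline_CopySapData"
```

Keeping the subscriber name stable across runs is what lets the SAP system resume delta delivery from where the previous extraction left off.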
