Commit a6136e5

update SAP template1
1 parent d28c885 commit a6136e5

File tree

1 file changed: +3 −3 lines changed


articles/data-factory/solution-template-replicate-multiple-objects-sap-cdc.md

Lines changed: 3 additions & 3 deletions
@@ -25,12 +25,12 @@ The template contains three activities:
 - **Mapping dataflow** replicates each SAP ODP object from SAP system to Azure Data Lake Gen2 in Delta format. It will do initial full load in the first run and then do incremental load in the subsequent runs automatically. It will merge the changes to Azure Data Lake Gen2 in Delta format.
 
 An external control file in json format is required in this template. The schema for the control file is as below.
-- *checkPointKey* is a custom key to manage the checkpoint of your changed data capture in ADF. You can get more details [here](concepts-change-data-capture.md#checkpoint).
+- *checkPointKey* is your custom key to manage the checkpoint of your changed data capture in ADF. You can get more details [here](concepts-change-data-capture.md#checkpoint).
 - *sapContext* is your SAP ODP context from the source SAP system. You can get more details [here](sap-change-data-capture-prepare-linked-service-source-dataset.md#set-up-the-source-dataset).
 - *sapObjectName* is your SAP ODP object name to be loaded from the SAP system. You can get more details [here](sap-change-data-capture-prepare-linked-service-source-dataset.md#set-up-the-source-dataset).
 - *sapRunMode* is to determine how you want to load SAP object. It can be fullLoad, incrementalLoad or fullAndIncrementalLoad.
 - *sapKeyColumns* are your key column names from SAP ODP objects used to do the dedupe in mapping dataflow.
-- *sapPartitions* are list of partition condition leading to separate extraction processes in the connected SAP system.
+- *sapPartitions* are list of partition conditions leading to separate extraction processes in the connected SAP system.
 - *deltaContainer* is your container name in the Azure Data Lake Gen2 as the destination store.
 - *deltaFolder* is your folder name in the Azure Data Lake Gen2 as the destination store.
 - *deltaKeyColumns* are your columns used to determine if a row from the source matches a row from the sink when you want to update or delete a row.
@@ -87,7 +87,7 @@ A sample control file is as below:
 
 ## How to use this solution template
 
-1. Create and upload a control file into json format to your Azure Data Lake Gen2 as the destination store. The default container to store the control file is **delta** and default control file name is **SapToDeltaParameters.json**.
+1. Create and upload a control file into json format to your Azure Data Lake Gen2 as the destination store. The default container to store the control file is **demo** and default control file name is **SapToDeltaParameters.json**.
 
 
 2. Go to the **Replicate multiple tables from SAP ODP to Azure Data Lake Storage Gen2 in Delta format** template and **click** it.
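For reference, a control file following the schema fields listed in the diff might look like the sketch below. This is a hypothetical illustration, not the article's actual sample file: the object name, key columns, and folder name are placeholders, and the array-of-objects shape is an assumption based on the template replicating multiple objects.

```json
[
  {
    "checkPointKey": "MyCheckPointKey001",
    "sapContext": "ABAP_CDS",
    "sapObjectName": "MY_SAMPLE_CDS_VIEW",
    "sapRunMode": "fullAndIncrementalLoad",
    "sapKeyColumns": ["KEY_COLUMN_1", "KEY_COLUMN_2"],
    "sapPartitions": [],
    "deltaContainer": "demo",
    "deltaFolder": "my_sample_object",
    "deltaKeyColumns": ["KEY_COLUMN_1", "KEY_COLUMN_2"]
  }
]
```

Per the diff, *sapRunMode* must be one of fullLoad, incrementalLoad, or fullAndIncrementalLoad, and the container holding the control file itself defaults to **demo** with the file name **SapToDeltaParameters.json**.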
