
Commit e7dd05b

Merge pull request #235183 from dearandyxu/master
shared SHIR + staging
2 parents: 99f6442 + adc7047

File tree: 2 files changed (+4 −1 lines changed)

articles/data-factory/sap-change-data-capture-introduction-architecture.md

Lines changed: 1 addition & 1 deletion
@@ -52,7 +52,7 @@ The Azure side includes the Data Factory mapping data flow that can transform an

:::image type="content" source="media/sap-change-data-capture-solution/sap-cdc-architecture-diagram.png" border="false" alt-text="Diagram of the architecture of the SAP CDC solution.":::

-To get started, create a Data Factory SAP CDC linked service, an SAP CDC source dataset, and a pipeline with a mapping data flow activity in which you use the SAP CDC source dataset. To extract the data from SAP, a self-hosted integration runtime is required that you install on an on-premises computer or on a virtual machine (VM). An on-premises computer has a line of sight to your SAP source systems and to your SLT server. The Data Factory data flow activity runs on a serverless Azure Databricks or Apache Spark cluster, or on an Azure integration runtime.
+To get started, create a Data Factory SAP CDC linked service, an SAP CDC source dataset, and a pipeline with a mapping data flow activity in which you use the SAP CDC source dataset. To extract the data from SAP, a self-hosted integration runtime is required that you install on an on-premises computer or on a virtual machine (VM). An on-premises computer has a line of sight to your SAP source systems and to your SLT server. The Data Factory data flow activity runs on a serverless Azure Databricks or Apache Spark cluster, or on an Azure integration runtime. You must also configure staging storage in the data flow activity so that your self-hosted integration runtime works seamlessly with the data flow integration runtime.

The SAP CDC connector uses the SAP ODP framework to extract various data source types, including:
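For context, the staging requirement described in the added sentence is set on the mapping data flow activity itself. Below is a minimal sketch of an ExecuteDataFlow activity definition, assuming a hypothetical staging linked service named `StagingAdlsGen2` and a data flow named `SapCdcMappingDataFlow` (both placeholders, not from this commit); the `staging` block is where the staging storage and folder path are configured:

```json
{
    "name": "RunSapCdcDataFlow",
    "type": "ExecuteDataFlow",
    "typeProperties": {
        "dataFlow": {
            "referenceName": "SapCdcMappingDataFlow",
            "type": "DataFlowReference"
        },
        "staging": {
            "linkedService": {
                "referenceName": "StagingAdlsGen2",
                "type": "LinkedServiceReference"
            },
            "folderPath": "sap-cdc/staging"
        }
    }
}
```

The self-hosted integration runtime extracts the data from SAP into the staging folder, and the data flow integration runtime picks it up from there.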
articles/data-factory/sap-change-data-capture-shir-preparation.md

Lines changed: 3 additions & 0 deletions
@@ -25,6 +25,9 @@ In Azure Data Factory Studio, [create and configure a self-hosted integration ru

The more CPU cores you have on the computer running the self-hosted integration runtime, the higher your data extraction throughput is. For example, an internal test achieved a higher than 12-MB/s throughput when running parallel extractions on a self-hosted integration runtime computer that has 16 CPU cores.

+> [!NOTE]
+> If you want to use a shared self-hosted integration runtime from another data factory, make sure that your data factory is in the same region as the other data factory. In addition, your data flow integration runtime must be configured to **Auto Resolve** or to the same region as your data factory.
+
## Download and install the SAP .NET connector

Download the latest [64-bit SAP .NET Connector (SAP NCo 3.0)](https://support.sap.com/en/product/connectors/msnet.html) and install it on the computer running the self-hosted integration runtime. During installation, in the **Optional setup steps** dialog, select **Install assemblies to GAC**, and then select **Next**.
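For reference, the shared-runtime scenario in the added note works by creating a linked integration runtime in your data factory that points at the shared self-hosted integration runtime in the other factory. A minimal sketch of that linked integration runtime definition, with placeholder names and IDs (none of these values come from this commit):

```json
{
    "name": "<linked-ir-name>",
    "properties": {
        "type": "SelfHosted",
        "typeProperties": {
            "linkedInfo": {
                "authorizationType": "Rbac",
                "resourceId": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.DataFactory/factories/<shared-factory-name>/integrationruntimes/<shared-ir-name>"
            }
        }
    }
}
```

As the note calls out, both data factories must be in the same region for this sharing to work.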
