articles/data-factory/data-flow-troubleshoot-guide.md (3 additions, 3 deletions)
@@ -135,7 +135,7 @@ This section lists common error codes and messages reported by mapping data flow

 - **Message**: Azure Cosmos DB throughput scale operation cannot be performed because another scale operation is in progress, please retry after sometime.
 - **Cause**: The throughput scale operation of the Azure Cosmos DB can't be performed because another scale operation is in progress.
-- **Recommendation**: Login to Azure Cosmos DB account, and manually change container throughput to be auto scale or add a custom activity after mapping data flows to reset the throughput.
+- **Recommendation**: Log in to Azure Cosmos DB account, and manually change container throughput to be auto scale or add a custom activity after mapping data flows to reset the throughput.

 ### Error code: DF-Cosmos-IdPropertyMissed
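Editor's note on the recommendation above: one way to implement the "custom activity after mapping data flows to reset the throughput" option is a small script against the Azure Cosmos DB SDK. The sketch below is illustrative only and is not part of the article or this change; it assumes the azure-cosmos Python package, and the endpoint, key, database, container, and 400 RU/s baseline are placeholders.

```python
# Illustrative sketch only: reset a container's provisioned throughput after a
# data flow run. All names and values below are placeholders, not from the article.
from azure.cosmos import CosmosClient

ENDPOINT = "https://<account>.documents.azure.com:443/"  # placeholder account endpoint
KEY = "<account-key>"                                    # placeholder account key

client = CosmosClient(ENDPOINT, credential=KEY)
container = client.get_database_client("<database>").get_container_client("<container>")

# Set the container back to a baseline manual throughput once the previous
# scale operation has finished; 400 RU/s is just an example value.
container.replace_throughput(400)
```

If this call hits the same "another scale operation is in progress" error, wait and retry, since the service rejects overlapping scale operations.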
@@ -667,7 +667,7 @@ This section lists common error codes and messages reported by mapping data flow

 ### Error code: DF-SAPODP-DataParsingFailed

-- **Cause**: Mostly you have hidden column settings in your SAP table. When you use SAP mapping data flow to read data from SAP server, it returns all the schema (columns, including hidden ones), but returned data do not contain related values. So, data misalignment happened and lead to parse value issue or wrong data value issue.
+- **Cause**: Mostly you have hidden column settings in your SAP table. When you use SAP mapping data flow to read data from SAP server, it returns all the schema (columns, including hidden ones), but returned data do not contain related values. So, data misalignment happened and led to parse value issue or wrong data value issue.
 - **Recommendation**: There are two recommendations for this issue:
 1. Remove hidden settings from the related column(s) through SAP GUI.
 2. If you want to keep existed SAP settings unchanged, use hidden feature (manually add DSL property `enableProjection:true` in script) in SAP mapping data flow to filter the hidden column(s) and continue to read data.
@@ -792,7 +792,7 @@ This section lists common error codes and messages reported by mapping data flow

 ### Error code: DF-SAPODP-OOM

 - **Message**: No more memory available to add rows to an internal table
-- **Cause**: SAP Table connector has its limitation for big table extraction. SAP Table underlying relies on an RFC which will read all the data from the table into the memory of SAP system, so out of memory (OOM) issue will happen when we extracting big tables.
+- **Cause**: SAP Table connector has its limitation for big table extraction. SAP Table underlying relies on an RFC which will read all the data from the table into the memory of SAP system, so out of memory (OOM) issue will happen when extracting big tables.
 - **Recommendation**: Use SAP CDC connector to do full load directly from your source system, then move delta to SAP Landscape Transformation Replication Server (SLT) after init without delta is released.