articles/data-factory/data-flow-troubleshoot-guide.md (46 additions, 5 deletions)
ms.service: data-factory
ms.subservice: data-flows
ms.custom: ignite-2022
ms.topic: troubleshooting
ms.date: 11/02/2022
---

# Troubleshoot mapping data flows in Azure Data Factory

This section lists common error codes and messages reported by mapping data flows.

**Message**: Azure Cosmos DB throughput scale operation cannot be performed because another scale operation is in progress, please retry after sometime.

**Cause**: The throughput scale operation of Azure Cosmos DB can't be performed because another scale operation is in progress.

**Recommendation**: Log in to your Azure Cosmos DB account and manually change the container throughput to autoscale, or add a custom activity after the mapping data flow to reset the throughput.

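
If you'd rather script the change than use the portal, the Azure CLI can migrate a container to autoscale throughput. A sketch, with all identifiers as placeholders (and note the migration itself fails while another scale operation is still in progress, so it may need a retry):

```azurecli
az cosmosdb sql container throughput migrate \
    --account-name "<cosmos-account>" \
    --resource-group "<resource-group>" \
    --database-name "<database>" \
    --name "<container>" \
    --throughput-type "autoscale"
```
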
### Error code: DF-Cosmos-IdPropertyMissed

**Cause**: The short data type is not supported in the Azure Cosmos DB instance.

**Recommendation**: Add a derived column transformation to convert related columns from short to integer before using them in the Azure Cosmos DB sink transformation.
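
As a sketch in data flow script, the cast uses the `toInteger()` expression function in a derived column transformation (the stream and column names here are hypothetical):

```
source1 derive(shortColAsInt = toInteger(shortCol)) ~> CastShortToInt
```
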

### Error code: DF-CSVWriter-InvalidQuoteSetting

**Message**: Job failed while writing data with error: Quote character and escape character cannot be empty if column value contains column delimiter

**Cause**: Both the quote character and the escape character are empty while a column value contains the column delimiter.

**Recommendation**: Set your quote character or escape character.
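
The constraint mirrors standard CSV semantics: a value that contains the delimiter is only representable if it can be quoted or escaped. A minimal Python sketch of the same rule (illustration only, not the data flow writer itself):

```python
import csv
import io

row = ["id1", "a,b"]  # the second value contains the column delimiter ","

# With a quote character set (Python's default), the writer wraps the value
# in quotes, so the embedded delimiter survives a round trip.
buf = io.StringIO()
csv.writer(buf, quotechar='"', quoting=csv.QUOTE_MINIMAL).writerow(row)
print(buf.getvalue().strip())  # id1,"a,b"

# With no quoting and no escape character, the writer cannot represent the
# embedded delimiter and raises an error -- the same condition this error
# code reports.
buf = io.StringIO()
try:
    csv.writer(buf, quoting=csv.QUOTE_NONE, escapechar=None).writerow(row)
except csv.Error as exc:
    print("write failed:", exc)
```
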

**Message**: Read excel files with different schema is not supported now.

**Cause**: Possible problems with the JSON file: unsupported encoding, corrupt bytes, or using JSON source as a single document on many nested lines.

**Recommendation**: Verify that the JSON file's encoding is supported. On the source transformation that's using a JSON dataset, expand **JSON Settings** and turn on **Single Document**.

**Cause**: You are not permitted to access the storage account, either because of missing roles for managed identity/service principal authentication or because of the network firewall settings.

**Recommendation**: When using managed identity/service principal authentication:

1. For source: In Storage Explorer, grant the managed identity/service principal at least **Execute** permission for ALL upstream folders and the file system, along with **Read** permission for the files to copy. Alternatively, in Access control (IAM), grant the managed identity/service principal at least the **Storage Blob Data Reader** role.
2. For sink: In Storage Explorer, grant the managed identity/service principal at least **Execute** permission for ALL upstream folders and the file system, along with **Write** permission for the sink folder. Alternatively, in Access control (IAM), grant the managed identity/service principal at least the **Storage Blob Data Contributor** role.

Also ensure that the network firewall settings in the storage account are configured correctly. Turning on firewall rules for your storage account blocks incoming requests for data by default, unless the requests originate from a service operating within an Azure virtual network (VNet) or from allowed public IP addresses.
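
The IAM grants in the steps above can also be scripted. A hedged Azure CLI sketch for the source role, with every identifier a placeholder (use **Storage Blob Data Contributor** instead for the sink):

```azurecli
az role assignment create \
    --assignee "<managed-identity-or-service-principal-object-id>" \
    --role "Storage Blob Data Reader" \
    --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
```
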

**Message**: System is not able to resolve the IP address of the host. Please verify that your host name is correct or check if your DNS server is able to resolve the host to an IP address successfully

**Cause**: Unable to reach the given storage account.

**Recommendation**: Check the name of the storage account and make sure the storage account exists.

### Error code: DF-File-InvalidSparkFolder

**Message**: Failed to read footer for file.

**Cause**: SQL Server configuration error.

**Recommendation**: Install a trusted certificate on your SQL Server, or set the `encrypt` connection string setting to false and the `trustServerCertificate` connection string setting to true.
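
For the second option, the relevant settings appear in the JDBC-style connection string roughly as follows (server and database names are placeholders; note that this disables certificate validation, so prefer installing a trusted certificate for production):

```
jdbc:sqlserver://<server-name>:1433;database=<database>;encrypt=false;trustServerCertificate=true;
```
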
### Error code: DF-PGSQL-InvalidCredential

**Message**: User/password should be specified.

| Your context value can't be empty when reading data. | Specify the context. |
| Your context value can't be empty when browsing object names. | Specify the context. |

### Error code: DF-SAPODP-DataflowSystemError

**Recommendation**: Reconfigure the activity and run it again. If the issue persists, contact Microsoft support for further assistance.

### Error code: DF-SAPODP-DataParsingFailed

**Cause**: Most likely your SAP table has hidden column settings. When an SAP mapping data flow reads data from the SAP server, the server returns the whole schema (columns, including hidden ones), but the returned data doesn't contain the related values. The data is therefore misaligned, which leads to wrong column values or a value-parsing failure.

**Recommendation**: There are two recommendations for this issue:

1. Remove the hidden settings from the related column(s) through the SAP GUI.
2. If you want to keep the existing SAP settings unchanged, use the hidden feature (manually add the DSL property `enableProjection:true` in the script) in the SAP mapping data flow to filter out the hidden column(s) and continue to read data.

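
For option 2, the property is added by hand to the source transformation in the data flow script. A sketch in which the surrounding options and the stream name are hypothetical; only `enableProjection:true` is the documented property:

```
source(allowSchemaDrift: true,
    validateSchema: false,
    enableProjection: true) ~> SapOdpSource
```
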
### Error code: DF-SAPODP-ObjectInvalid

**Cause**: The object name is not found or not released.

### Error code: DF-SAPODP-OOM

**Message**: No more memory available to add rows to an internal table

**Cause**: The SAP Table connector has a limitation for big table extraction. It relies on an RFC that reads all the data from the table into the memory of the SAP system, so an out-of-memory (OOM) issue occurs when extracting big tables.

**Recommendation**: Use the SAP CDC connector to do a full load directly from your source system, then move the delta to SAP Landscape Transformation Replication Server (SLT) after the init without delta is released.