Commit 61cd644

Merge pull request #97162 from gu111gu/master: Update connector-troubleshoot-guide.md

2 parents: 8aacf4e + 7d6abb9

1 file changed: 246 additions, 83 deletions

articles/data-factory/connector-troubleshoot-guide.md
@@ -5,7 +5,7 @@ services: data-factory
 author: linda33wj
 ms.service: data-factory
 ms.topic: troubleshooting
-ms.date: 08/26/2019
+ms.date: 11/26/2019
 ms.author: jingwang
 ms.reviewer: craigg
 ---
@@ -43,84 +43,6 @@ busy to handle requests, it returns an HTTP error 503.

- **Resolution**: Rerun the copy activity after several minutes.

(This hunk removes the standalone "Azure SQL Data Warehouse" section; its four error-message entries reappear unchanged in the combined "Azure SQL Data Warehouse \ Azure SQL Database \ SQL Server" section added below.)
## Azure Cosmos DB

### Error message: Request size is too large

@@ -137,7 +59,7 @@ busy to handle requests, it returns an HTTP error 503.

```
Message=Partition range id 0 | Failed to import mini-batch.
Exception was Message: {"Errors":["Encountered exception while executing function. Exception = Error: {\"Errors\":[\"Unique index constraint violation.\"]}...
```

- **Cause**: There are two possible causes:
@@ -235,6 +157,249 @@ Cosmos DB calculates RU from [here](../cosmos-db/request-units.md#request-unit-c

- For cause #3, double-check that the key file or password is correct by using other tools to validate that you can access the SFTP server properly.
## Azure SQL Data Warehouse \ Azure SQL Database \ SQL Server

### Error code: SqlFailedToConnect

- **Message**: `Cannot connect to SQL database: '%server;', Database: '%database;', User: '%user;'. Please check the linked service configuration is correct, and make sure the SQL database firewall allows the integration runtime to access.`

- **Cause**: If the error message contains "SqlException", the SQL database threw an error indicating that a specific operation failed.

- **Recommendation**: Search by the SQL error code in this reference doc for more details: https://docs.microsoft.com/sql/relational-databases/errors-events/database-engine-events-and-errors. If you need further help, contact Azure SQL support.

- **Cause**: If the error message contains "Client with IP address '...' is not allowed to access the server" and you are trying to connect to an Azure SQL database, the failure is usually caused by the Azure SQL database firewall.

- **Recommendation**: In the Azure SQL Server firewall configuration, enable the "Allow Azure services and resources to access this server" option. Reference doc: https://docs.microsoft.com/azure/sql-database/sql-database-firewall-configure.

### Error code: SqlOperationFailed

- **Message**: `A database operation failed. Please search error to get more details.`

- **Cause**: If the error message contains "SqlException", the SQL database threw an error indicating that a specific operation failed.

- **Recommendation**: Search by the SQL error code in this reference doc for more details: https://docs.microsoft.com/sql/relational-databases/errors-events/database-engine-events-and-errors. If you need further help, contact Azure SQL support.

- **Cause**: If the error message contains "PdwManagedToNativeInteropException", it is usually caused by a mismatch between source and sink column sizes.

- **Recommendation**: Check the size of both source and sink columns. If you need further help, contact Azure SQL support.

- **Cause**: If the error message contains "InvalidOperationException", it is usually caused by invalid input data.

- **Recommendation**: To identify which row encountered the problem, enable the fault tolerance feature on the copy activity, which can redirect problematic rows to storage for further investigation. Reference doc: https://docs.microsoft.com/azure/data-factory/copy-activity-fault-tolerance.

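The fault-tolerance idea above — redirecting bad rows for later inspection instead of failing the whole copy — can be sketched generically. This is an illustrative Python sketch, not the ADF implementation; the rows and converter function are made up:

```python
# Illustrative sketch of the "redirect problematic rows" idea behind
# copy-activity fault tolerance: rows that fail conversion go to a
# reject list for later investigation instead of failing the batch.
def copy_with_fault_tolerance(rows, convert, rejects):
    loaded = []
    for row in rows:
        try:
            loaded.append(convert(row))
        except (ValueError, TypeError) as exc:
            # In ADF, this is the row redirected to the configured storage.
            rejects.append({"row": row, "error": str(exc)})
    return loaded

rows = [{"id": "1"}, {"id": "oops"}, {"id": "3"}]
rejects = []
loaded = copy_with_fault_tolerance(rows, lambda r: int(r["id"]), rejects)
# loaded -> [1, 3]; rejects holds the row with id "oops"
```

The valuable part is the reject record: it pinpoints exactly which input row triggered the database error.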
### Error code: SqlUnauthorizedAccess

- **Message**: `Cannot connect to '%connectorName;'. Detail Message: '%message;'`

- **Cause**: The credential is incorrect, or the login account cannot access the SQL database.

- **Recommendation**: Check that the login account has sufficient permission to access the SQL database.

### Error code: SqlOpenConnectionTimeout

- **Message**: `Open connection to database timeout after '%timeoutValue;' seconds.`

- **Cause**: This could be a transient SQL database failure.

- **Recommendation**: Retry, and update the linked service connection string with a larger connection timeout value.

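Retrying a transient connection failure with backoff can be sketched as follows. This is an illustrative Python sketch; `open_connection` is a stand-in for whatever client call you use, not an ADF or SQL client API:

```python
import time

def connect_with_retry(open_connection, attempts=3, base_delay=1.0):
    """Retry a flaky connection with exponential backoff between attempts."""
    for attempt in range(attempts):
        try:
            return open_connection()
        except TimeoutError:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the failure
            time.sleep(base_delay * (2 ** attempt))  # back off before retrying

# Fake connection that fails once, then succeeds, to show the behavior.
state = {"calls": 0}
def flaky():
    state["calls"] += 1
    if state["calls"] < 2:
        raise TimeoutError("transient")
    return "connection"

assert connect_with_retry(flaky, base_delay=0.0) == "connection"
```

For genuinely transient failures a single retry usually succeeds; if every attempt times out, raising the larger connection timeout in the linked service is the next step.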
### Error code: SqlAutoCreateTableTypeMapFailed

- **Message**: `Type '%dataType;' in source side cannot be mapped to a type that supported by sink side(colunm name:'%colunmName;') in auto-create table.`

- **Cause**: Auto-creating the table cannot map the source type to a type supported by the sink.

- **Recommendation**: Update the column type in 'mappings', or manually create the sink table in the target server.

### Error code: SqlDataTypeNotSupported

- **Message**: `A database operation failed. Please check the SQL errors.`

- **Cause**: If the issue happens on the SQL source and the error is related to SqlDateTime overflow, the data value is outside the logical type range (1/1/1753 12:00:00 AM - 12/31/9999 11:59:59 PM).

- **Recommendation**: Cast the type to string in the source SQL query, or change the column type to 'String' in the copy activity column mapping.

- **Cause**: If the issue happens on the SQL sink and the error is related to SqlDateTime overflow, the data value is outside the allowed range of the sink table.

- **Recommendation**: Update the corresponding column type in the sink table to 'datetime2'.

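The SqlDateTime overflow above comes from the `datetime` range quoted in the cause (1753-01-01 through 9999-12-31), which is narrower than `datetime2` (from year 1). A quick pre-load range check can be sketched like this (illustrative Python; the sample values are made up):

```python
from datetime import datetime

# SQL Server 'datetime' bounds per the range above; 'datetime2' starts at year 1.
SQL_DATETIME_MIN = datetime(1753, 1, 1)
SQL_DATETIME_MAX = datetime(9999, 12, 31, 23, 59, 59)

def fits_sql_datetime(value: datetime) -> bool:
    """True if value can be stored in a SQL Server 'datetime' column."""
    return SQL_DATETIME_MIN <= value <= SQL_DATETIME_MAX

# A year-0001 placeholder date overflows 'datetime' but would fit 'datetime2'.
assert not fits_sql_datetime(datetime(1, 1, 1))
assert fits_sql_datetime(datetime(2019, 11, 26))
```

Values that fail this check are the ones that need either a string cast at the source or a `datetime2` column at the sink.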
### Error code: SqlInvalidDbStoredProcedure

- **Message**: `The specified Stored Procedure is not valid. It could be caused by that the stored procedure doesn't return any data. Invalid Stored Procedure script: '%scriptName;'.`

- **Cause**: The specified stored procedure is not valid, possibly because it doesn't return any data.

- **Recommendation**: Validate the stored procedure with SQL tools. Make sure it can return data.

### Error code: SqlInvalidDbQueryString

- **Message**: `The specified SQL Query is not valid. It could be caused by that the query doesn't return any data. Invalid query: '%query;'`

- **Cause**: The specified SQL query is not valid, possibly because it doesn't return any data.

- **Recommendation**: Validate the SQL query with SQL tools. Make sure it can return data.

### Error code: SqlInvalidColumnName

- **Message**: `Column '%column;' does not exist in the table '%tableName;', ServerName: '%serverName;', DatabaseName: '%dbName;'.`

- **Cause**: The column can't be found, possibly because the configuration is wrong.

- **Recommendation**: Verify the column in the query, the 'structure' in the dataset, and the 'mappings' in the activity.

### Error code: SqlBatchWriteTimeout

- **Message**: `Timeout in SQL write opertaion.`

- **Cause**: This could be a transient SQL database failure.

- **Recommendation**: Retry. If the problem persists, contact Azure SQL support.

### Error code: SqlBatchWriteRollbackFailed

- **Message**: `Timeout in SQL write operation and rollback also fail.`

- **Cause**: This could be a transient SQL database failure.

- **Recommendation**: Retry, and update the linked service connection string with a larger connection timeout value.

### Error code: SqlBulkCopyInvalidColumnLength

- **Message**: `SQL Bulk Copy failed due to received an invalid column length from the bcp client.`

- **Cause**: SQL bulk copy failed because the bcp client received an invalid column length.

- **Recommendation**: To identify which row encountered the problem, enable the fault tolerance feature on the copy activity, which can redirect problematic rows to storage for further investigation. Reference doc: https://docs.microsoft.com/azure/data-factory/copy-activity-fault-tolerance.

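A pre-flight check for this class of error — source string values longer than the sink column's declared length — can be sketched as follows. This is an illustrative Python sketch; the schema and rows are made up:

```python
# Flag rows whose string values exceed the sink column's declared length,
# the usual trigger for "invalid column length from the bcp client".
def oversized_values(rows, max_lengths):
    problems = []
    for i, row in enumerate(rows):
        for col, limit in max_lengths.items():
            value = row.get(col)
            if isinstance(value, str) and len(value) > limit:
                problems.append((i, col, len(value)))  # row index, column, actual length
    return problems

sink_schema = {"name": 10}  # e.g. the sink column is name VARCHAR(10)
rows = [{"name": "ok"}, {"name": "this value is far too long"}]
assert oversized_values(rows, sink_schema) == [(1, "name", 26)]
```

Running a check like this against a sample of the source data narrows the mismatch to specific columns before re-running the copy.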
### Error code: SqlConnectionIsClosed

- **Message**: `The connection is closed by SQL database.`

- **Cause**: The SQL database closed the connection, which can happen under highly concurrent runs when the server terminates connections.

- **Recommendation**: The remote server closed the SQL connection. Retry. If the problem persists, contact Azure SQL support.

### Error message: Conversion failed when converting from a character string to uniqueidentifier

- **Symptoms**: When you copy data from a tabular data source (such as SQL Server) into Azure SQL Data Warehouse using staged copy and PolyBase, you hit the following error:

```
ErrorCode=FailedDbOperation,Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,
Message=Error happened when loading data into SQL Data Warehouse.,
Source=Microsoft.DataTransfer.ClientLibrary,Type=System.Data.SqlClient.SqlException,
Message=Conversion failed when converting from a character string to uniqueidentifier...
```

- **Cause**: Azure SQL Data Warehouse PolyBase cannot convert an empty string to a GUID.

- **Resolution**: In the Copy activity sink, under PolyBase settings, set the "**use type default**" option to false.

### Error message: Expected data type: DECIMAL(x,x), Offending value

- **Symptoms**: When you copy data from a tabular data source (such as SQL Server) into SQL DW using staged copy and PolyBase, you hit the following error:

```
ErrorCode=FailedDbOperation,Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,
Message=Error happened when loading data into SQL Data Warehouse.,
Source=Microsoft.DataTransfer.ClientLibrary,Type=System.Data.SqlClient.SqlException,
Message=Query aborted-- the maximum reject threshold (0 rows) was reached while reading from an external source: 1 rows rejected out of total 415 rows processed. (/file_name.txt)
Column ordinal: 18, Expected data type: DECIMAL(x,x), Offending value:..
```

- **Cause**: Azure SQL Data Warehouse PolyBase cannot insert an empty string (null value) into a decimal column.

- **Resolution**: In the Copy activity sink, under PolyBase settings, set the "**use type default**" option to false.

### Error message: Java exception message:HdfsBridge::CreateRecordReader

- **Symptoms**: You copy data into Azure SQL Data Warehouse using PolyBase and hit the following error:

```
Message=110802;An internal DMS error occurred that caused this operation to fail.
Details: Exception: Microsoft.SqlServer.DataWarehouse.DataMovement.Common.ExternalAccess.HdfsAccessException,
Message: Java exception raised on call to HdfsBridge_CreateRecordReader.
Java exception message:HdfsBridge::CreateRecordReader - Unexpected error encountered creating the record reader.: Error [HdfsBridge::CreateRecordReader - Unexpected error encountered creating the record reader.] occurred while accessing external file.....
```

- **Cause**: The schema (total column width) may be too large (larger than 1 MB). Check the schema of the target SQL DW table by adding up the size of all columns:

  - Int -> 4 bytes
  - Bigint -> 8 bytes
  - Varchar(n), char(n), binary(n), varbinary(n) -> n bytes
  - Nvarchar(n), nchar(n) -> n*2 bytes
  - Date -> 6 bytes
  - Datetime/(2), smalldatetime -> 16 bytes
  - Datetimeoffset -> 20 bytes
  - Decimal -> 19 bytes
  - Float -> 8 bytes
  - Money -> 8 bytes
  - Smallmoney -> 4 bytes
  - Real -> 4 bytes
  - Smallint -> 2 bytes
  - Time -> 12 bytes
  - Tinyint -> 1 byte

- **Resolution**: Reduce the column width to less than 1 MB, or use the bulk insert approach by disabling PolyBase.

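The per-type sizes listed above can be added up mechanically. A sketch of the 1-MB schema-width check (illustrative Python; the size table follows the list above, and the example schema is made up):

```python
# Approximate row width from the byte sizes listed above (PolyBase limit: 1 MB).
FIXED = {"int": 4, "bigint": 8, "date": 6, "datetime": 16, "datetime2": 16,
         "smalldatetime": 16, "datetimeoffset": 20, "decimal": 19, "float": 8,
         "money": 8, "smallmoney": 4, "real": 4, "smallint": 2, "time": 12,
         "tinyint": 1}

def row_width(schema):
    """schema: list of (type_name, n) tuples; n is the declared length or None."""
    total = 0
    for type_name, n in schema:
        t = type_name.lower()
        if t in ("varchar", "char", "binary", "varbinary"):
            total += n           # n bytes
        elif t in ("nvarchar", "nchar"):
            total += n * 2       # n*2 bytes
        else:
            total += FIXED[t]    # fixed-size types from the table above
    return total

schema = [("int", None), ("nvarchar", 4000), ("varchar", 8000)]
width = row_width(schema)        # 4 + 8000 + 8000 = 16004 bytes
assert width == 16004 and width < 1_048_576
```

If the computed total exceeds 1,048,576 bytes, shrink the widest declared columns or fall back to bulk insert as the resolution suggests.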
### Error message: The condition specified using HTTP conditional header(s) is not met

- **Symptoms**: You use a SQL query to pull data from Azure SQL Data Warehouse and hit the following error:

```
...StorageException: The condition specified using HTTP conditional header(s) is not met...
```

- **Cause**: Azure SQL Data Warehouse hit an issue while querying the external table in Azure Storage.

- **Resolution**: Run the same query in SSMS and check whether you see the same result. If so, open a support ticket for Azure SQL Data Warehouse and provide your SQL DW server and database name for further troubleshooting.

## Azure Blob Storage

### Error code: AzureBlobOperationFailed

- **Message**: `Blob operation Failed. ContainerName: %containerName;, path: %path;.`

- **Cause**: The blob storage operation hit a problem.

- **Recommendation**: Check the detailed error. Refer to the blob help document: https://docs.microsoft.com/rest/api/storageservices/blob-service-error-codes. Contact the storage team if you need help.

## Azure Data Lake Gen2

### Error code: AdlsGen2OperationFailed

- **Message**: `ADLS Gen2 operation failed for: %adlsGen2Message;.%exceptionData;.`

- **Cause**: ADLS Gen2 throws the error indicating that the operation failed.

- **Recommendation**: Check the detailed error message thrown by ADLS Gen2. If it's caused by a transient failure, retry. If you need further help, contact Azure Storage support and provide the request ID from the error message.

- **Cause**: When the error message contains 'Forbidden', the service principal or managed identity you use may not have enough permission to access ADLS Gen2.

- **Recommendation**: Refer to the help document: https://docs.microsoft.com/azure/data-factory/connector-azure-data-lake-storage#service-principal-authentication.

- **Cause**: When the error message contains 'InternalServerError', the error is returned by ADLS Gen2.

- **Recommendation**: It may be caused by a transient failure; retry. If the issue persists, contact Azure Storage support and provide the request ID from the error message.

## Next steps

@@ -246,6 +411,4 @@ For more troubleshooting help, try these resources:

* [MSDN forum](https://social.msdn.microsoft.com/Forums/home?sort=relevancedesc&brandIgnore=True&searchTerm=data+factory)
* [Stack Overflow forum for Data Factory](https://stackoverflow.com/questions/tagged/azure-data-factory)
* [Twitter information about Data Factory](https://twitter.com/hashtag/DataFactory)
