Commit 976ee5c

Update connector-troubleshoot-guide.md
Provide more suggestions for azure data factory connectors, include Azure SQL Data Warehouse, Azure SQL Database, SQL Server, Azure Data Lake Gen2, Azure Blob Storage
1 parent 8a3e8e9 commit 976ee5c

File tree

1 file changed: +247 −83 lines changed

articles/data-factory/connector-troubleshoot-guide.md

Lines changed: 247 additions & 83 deletions
@@ -1,11 +1,12 @@
+
 ---
 title: Troubleshoot Azure Data Factory Connectors
 description: Learn how to troubleshoot connector issues in Azure Data Factory.
 services: data-factory
 author: linda33wj
 ms.service: data-factory
 ms.topic: troubleshooting
-ms.date: 08/26/2019
+ms.date: 11/25/2019
 ms.author: jingwang
 ms.reviewer: craigg
 ---
@@ -43,84 +44,6 @@ busy to handle requests, it returns an HTTP error 503.
 
 - **Resolution**: Rerun the copy activity after several minutes.
 
-## Azure SQL Data Warehouse
-
-### Error message: Conversion failed when converting from a character string to uniqueidentifier
-
-- **Symptoms**: When you copy data from tabular data source (such as SQL Server) into Azure SQL Data Warehouse using staged copy and PolyBase, you hit the following error:
-
-```
-ErrorCode=FailedDbOperation,Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,
-Message=Error happened when loading data into SQL Data Warehouse.,
-Source=Microsoft.DataTransfer.ClientLibrary,Type=System.Data.SqlClient.SqlException,
-Message=Conversion failed when converting from a character string to uniqueidentifier...
-```
-
-- **Cause**: Azure SQL Data Warehouse PolyBase cannot convert empty string to GUID.
-
-- **Resolution**: In Copy activity sink, under Polybase settings, set "**use type default**" option to false.
-
-### Error message: Expected data type: DECIMAL(x,x), Offending value
-
-- **Symptoms**: When you copy data from tabular data source (such as SQL Server) into SQL DW using staged copy and PolyBase, you hit the following error:
-
-```
-ErrorCode=FailedDbOperation,Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,
-Message=Error happened when loading data into SQL Data Warehouse.,
-Source=Microsoft.DataTransfer.ClientLibrary,Type=System.Data.SqlClient.SqlException,
-Message=Query aborted-- the maximum reject threshold (0 rows) was reached while reading from an external source: 1 rows rejected out of total 415 rows processed. (/file_name.txt)
-Column ordinal: 18, Expected data type: DECIMAL(x,x), Offending value:..
-```
-
-- **Cause**: Azure SQL Data Warehouse Polybase cannot insert empty string (null value) into decimal column.
-
-- **Resolution**: In Copy activity sink, under Polybase settings, set "**use type default**" option to false.
-
-### Error message: Java exception message:HdfsBridge::CreateRecordReader
-
-- **Symptoms**: You copy data into Azure SQL Data Warehouse using PolyBase, and hit the following error:
-
-```
-Message=110802;An internal DMS error occurred that caused this operation to fail.
-Details: Exception: Microsoft.SqlServer.DataWarehouse.DataMovement.Common.ExternalAccess.HdfsAccessException,
-Message: Java exception raised on call to HdfsBridge_CreateRecordReader.
-Java exception message:HdfsBridge::CreateRecordReader - Unexpected error encountered creating the record reader.: Error [HdfsBridge::CreateRecordReader - Unexpected error encountered creating the record reader.] occurred while accessing external file.....
-```
-
-- **Cause**: The possible cause is that the schema (total column width) being too large (larger than 1 MB). Check the schema of the target SQL DW table by adding the size of all columns:
-
-- Int -> 4 bytes
-- Bigint -> 8 bytes
-- Varchar(n),char(n),binary(n), varbinary(n) -> n bytes
-- Nvarchar(n), nchar(n) -> n*2 bytes
-- Date -> 6 bytes
-- Datetime/(2), smalldatetime -> 16 bytes
-- Datetimeoffset -> 20 bytes
-- Decimal -> 19 bytes
-- Float -> 8 bytes
-- Money -> 8 bytes
-- Smallmoney -> 4 bytes
-- Real -> 4 bytes
-- Smallint -> 2 bytes
-- Time -> 12 bytes
-- Tinyint -> 1 byte
-
-- **Resolution**: Reduce column width to be less than 1 MB
-
-- Or use bulk insert approach by disabling Polybase
-
-### Error message: The condition specified using HTTP conditional header(s) is not met
-
-- **Symptoms**: You use SQL query to pull data from Azure SQL Data Warehouse and hit the following error:
-
-```
-...StorageException: The condition specified using HTTP conditional header(s) is not met...
-```
-
-- **Cause**: Azure SQL Data Warehouse hit issue querying the external table in Azure Storage.
-
-- **Resolution**: Run the same query in SSMS and check if you see the same result. If yes, open a support ticket to Azure SQL Data Warehouse and provide your SQL DW server and database name to further troubleshoot.
-
 ## Azure Cosmos DB
 
 ### Error message: Request size is too large
@@ -137,7 +60,7 @@ busy to handle requests, it returns an HTTP error 503.
 
 ```
 Message=Partition range id 0 | Failed to import mini-batch.
-Exception was Message: {"Errors":["Encountered exception while executing function. Exception = Error: {\"Errors\":[\"Unique index constraint violation.\"]}...
+Exception was Message: {"Errors":["Encountered exception while executing function. Exception = Error: {\"Errors\":[\"Unique index constraint violation.\"]}...
 ```
 
 - **Cause**: There are two possible causes:
@@ -235,6 +158,249 @@ Cosmos DB calculates RU from [here](../cosmos-db/request-units.md#request-unit-c
 ```
 
 - For cause #3, double check if the key file or password is correct using other tools to validate if you can use it to access the SFTP server properly.
+
+
+## Azure SQL Data Warehouse \ Azure SQL Database \ SQL Server
+
+### Error code: SqlFailedToConnect
+
+- **Message**: `Cannot connect to SQL database: '%server;', Database: '%database;', User: '%user;'. Please check the linked service configuration is correct, and make sure the SQL database firewall allows the integration runtime to access.`
+
+- **Cause**: If the error message contains "SqlException", the SQL database threw an error indicating that a specific operation failed.
+
+- **Recommendation**: Search for the SQL error code in this reference doc for more details: https://docs.microsoft.com/en-us/sql/relational-databases/errors-events/database-engine-events-and-errors. If you need further help, contact Azure SQL support.
+
+- **Cause**: If the error message contains "Client with IP address '...' is not allowed to access the server" and you are trying to connect to Azure SQL Database, the cause is usually an Azure SQL Database firewall issue.
+
+- **Recommendation**: In the Azure SQL Server firewall configuration, enable the "Allow Azure services and resources to access this server" option. Reference doc: https://docs.microsoft.com/en-us/azure/sql-database/sql-database-firewall-configure.
+
+
+### Error code: SqlOperationFailed
+
+- **Message**: `A database operation failed. Please search error to get more details.`
+
+- **Cause**: If the error message contains "SqlException", the SQL database threw an error indicating that a specific operation failed.
+
+- **Recommendation**: Search for the SQL error code in this reference doc for more details: https://docs.microsoft.com/en-us/sql/relational-databases/errors-events/database-engine-events-and-errors. If you need further help, contact Azure SQL support.
+
+- **Cause**: If the error message contains "PdwManagedToNativeInteropException", it is usually caused by a mismatch between the source and sink column sizes.
+
+- **Recommendation**: Check the sizes of both the source and sink columns. If you need further help, contact Azure SQL support.
+
+- **Cause**: If the error message contains "InvalidOperationException", it is usually caused by invalid input data.
+
+- **Recommendation**: To identify which row causes the problem, enable the fault tolerance feature on the copy activity, which can redirect problematic rows to storage for further investigation. Reference doc: https://docs.microsoft.com/en-us/azure/data-factory/copy-activity-fault-tolerance.
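A minimal sketch of the fault-tolerance settings this recommendation points to, shown as a Python dict with the shape of a copy activity's `typeProperties` (the property names follow the fault-tolerance doc linked above; the linked service name and path are hypothetical):

```python
# Sketch of copy activity typeProperties with fault tolerance enabled.
# "AzureBlobLinkedService" and the redirect path are made-up examples.
copy_activity_type_properties = {
    "source": {"type": "SqlSource"},
    "sink": {"type": "SqlSink"},
    "enableSkipIncompatibleRow": True,  # skip rows that fail conversion
    "redirectIncompatibleRowSettings": {
        "linkedServiceName": "AzureBlobLinkedService",
        "path": "redirectcontainer/errorlog",  # skipped rows are logged here
    },
}
print(copy_activity_type_properties["enableSkipIncompatibleRow"])
```

With these settings the copy run completes and writes the incompatible rows to the redirect path, so you can inspect exactly which input values caused the failure.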
+
+
+### Error code: SqlUnauthorizedAccess
+
+- **Message**: `Cannot connect to '%connectorName;'. Detail Message: '%message;'`
+
+- **Cause**: The credentials are incorrect, or the login account cannot access the SQL database.
+
+- **Recommendation**: Check that the login account has sufficient permission to access the SQL database.
+
+
+### Error code: SqlOpenConnectionTimeout
+
+- **Message**: `Open connection to database timeout after '%timeoutValue;' seconds.`
+
+- **Cause**: This could be a transient failure of the SQL database.
+
+- **Recommendation**: Retry the operation, and update the linked service connection string with a larger connection timeout value.
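For illustration, a small sketch of raising the `Connection Timeout` setting in a linked service connection string (the server, database, and credential values are placeholders, not real endpoints):

```python
# Sketch: increase "Connection Timeout" in a SQL connection string.
# The server/database/user values below are hypothetical placeholders.
base = ("Server=tcp:myserver.database.windows.net,1433;"
        "Database=mydb;User ID=myuser;Password=<password>;")

def with_timeout(conn_str: str, seconds: int) -> str:
    """Append (or override) the Connection Timeout setting."""
    parts = [p for p in conn_str.split(";")
             if p and not p.startswith("Connection Timeout")]
    parts.append(f"Connection Timeout={seconds}")
    return ";".join(parts) + ";"

print(with_timeout(base, 120))  # raised from the 30-second default
```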
+
+
+### Error code: SqlAutoCreateTableTypeMapFailed
+
+- **Message**: `Type '%dataType;' in source side cannot be mapped to a type that supported by sink side(colunm name:'%colunmName;') in auto-create table.`
+
+- **Cause**: Automatic table creation cannot map the source type to a type supported by the sink.
+
+- **Recommendation**: Update the column type in 'mappings', or manually create the sink table in the target server.
+
+
+### Error code: SqlDataTypeNotSupported
+
+- **Message**: `A database operation failed. Please check the SQL errors.`
+
+- **Cause**: If the issue happens on the SQL source and the error is related to SqlDateTime overflow, the data value is outside the logical type range (1/1/1753 12:00:00 AM - 12/31/9999 11:59:59 PM).
+
+- **Recommendation**: Cast the type to string in the source SQL query, or change the column type to 'String' in the copy activity column mapping.
+
+- **Cause**: If the issue happens on the SQL sink and the error is related to SqlDateTime overflow, the data value is outside the allowed range of the sink table.
+
+- **Recommendation**: Update the corresponding column to the 'datetime2' type in the sink table.
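A quick way to see why such values overflow: the SQL `datetime` (SqlDateTime) range starts at 1753, while `datetime2` (like Python's `datetime`) goes back much further. A minimal range check:

```python
from datetime import datetime

# SQL 'datetime' (SqlDateTime) bounds quoted in the error above.
SQL_DATETIME_MIN = datetime(1753, 1, 1)
SQL_DATETIME_MAX = datetime(9999, 12, 31, 23, 59, 59)

def fits_sql_datetime(value: datetime) -> bool:
    """Return True if the value is representable as SQL 'datetime'."""
    return SQL_DATETIME_MIN <= value <= SQL_DATETIME_MAX

print(fits_sql_datetime(datetime(1600, 1, 1)))    # needs 'datetime2'
print(fits_sql_datetime(datetime(2019, 11, 25)))  # fits 'datetime'
```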
+
+
+### Error code: SqlInvalidDbStoredProcedure
+
+- **Message**: `The specified Stored Procedure is not valid. It could be caused by that the stored procedure doesn't return any data. Invalid Stored Procedure script: '%scriptName;'.`
+
+- **Cause**: The specified stored procedure is not valid, possibly because it doesn't return any data.
+
+- **Recommendation**: Validate the stored procedure with SQL tools. Make sure the stored procedure can return data.
+
+
+### Error code: SqlInvalidDbQueryString
+
+- **Message**: `The specified SQL Query is not valid. It could be caused by that the query doesn't return any data. Invalid query: '%query;'`
+
+- **Cause**: The specified SQL query is not valid, possibly because it doesn't return any data.
+
+- **Recommendation**: Validate the SQL query with SQL tools. Make sure the query can return data.
+
+
+### Error code: SqlInvalidColumnName
+
+- **Message**: `Column '%column;' does not exist in the table '%tableName;', ServerName: '%serverName;', DatabaseName: '%dbName;'.`
+
+- **Cause**: The column cannot be found, possibly because the configuration is wrong.
+
+- **Recommendation**: Verify the column in the query, the 'structure' in the dataset, and the 'mappings' in the activity.
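As an illustration of the 'mappings' referred to above, a sketch of a copy activity translator fragment in the documented `TabularTranslator` shape (the column names here are made up):

```python
# Sketch of a copy activity "translator" with explicit column mappings;
# the source/sink column names are hypothetical examples.
translator = {
    "type": "TabularTranslator",
    "mappings": [
        {"source": {"name": "Id"},       "sink": {"name": "CustomerId"}},
        {"source": {"name": "FullName"}, "sink": {"name": "Name"}},
    ],
}
# Every sink column named here must exist in the target table;
# otherwise the copy fails with SqlInvalidColumnName.
print([m["sink"]["name"] for m in translator["mappings"]])
```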
+
+
+### Error code: SqlBatchWriteTimeout
+
+- **Message**: `Timeout in SQL write opertaion.`
+
+- **Cause**: This could be a transient failure of the SQL database.
+
+- **Recommendation**: Retry the operation. If the problem recurs, contact Azure SQL support.
+
+
+### Error code: SqlBatchWriteRollbackFailed
+
+- **Message**: `Timeout in SQL write operation and rollback also fail.`
+
+- **Cause**: This could be a transient failure of the SQL database.
+
+- **Recommendation**: Retry the operation, and update the linked service connection string with a larger connection timeout value.
+
+
+### Error code: SqlBulkCopyInvalidColumnLength
+
+- **Message**: `SQL Bulk Copy failed due to received an invalid column length from the bcp client.`
+
+- **Cause**: SQL Bulk Copy failed because it received an invalid column length from the bulk copy program (bcp) client.
+
+- **Recommendation**: To identify which row causes the problem, enable the fault tolerance feature on the copy activity, which can redirect problematic rows to storage for further investigation. Reference doc: https://docs.microsoft.com/en-us/azure/data-factory/copy-activity-fault-tolerance.
+
+
+### Error code: SqlConnectionIsClosed
+
+- **Message**: `The connection is closed by SQL database.`
+
+- **Cause**: The SQL database closed the connection, typically under highly concurrent runs when the server terminates connections.
+
+- **Recommendation**: The remote server closed the SQL connection. Retry the operation. If the problem recurs, contact Azure SQL support.
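Transient closed-connection and timeout errors like the ones above are usually handled with a retry loop. A minimal, library-agnostic sketch, where `run_copy` is a stand-in for whatever issues the SQL operation:

```python
import time

def retry_transient(operation, attempts=3, base_delay=1.0):
    """Run `operation`, retrying with exponential backoff on failure."""
    for attempt in range(attempts):
        try:
            return operation()
        except Exception:
            if attempt == attempts - 1:
                raise  # retries exhausted: surface the error
            time.sleep(base_delay * (2 ** attempt))

# Stand-in for a copy step that fails twice, then succeeds.
calls = {"n": 0}
def run_copy():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("The connection is closed by SQL database.")
    return "copied"

print(retry_transient(run_copy, base_delay=0.01))
```

Only retry errors that are plausibly transient; permanent failures (bad credentials, missing columns) should surface immediately.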
+
+### Error message: Conversion failed when converting from a character string to uniqueidentifier
+
+- **Symptoms**: When you copy data from a tabular data source (such as SQL Server) into Azure SQL Data Warehouse using staged copy and PolyBase, you hit the following error:
+
+```
+ErrorCode=FailedDbOperation,Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,
+Message=Error happened when loading data into SQL Data Warehouse.,
+Source=Microsoft.DataTransfer.ClientLibrary,Type=System.Data.SqlClient.SqlException,
+Message=Conversion failed when converting from a character string to uniqueidentifier...
+```
+
+- **Cause**: Azure SQL Data Warehouse PolyBase cannot convert an empty string to a GUID.
+
+- **Resolution**: In the Copy activity sink, under PolyBase settings, set the "**use type default**" option to false.
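The resolution above corresponds to the `useTypeDefault` property of the sink's `polyBaseSettings`; a sketch of the relevant sink fragment, with the reject settings shown only as context (their values here are illustrative):

```python
# Sketch of a SQL DW copy sink with PolyBase enabled and
# "use type default" turned off; reject settings are example values.
sql_dw_sink = {
    "type": "SqlDWSink",
    "allowPolyBase": True,
    "polyBaseSettings": {
        "rejectType": "value",
        "rejectValue": 0,
        # With useTypeDefault off, empty strings load as NULL instead of
        # a type default, avoiding the GUID/decimal conversion errors.
        "useTypeDefault": False,
    },
}
print(sql_dw_sink["polyBaseSettings"]["useTypeDefault"])
```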
+
+### Error message: Expected data type: DECIMAL(x,x), Offending value
+
+- **Symptoms**: When you copy data from a tabular data source (such as SQL Server) into SQL DW using staged copy and PolyBase, you hit the following error:
+
+```
+ErrorCode=FailedDbOperation,Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,
+Message=Error happened when loading data into SQL Data Warehouse.,
+Source=Microsoft.DataTransfer.ClientLibrary,Type=System.Data.SqlClient.SqlException,
+Message=Query aborted-- the maximum reject threshold (0 rows) was reached while reading from an external source: 1 rows rejected out of total 415 rows processed. (/file_name.txt)
+Column ordinal: 18, Expected data type: DECIMAL(x,x), Offending value:..
+```
+
+- **Cause**: Azure SQL Data Warehouse PolyBase cannot insert an empty string (null value) into a decimal column.
+
+- **Resolution**: In the Copy activity sink, under PolyBase settings, set the "**use type default**" option to false.
+
+### Error message: Java exception message:HdfsBridge::CreateRecordReader
+
+- **Symptoms**: You copy data into Azure SQL Data Warehouse using PolyBase and hit the following error:
+
+```
+Message=110802;An internal DMS error occurred that caused this operation to fail.
+Details: Exception: Microsoft.SqlServer.DataWarehouse.DataMovement.Common.ExternalAccess.HdfsAccessException,
+Message: Java exception raised on call to HdfsBridge_CreateRecordReader.
+Java exception message:HdfsBridge::CreateRecordReader - Unexpected error encountered creating the record reader.: Error [HdfsBridge::CreateRecordReader - Unexpected error encountered creating the record reader.] occurred while accessing external file.....
+```
+
+- **Cause**: A possible cause is that the schema (total column width) is too large (larger than 1 MB). Check the schema of the target SQL DW table by adding up the sizes of all columns:
+
+- Int -> 4 bytes
+- Bigint -> 8 bytes
+- Varchar(n), char(n), binary(n), varbinary(n) -> n bytes
+- Nvarchar(n), nchar(n) -> n*2 bytes
+- Date -> 6 bytes
+- Datetime/(2), smalldatetime -> 16 bytes
+- Datetimeoffset -> 20 bytes
+- Decimal -> 19 bytes
+- Float -> 8 bytes
+- Money -> 8 bytes
+- Smallmoney -> 4 bytes
+- Real -> 4 bytes
+- Smallint -> 2 bytes
+- Time -> 12 bytes
+- Tinyint -> 1 byte
+
+- **Resolution**: Reduce the total column width to less than 1 MB.
+
+- Or use the bulk insert approach by disabling PolyBase.
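The size check above can be automated; a small sketch using the per-type byte sizes from the list (the sample schema is hypothetical):

```python
# Approximate the total row width of a SQL DW table using the
# per-type byte sizes listed above; the schema below is made up.
TYPE_BYTES = {
    "int": 4, "bigint": 8, "date": 6, "datetime": 16, "smalldatetime": 16,
    "datetimeoffset": 20, "decimal": 19, "float": 8, "money": 8,
    "smallmoney": 4, "real": 4, "smallint": 2, "time": 12, "tinyint": 1,
}

def column_bytes(sql_type: str, length: int = 0) -> int:
    t = sql_type.lower()
    if t in ("varchar", "char", "binary", "varbinary"):
        return length        # n bytes
    if t in ("nvarchar", "nchar"):
        return length * 2    # n*2 bytes
    return TYPE_BYTES[t]

schema = [("id", "bigint", 0), ("name", "nvarchar", 4000),
          ("payload", "varchar", 8000)]
total = sum(column_bytes(t, n) for _, t, n in schema)
print(total, total > 1_048_576)  # compare against the 1 MB limit
```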
+
+### Error message: The condition specified using HTTP conditional header(s) is not met
+
+- **Symptoms**: You use a SQL query to pull data from Azure SQL Data Warehouse and hit the following error:
+
+```
+...StorageException: The condition specified using HTTP conditional header(s) is not met...
+```
+
+- **Cause**: Azure SQL Data Warehouse encountered an issue while querying the external table in Azure Storage.
+
+- **Resolution**: Run the same query in SSMS and check whether you see the same result. If so, open a support ticket to Azure SQL Data Warehouse and provide your SQL DW server and database name for further troubleshooting.
+
+
+## Azure Blob Storage
+
+### Error code: AzureBlobOperationFailed
+
+- **Message**: `Blob operation Failed. ContainerName: %containerName;, path: %path;.`
+
+- **Cause**: The Blob storage operation encountered a problem.
+
+- **Recommendation**: Check the error in detail. Refer to the Blob storage help document: https://docs.microsoft.com/en-us/rest/api/storageservices/blob-service-error-codes. Contact the storage team if you need help.
+
+
+
+## Azure Data Lake Gen2
+
+### Error code: AdlsGen2OperationFailed
+
+- **Message**: `ADLS Gen2 operation failed for: %adlsGen2Message;.%exceptionData;.`
+
+- **Cause**: ADLS Gen2 threw an error indicating that the operation failed.
+
+- **Recommendation**: Check the detailed error message thrown by ADLS Gen2. If it's caused by a transient failure, retry. If you need further help, contact Azure Storage support and provide the request ID from the error message.
+
+- **Cause**: If the error message contains 'Forbidden', the service principal or managed identity you use may not have enough permission to access ADLS Gen2.
+
+- **Recommendation**: Refer to the help document: https://docs.microsoft.com/en-us/azure/data-factory/connector-azure-data-lake-storage#service-principal-authentication.
+
+- **Cause**: If the error message contains 'InternalServerError', the error was returned by ADLS Gen2.
+
+- **Recommendation**: It may be caused by a transient failure; retry. If the issue persists, contact Azure Storage support and provide the request ID from the error message.
+
 
 ## Next steps
 
@@ -246,6 +412,4 @@ For more troubleshooting help, try these resources:
 * [MSDN forum](https://social.msdn.microsoft.com/Forums/home?sort=relevancedesc&brandIgnore=True&searchTerm=data+factory)
 * [Stack Overflow forum for Data Factory](https://stackoverflow.com/questions/tagged/azure-data-factory)
 * [Twitter information about Data Factory](https://twitter.com/hashtag/DataFactory)
-
-
-
+
