articles/data-factory/connector-troubleshoot-synapse-sql.md (33 additions, 40 deletions)
@@ -1,14 +1,17 @@
 ---
 title: Troubleshoot the Azure Synapse Analytics, Azure SQL Database, and SQL Server connectors
 titleSuffix: Azure Data Factory & Azure Synapse
-description: Learn how to troubleshoot issues with the Azure Synapse Analytics, Azure SQL Database, and SQL Server connectors in Azure Data Factory and Azure Synapse Analytics.
+description: Learn how to troubleshoot issues with the Azure Synapse Analytics, Azure SQL Database, and SQL Server connectors in Azure Data Factory and Azure Synapse Analytics.
 author: jianleishen
+ms.author: jianleishen
+ms.reviewer: joanpo, wiassaf
+ms.date: 09/02/2022
 ms.service: data-factory
 ms.subservice: data-movement
 ms.topic: troubleshooting
-ms.date: 06/29/2022
-ms.author: jianleishen
-ms.custom: has-adal-ref, synapse
+ms.custom:
+  - has-adal-ref
+  - synapse
 ---
 
 # Troubleshoot the Azure Synapse Analytics, Azure SQL Database, and SQL Server connectors in Azure Data Factory and Azure Synapse
@@ -30,7 +33,7 @@ This article provides suggestions to troubleshoot common problems with the Azure
 | If the error message contains the string "SqlException", SQL Database the error indicates that some specific operation failed. | For more information, search by SQL error code in [Database engine errors](/sql/relational-databases/errors-events/database-engine-events-and-errors). For further help, contact Azure SQL support. |
 | If this is a transient issue (for example, an instable network connection), add retry in the activity policy to mitigate. | For more information, see [Pipelines and activities](./concepts-pipelines-activities.md#activity-policy). |
 | If the error message contains the string "Client with IP address '...' is not allowed to access the server", and you're trying to connect to Azure SQL Database, the error is usually caused by an Azure SQL Database firewall issue. | In the Azure SQL Server firewall configuration, enable the **Allow Azure services and resources to access this server** option. For more information, see [Azure SQL Database and Azure Synapse IP firewall rules](/azure/azure-sql/database/firewall-configure). |
-
+
 ## Error code: SqlOperationFailed
 
 - **Message**: `A database operation failed. Please search error to get more details.`
@@ -44,7 +47,6 @@ This article provides suggestions to troubleshoot common problems with the Azure
 | If the error message contains the string "InvalidOperationException", it's usually caused by invalid input data. | To identify which row has encountered the problem, enable the fault tolerance feature on the copy activity, which can redirect problematic rows to the storage for further investigation. For more information, see [Fault tolerance of copy activity](./copy-activity-fault-tolerance.md). |
 | If the error message contains "Execution Timeout Expired", it's usually caused by query timeout. | Configure **Query timeout** in the source and **Write batch timeout** in the sink to increase timeout. |
 
-
 ## Error code: SqlUnauthorizedAccess
 
 - **Message**: `Cannot connect to '%connectorName;'. Detail Message: '%message;'`
@@ -53,7 +55,6 @@ This article provides suggestions to troubleshoot common problems with the Azure
 
 - **Recommendation**: Check to ensure that the login account has sufficient permissions to access the SQL database.
 
-
 ## Error code: SqlOpenConnectionTimeout
 
 - **Message**: `Open connection to database timeout after '%timeoutValue;' seconds.`
@@ -62,7 +63,6 @@ This article provides suggestions to troubleshoot common problems with the Azure
 
 - **Recommendation**: Retry the operation to update the linked service connection string with a larger connection timeout value.
 
-
 ## Error code: SqlAutoCreateTableTypeMapFailed
 
 - **Message**: `Type '%dataType;' in source side cannot be mapped to a type that supported by sink side(column name:'%columnName;') in autocreate table.`
@@ -71,7 +71,6 @@ This article provides suggestions to troubleshoot common problems with the Azure
 
 - **Recommendation**: Update the column type in *mappings*, or manually create the sink table in the target server.
 
-
 ## Error code: SqlDataTypeNotSupported
 
 - **Message**: `A database operation failed. Check the SQL errors.`
@@ -84,7 +83,6 @@ This article provides suggestions to troubleshoot common problems with the Azure
 
 - **Recommendation**: Update the corresponding column type to the *datetime2* type in the sink table.
 
-
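As a sketch of the recommendation above, widening a sink column to *datetime2* could look like the following; the table and column names are hypothetical, not part of the diffed article:

```sql
-- Hypothetical sink table and column: widen the column to datetime2,
-- which accepts a larger date range than datetime (back to year 0001).
ALTER TABLE dbo.SinkTable
ALTER COLUMN EventDate datetime2(7) NULL;
```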
 ## Error code: SqlInvalidDbStoredProcedure
 
 - **Message**: `The specified Stored Procedure is not valid. It could be caused by that the stored procedure doesn't return any data. Invalid Stored Procedure script: '%scriptName;'.`
@@ -104,7 +102,6 @@ This article provides suggestions to troubleshoot common problems with the Azure
 
 - **Recommendation**: Validate the SQL query by using SQL Tools. Make sure that the query can return data.
 
-
 ## Error code: SqlInvalidColumnName
 
 - **Message**: `Column '%column;' does not exist in the table '%tableName;', ServerName: '%serverName;', DatabaseName: '%dbName;'.`
@@ -113,7 +110,6 @@ This article provides suggestions to troubleshoot common problems with the Azure
 
 - **Recommendation**: Verify the column in the query, *structure* in the dataset, and *mappings* in the activity.
 
-
 ## Error code: SqlBatchWriteTimeout
 
 - **Message**: `Timeouts in SQL write operation.`
@@ -122,7 +118,6 @@ This article provides suggestions to troubleshoot common problems with the Azure
 
 - **Recommendation**: Retry the operation. If the problem persists, contact Azure SQL support.
 
-
 ## Error code: SqlBatchWriteTransactionFailed
 
 - **Message**: `SQL transaction commits failed.`
@@ -135,7 +130,6 @@ This article provides suggestions to troubleshoot common problems with the Azure
 
 - **Recommendation**: Retry the activity and review the SQL database side metrics.
 
-
 ## Error code: SqlBulkCopyInvalidColumnLength
 
 - **Message**: `SQL Bulk Copy failed due to receive an invalid column length from the bcp client.`
@@ -144,7 +138,6 @@ This article provides suggestions to troubleshoot common problems with the Azure
 
 - **Recommendation**: To identify which row has encountered the problem, enable the fault tolerance feature on the copy activity. This can redirect problematic rows to the storage for further investigation. For more information, see [Fault tolerance of copy activity](./copy-activity-fault-tolerance.md).
 
-
 ## Error code: SqlConnectionIsClosed
 
 - **Message**: `The connection is closed by SQL Database.`
@@ -159,15 +152,15 @@ This article provides suggestions to troubleshoot common problems with the Azure
 
 - **Cause**: The linked service was not configured properly.
 
-- **Recommendation**: Validate and fix the SQL server linked service.
+- **Recommendation**: Validate and fix the SQL server linked service.
@@ -189,37 +182,35 @@ This article provides suggestions to troubleshoot common problems with the Azure
 
 - **Symptoms**: When you copy data from a tabular data source (such as SQL Server) into Azure Synapse Analytics using staged copy and PolyBase, you receive the following error:
 
     `Message=Conversion failed when converting from a character string to uniqueidentifier...`
 
 - **Cause**: Azure Synapse Analytics PolyBase can't convert an empty string to a GUID.
 
 - **Resolution**: In the copy activity sink, under PolyBase settings, set the **use type default** option to *false*.
 
-
 ## Error message: Expected data type: DECIMAL(x,x), Offending value
 
 - **Symptoms**: When you copy data from a tabular data source (such as SQL Server) into Azure Synapse Analytics by using staged copy and PolyBase, you receive the following error:
 
     `Message=Query aborted-- the maximum reject threshold (0 rows) was reached while reading from an external source: 1 rows rejected out of total 415 rows processed. (/file_name.txt) Column ordinal: 18, Expected data type: DECIMAL(x,x), Offending value:..`
 
 - **Cause**: Azure Synapse Analytics PolyBase can't insert an empty string (null value) into a decimal column.
 
 - **Resolution**: In the copy activity sink, under PolyBase settings, set the **use type default** option to *false*.
 
     `Message: Java exception raised on call to HdfsBridge_CreateRecordReader. Java exception message: HdfsBridge::CreateRecordReader - Unexpected error encountered creating the record reader.: Error [HdfsBridge::CreateRecordReader - Unexpected error encountered creating the record reader.] occurred while accessing external file.....`
@@ -241,11 +232,10 @@ This article provides suggestions to troubleshoot common problems with the Azure
 - Time = 12 bytes
 - Tinyint = 1 byte
 
-- **Resolution**:
+- **Resolution**:
 - Reduce column width to less than 1 MB.
 - Or use a bulk insert approach by disabling PolyBase.
 
-
 ## Error message: The condition specified using HTTP conditional header(s) is not met
 
 - **Symptoms**: You use SQL query to pull data from Azure Synapse Analytics and receive the following error:
@@ -256,48 +246,51 @@ This article provides suggestions to troubleshoot common problems with the Azure
 
 - **Resolution**: Run the same query in SQL Server Management Studio (SSMS) and check to see whether you get the same result. If you do, open a support ticket to Azure Synapse Analytics and provide your Azure Synapse Analytics server and database name.
 
-
 ## Performance tier is low and leads to copy failure
 
 - **Symptoms**: You copy data into Azure SQL Database and receive the following error: `Database operation failed. Error message from database execution : ExecuteNonQuery requires an open and available Connection. The connection's current state is closed.`
 
 - **Cause**: Azure SQL Database s1 has hit input/output (I/O) limits.
 
-- **Resolution**: Upgrade the Azure SQL Database performance tier to fix the issue.
-
+- **Resolution**: Upgrade the Azure SQL Database performance tier to fix the issue.
 
-## SQL table can't be found
+## SQL table can't be found
 
 - **Symptoms**: You copy data from hybrid into an on-premises SQL Server table and receive the following error: `Cannot find the object "dbo.Contoso" because it does not exist or you do not have permissions.`
 
 - **Cause**: The current SQL account doesn't have sufficient permissions to execute requests issued by .NET SqlBulkCopy.WriteToServer.
 
 - **Resolution**: Switch to a more privileged SQL account.
 
-
 ## Error message: String or binary data is truncated
 
-- **Symptoms**: An error occurs when you copy data into an on-premises Azure SQL Server table.
+- **Symptoms**: An error occurs when you copy data into an on-premises Azure SQL Server table.
 
-- **Cause**: The Cx SQL table schema definition has one or more columns with less length than expected.
+- **Cause**: The SQL table schema definition has one or more columns with less length than expected.
 
 - **Resolution**: To resolve the issue, try the following:
 
-1. To troubleshoot which rows have the issue, apply SQL sink [fault tolerance](./copy-activity-fault-tolerance.md), especially "redirectIncompatibleRowSettings."
+1. To troubleshoot which rows have the issue, apply SQL sink [fault tolerance](./copy-activity-fault-tolerance.md), especially `redirectIncompatibleRowSettings`.
 
-> [!NOTE]
-> Fault tolerance might require additional execution time, which could lead to higher costs.
+> [!NOTE]
+> Fault tolerance might require additional execution time, which could lead to higher costs.
 
-2. Double-check the redirected data against the SQL table schema column length to see which columns need to be updated.
+1. Double-check the redirected data against the SQL table schema column length to see which columns need to be updated.
 
-3. Update the table schema accordingly.
+1. Update the table schema accordingly.
 
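To double-check redirected data against the sink schema (step 2 above), a query along these lines can help; the table and column names are hypothetical stand-ins, not names from the diffed article:

```sql
-- Hypothetical: find the longest incoming value for a suspect column
-- among the rows captured by fault tolerance.
SELECT MAX(LEN(ProductLabel)) AS LongestIncomingValue
FROM dbo.RedirectedRows;

-- Hypothetical: the sink column's defined length, for comparison.
SELECT COLUMNPROPERTY(OBJECT_ID('dbo.DimProduct'), 'ProductLabel', 'Precision') AS SinkColumnLength;
```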
 ## Error code: FailedDbOperation
 
 - **Message**: `User does not have permission to perform this action.`
 
 - **Recommendation**: Make sure the user configured in the Azure Synapse Analytics connector has 'CONTROL' permission on the target database while using PolyBase to load data. For more detailed information, refer to this [document](./connector-azure-sql-data-warehouse.md#required-database-permission).
 
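As a sketch of the permission described above, granted to a hypothetical loading user on a hypothetical target database:

```sql
-- Hypothetical database and user names; CONTROL on the target database
-- is required when PolyBase is used to load data.
GRANT CONTROL ON DATABASE::[TargetSynapseDW] TO [adf_loader_user];
```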
+## Error code: Msg 105208
+
+- **Symptoms**: `Error code: Msg 105208, Level 16, State 1, Line 1 COPY statement failed with the following error when validating value of option 'FROM': '105200;COPY statement failed because the value for option 'FROM' is invalid.'`
+- **Cause**: Currently, ingesting data using the COPY command into an Azure Storage account that is using the new DNS partitioning feature results in an error. The DNS partition feature enables customers to create up to 5,000 storage accounts per subscription.
+- **Resolutions**: Provision a storage account in a subscription that does not use the new [Azure Storage DNS partition feature](https://techcommunity.microsoft.com/t5/azure-storage-blog/public-preview-create-additional-5000-azure-storage-accounts/ba-p/3465466) (currently in Public Preview).
+
 ## Next steps
 
 For more troubleshooting help, try these resources:
articles/synapse-analytics/sql-data-warehouse/sql-data-warehouse-load-from-azure-data-lake-store.md (16 additions, 16 deletions)
@@ -1,14 +1,13 @@
 ---
-title: 'Tutorial load data from Azure Data Lake Storage'
+title: "Tutorial load data from Azure Data Lake Storage"
 description: Use the COPY statement to load data from Azure Data Lake Storage for dedicated SQL pools.
 author: WilliamDAssafMSFT
-manager: craigg
+ms.author: wiassaf
+ms.reviewer: joanpo
+ms.date: 09/02/2022
 ms.service: synapse-analytics
+ms.subservice: sql-dw
 ms.topic: conceptual
-ms.subservice: sql-dw
-ms.date: 11/20/2020
-ms.author: wiassaf
-ms.reviewer: wiassaf
 ms.custom: azure-synapse
 ---
@@ -33,11 +32,12 @@ Before you begin this tutorial, download and install the newest version of [SQL
 To run this tutorial, you need:
 
 * A dedicated SQL pool. See [Create a dedicated SQL pool and query data](create-data-warehouse-portal.md).
-* A Data Lake Storage account. See [Get started with Azure Data Lake Storage](../../data-lake-store/data-lake-store-get-started-portal.md?toc=/azure/synapse-analytics/sql-data-warehouse/toc.json&bc=/azure/synapse-analytics/sql-data-warehouse/breadcrumb/toc.json). For this storage account, you will need to configure or specify one of the following credentials to load: A storage account key, shared access signature (SAS) key, an Azure Directory Application user, or an AAD user which has the appropriate Azure role to the storage account.
+* A Data Lake Storage account. See [Get started with Azure Data Lake Storage](../../data-lake-store/data-lake-store-get-started-portal.md?toc=/azure/synapse-analytics/sql-data-warehouse/toc.json&bc=/azure/synapse-analytics/sql-data-warehouse/breadcrumb/toc.json). For this storage account, you will need to configure or specify one of the following credentials to load: A storage account key, shared access signature (SAS) key, an Azure Directory Application user, or an Azure AD user that has the appropriate Azure role to the storage account.
+* Currently, ingesting data using the COPY command into an Azure Storage account that is using the new [Azure Storage DNS partition feature](https://techcommunity.microsoft.com/t5/azure-storage-blog/public-preview-create-additional-5000-azure-storage-accounts/ba-p/3465466) results in an error. Provision a storage account in a subscription that does not use DNS partitioning for this tutorial.
 
 ## Create the target table
 
-Connect to your dedicated SQL pool and create the target table you will to load to. In this example, we are creating a product dimension table.
+Connect to your dedicated SQL pool and create the target table you will load to. In this example, we are creating a product dimension table.
 
 ```sql
 -- A: Create the target table
@@ -56,26 +56,25 @@ WITH
 );
 ```
 
-
 ## Create the COPY statement
 
 Connect to your SQL dedicated pool and run the COPY statement. For a complete list of examples, visit the following documentation: [Securely load data using dedicated SQL pools](./quickstart-bulk-load-copy-tsql-examples.md).
 
 ```sql
 -- B: Create and execute the COPY statement
 
-COPY INTO [dbo].[DimProduct]
---The column list allows you map, omit, or reorder input file columns to target table columns.
+COPY INTO [dbo].[DimProduct]
+--The column list allows you map, omit, or reorder input file columns to target table columns.
 --You can also specify the default value when there is a NULL value in the file.
 --When the column list is not specified, columns will be mapped based on source and target ordinality
 (
-    ProductKey default -1 1,
-    ProductLabel default 'myStringDefaultWhenNull' 2,
-    ProductName default 'myStringDefaultWhenNull' 3
+    ProductKey default -1 1,
+    ProductLabel default 'myStringDefaultWhenNull' 2,
+    ProductName default 'myStringDefaultWhenNull' 3
 )
 --The storage account location where your data is staged
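The diff above cuts off before the storage location and credential clauses of the COPY statement. For orientation only, a minimal self-contained sketch follows; the storage URL, credential, and format options are placeholder assumptions, not the tutorial's actual values:

```sql
-- Placeholder values throughout; see the COPY (Transact-SQL) reference
-- for the full option list.
COPY INTO [dbo].[DimProduct]
FROM 'https://<account>.blob.core.windows.net/<container>/<folder>/'
WITH (
    FILE_TYPE = 'CSV',
    CREDENTIAL = (IDENTITY = 'Storage Account Key', SECRET = '<account-key>'),
    FIELDTERMINATOR = '|'
);
```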
0 commit comments