articles/data-factory/connector-troubleshoot-guide.md (37 additions, 37 deletions)

The errors below are general to the copy activity and could occur with any connector.

#### Error code: 20000 (previously `JreNotFound`)

- **Message**: `Java Runtime Environment cannot be found on the Self-hosted Integration Runtime machine. It is required for parsing or writing to Parquet/ORC files. Make sure Java Runtime Environment has been installed on the Self-hosted Integration Runtime machine.`

- **Recommendation**: Check your integration runtime environment; see [Use Self-hosted Integration Runtime](./format-parquet.md#using-self-hosted-integration-runtime).

#### Error code: 20002 (previously `JniException`)

- **Message**: `An error occurred when invoking Java Native Interface.`

- **Cause**: If the error message contains "Cannot create JVM: JNI return code [-6][JNI call failed: Invalid arguments.]", the possible cause is that the JVM can't be created because some illegal (global) arguments are set.

- **Recommendation**: Sign in to the machine that hosts *each node* of your self-hosted integration runtime. Check that the `_JAVA_OPTIONS` system variable is set correctly, for example `_JAVA_OPTIONS = "-Xms256m -Xmx16g"`, with a maximum heap size (`-Xmx`) larger than 8 GB. Restart all the integration runtime nodes, and then rerun the pipeline.

#### Error code: 20020 (previously `WildcardPathSinkNotSupported`)

- **Message**: `Wildcard in path is not supported in sink dataset. Fix the path: '%setting;'.`

3. Save the file, and then restart the Self-hosted IR machine.

- **Message**: `'validateDataConsistency' is not supported in this version ('%version;') of Self Hosted Integration Runtime.`

- **Recommendation**: Check the supported integration runtime version and upgrade it to a higher version, or remove the 'validateDataConsistency' property from copy activities, as sketched below.
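
For reference, a minimal sketch of where `validateDataConsistency` sits in a copy activity payload (the activity name and source/sink types are placeholders); remove this property, or upgrade the self-hosted IR, to clear the error:

```json
{
    "name": "CopyWithConsistencyCheck",
    "type": "Copy",
    "typeProperties": {
        "source": { "type": "BinarySource" },
        "sink": { "type": "BinarySink" },
        "validateDataConsistency": true
    }
}
```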

- **Message**: `Skip inconsistency is not supported in current copy activity settings, it's only supported with direct binary copy when validateDataConsistency is true.`

- **Recommendation**: Remove 'dataInconsistency' from the skipErrorFile setting in the copy activity payload, as in the sketch below.
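
For reference, a minimal sketch of the `skipErrorFile` block in a copy activity payload; `dataInconsistency` is the flag the recommendation says to remove unless you're doing a direct binary copy with `validateDataConsistency` enabled (the activity name and source/sink types are placeholders):

```json
{
    "name": "CopyWithSkipSettings",
    "type": "Copy",
    "typeProperties": {
        "source": { "type": "DelimitedTextSource" },
        "sink": { "type": "DelimitedTextSink" },
        "skipErrorFile": {
            "dataInconsistency": true
        }
    }
}
```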

- **Message**: `'deleteFilesAfterCompletion' is not supported for this connector: ('%connectorName;').`

- **Recommendation**: Remove the 'deleteFilesAfterCompletion' setting in the copy activity payload; the sketch below shows where it typically appears.
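
A minimal sketch of a copy activity source where `deleteFilesAfterCompletion` is commonly set, assuming a Blob Storage binary source (the activity name and type names are placeholders); if the connector named in the error doesn't support it, remove that property:

```json
{
    "name": "CopyThenDeleteSourceFiles",
    "type": "Copy",
    "typeProperties": {
        "source": {
            "type": "BinarySource",
            "storeSettings": {
                "type": "AzureBlobStorageReadSettings",
                "recursive": true,
                "deleteFilesAfterCompletion": true
            }
        },
        "sink": { "type": "BinarySink" }
    }
}
```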

#### Error code: 27002 (previously `FailedToDownloadCustomPlugins`)

- **Message**: `Failed to download custom plugins.`
## General connector errors

#### Error code: 9611 (previously `UserErrorOdbcInvalidQueryString`)

- **Message**: `The following ODBC Query is not valid: '%'.`

- **Cause**: You provided a wrong or invalid query to fetch the data or schemas.

- **Recommendation**: Verify that your query is valid and can return data or schemas. Use the [Script activity](transform-data-using-script.md) if you want to execute non-query scripts and your data store is supported. Alternatively, consider using a stored procedure that returns a dummy result to execute your non-query scripts; a sketch of a Script activity payload follows.
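
If you take the Script activity route, a minimal sketch of a non-query script definition (the activity name, linked service reference, and SQL text are placeholders):

```json
{
    "name": "RunNonQueryScript",
    "type": "Script",
    "linkedServiceName": {
        "referenceName": "OdbcLinkedService",
        "type": "LinkedServiceReference"
    },
    "typeProperties": {
        "scripts": [
            {
                "type": "NonQuery",
                "text": "TRUNCATE TABLE staging_orders"
            }
        ]
    }
}
```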

- **Message**: `The parameters and expression cannot be resolved for schema operations. …The template function 'linkedService' is not defined or not valid.`

- **Cause**: The service doesn't support test connection or data preview for a linked service that references another linked service with parameters. For example, passing a parameter from a Key Vault to a linked service can cause this issue.

- **Recommendation**: Remove the parameters in the referenced linked service to eliminate the error. Otherwise, run the pipeline without testing the connection or previewing data.

- **Message**: `Failed to connect to your instance of Azure Database for PostgreSQL flexible server.`

- **Cause**: The user or password provided is incorrect, the selected encryption method isn't compatible with the configuration of the server, or the network connectivity method configured for your instance doesn't allow connections from the selected integration runtime.

- **Recommendation**: Confirm that the user provided exists in your instance of PostgreSQL and that the password corresponds to the one currently assigned to that user. Make sure that the selected encryption method is accepted by your instance of PostgreSQL, based on its current configuration. If the network connectivity method of your instance is configured for Private access (VNet integration), use a self-hosted integration runtime (IR) to connect to it. If it's configured for Public access (allowed IP addresses), the recommended approach is to use an Azure IR with a managed virtual network and deploy a managed private endpoint to connect to your instance. A less recommended alternative for Public access is to create firewall rules in your instance that allow traffic originating from the IP addresses used by your Azure IR. A sketch of a linked service that routes the connection through a self-hosted IR follows.
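
For the private-access case, a minimal sketch of an Azure Database for PostgreSQL linked service that routes the connection through a self-hosted IR by using `connectVia` (the linked service name, IR name, and connection string are placeholders):

```json
{
    "name": "AzurePostgreSqlLinkedService",
    "properties": {
        "type": "AzurePostgreSql",
        "typeProperties": {
            "connectionString": "<connection string for your flexible server>"
        },
        "connectVia": {
            "referenceName": "MySelfHostedIR",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```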
This article provides suggestions to troubleshoot common problems with the Azure Database for PostgreSQL connector in Azure Data Factory and Azure Synapse.

- **Message**: `Partition column name must be specified.`

- **Cause**: No partition column name is provided, and it couldn't be decided automatically.

## Error code: 23705

- **Message**: `The data type of the chosen Partition Column, '%partitionColumn;', is '%dataType;' and this data type is not supported for partitioning.`

- **Recommendation**: Pick a partition column with an int, bigint, smallint, serial, bigserial, smallserial, timestamp with or without time zone, time without time zone, or date data type. The sketch below shows where the partition column is set.
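
For context, a minimal sketch of a copy activity source that sets the partition column for Azure Database for PostgreSQL; the partition property names and values here are assumptions based on the connector's parallel-copy settings, so confirm them against the connector article for your connector version:

```json
{
    "name": "CopyFromPostgreSqlPartitioned",
    "type": "Copy",
    "typeProperties": {
        "source": {
            "type": "AzurePostgreSqlSource",
            "partitionOption": "DynamicRange",
            "partitionSettings": {
                "partitionColumnName": "order_id",
                "partitionLowerBound": "1",
                "partitionUpperBound": "1000000"
            }
        },
        "sink": { "type": "ParquetSink" }
    }
}
```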
## Related content
For more troubleshooting help, try these resources: