articles/data-factory/connector-upgrade-guidance.md
Lines changed: 7 additions & 1 deletion
@@ -9,7 +9,7 @@ ms.topic: concept-article
ms.custom:
  - references_regions
  - build-2025
-ms.date: 08/06/2025
+ms.date: 08/18/2025
---

# Connector upgrade guidance
@@ -110,11 +110,17 @@ You can find more details from the table below on the connector list that is pla
|[Amazon RDS for Oracle](connector-amazon-rds-for-oracle.md)| Scenarios that don't rely on the following capabilities in Oracle (version 1.0):<br><br>• Use of procedureRetResults, batchFailureReturnsError, truststore, and truststorepassword as connection properties.<br>• Queries that end with a semicolon.<br><br>If your pipeline runs on self-hosted integration runtime, it requires SHIR version 5.56 or above. |
|[Amazon Redshift](connector-amazon-redshift.md)| Scenarios that don't rely on the following capabilities in Amazon Redshift (version 1.0):<br><br>• Linked service that uses the Azure integration runtime.<br>• Use of [UNLOAD](connector-amazon-redshift.md#use-unload-to-copy-data-from-amazon-redshift).<br><br>Automatic upgrade is only applicable when the driver is installed on the machine that hosts the self-hosted integration runtime.<br><br>For more information, go to [Install Amazon Redshift ODBC driver for the version 2.0](connector-amazon-redshift.md#install-amazon-redshift-odbc-driver-for-the-version-20).|
|[Google BigQuery](connector-google-bigquery.md)| Scenarios that don't rely on the following capabilities in Google BigQuery V1:<br><br>• Use of the `trustedCertsPath`, `additionalProjects`, or `requestgoogledrivescope` connection properties.<br>• Setting the `useSystemTrustStore` connection property to `false`.<br>• Use of **STRUCT** and **ARRAY** data types.<br><br>If your pipeline runs on self-hosted integration runtime, it requires SHIR version 5.55 or above. |
+|[Greenplum](connector-greenplum.md)| If your pipeline runs on self-hosted integration runtime, it requires SHIR version 5.56 or above. |
|[Hive](connector-hive.md)| Scenarios that don't rely on the following capabilities in Hive (version 1.0):<br><br>• Authentication types:<br> • Username<br>• Thrift transport protocol:<br> • HiveServer1<br>• Service discovery mode: True<br>• Use native query: True<br><br>If your pipeline runs on self-hosted integration runtime, it requires SHIR version 5.55 or above.|
|[Impala](connector-impala.md)| Scenarios that don't rely on the following capabilities in Impala (version 1.0):<br><br>• Authentication types:<br> • SASL Username<br><br>If your pipeline runs on self-hosted integration runtime, it requires SHIR version 5.55 or above. |
+|[MariaDB](connector-mariadb.md)| If your pipeline runs on self-hosted integration runtime, it requires SHIR version 5.56 or above. |
+|[MySQL](connector-mysql.md)| If your pipeline runs on self-hosted integration runtime, it requires SHIR version 5.56 or above. |
|[Oracle](connector-oracle.md)| Scenarios that don't rely on the following capabilities in Oracle (version 1.0):<br><br>• Use of procedureRetResults, batchFailureReturnsError, truststore, and truststorepassword as connection properties.<br>• Use of the Oracle connector as a sink.<br>• Queries that end with a semicolon (see the sketch after this table).<br>• Use of PL/SQL commands in the Script activity.<br>• Use of script parameters in the Script activity.<br><br>If your pipeline runs on self-hosted integration runtime, it requires SHIR version 5.56 or above. |
+|[PostgreSQL](connector-postgresql.md)| If your pipeline runs on self-hosted integration runtime, it requires SHIR version 5.56 or above. |
+|[Presto](connector-presto.md)| Scenarios that don't rely on the following capabilities in Presto (version 1.0):<br><br>• Use of MAP, ARRAY, or ROW data types.<br>• Use of trustedCertPath or allowSelfSignedServerCert (will be supported soon).<br><br>If your pipeline runs on self-hosted integration runtime, it requires SHIR version 5.57 or above. |
|[Salesforce](connector-salesforce.md)| Scenarios that don't rely on the following capabilities in Salesforce V1:<br><br>• SOQL queries that use:<br> • TYPEOF clauses<br> • Compound address or geolocation fields<br>• SQL-92 queries<br>• Report queries {call "\<report name>"}<br>• Use of the self-hosted integration runtime (to be supported) |
|[Salesforce Service Cloud](connector-salesforce-service-cloud.md)| Scenarios that don't rely on the following capabilities in Salesforce Service Cloud V1:<br><br>• SOQL queries that use:<br> • TYPEOF clauses<br> • Compound address or geolocation fields<br>• SQL-92 queries<br>• Report queries {call "\<report name>"}<br>• Use of the self-hosted integration runtime (to be supported) |
+|[Snowflake](connector-snowflake.md)| Scenarios that don't rely on the following capabilities in Snowflake V1:<br><br>• Use of any of the following properties: connection_timeout, disableocspcheck, enablestaging, on_error, query_tag, quoted_identifiers_ignore_case, skip_header, stage, table, timezone, token, validate_utf8, no_proxy, nonproxyhosts, noproxy.<br>• Use of multi-statement queries in the Script activity or Lookup activity.<br><br>If your pipeline runs on self-hosted integration runtime, it requires SHIR version 5.56 or above. |
|[Spark](connector-spark.md)| Scenarios that don't rely on the following capabilities in Spark (version 1.0):<br><br>• Authentication types:<br> • Username<br>• Thrift transport protocol:<br> • SASL<br> • Binary<br>• Server type:<br> • SharkServer<br> • SharkServer2<br><br>If your pipeline runs on self-hosted integration runtime, it requires SHIR version 5.55 or above.|
|[Teradata](connector-teradata.md)| Scenarios that don't rely on the following capability in Teradata (version 1.0):<br><br>• Setting **CharacterSet** to one of the following values:<br> • BIG5 (TCHBIG5_1R0)<br> • EUC (Unix compatible, KANJIEC_0U)<br> • GB (SCHGB2312_1T0)<br> • IBM Mainframe (KANJIEBCDIC5035_0I)<br> • NetworkKorean (HANGULKSC5601_2R4)<br> • Shift-JIS (Windows, DOS compatible, KANJISJIS_0S)|
|[Vertica](connector-vertica.md)| Scenarios that don't rely on the following capability in Vertica (version 1.0):<br><br>• Linked service that uses the Azure integration runtime.<br><br>Automatic upgrade is only applicable when the driver is installed on the machine that hosts the self-hosted integration runtime (version 5.55 or above).<br><br>For more information, go to [Install Vertica ODBC driver for the version 2.0](connector-vertica.md#install-vertica-odbc-driver-for-the-version-20). |
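As a rough illustration of two limitations called out above (queries that end with a semicolon for the Oracle and Amazon RDS for Oracle connectors, and multi-statement scripts for Snowflake), here is a minimal SQL sketch; the table and schema names are hypothetical.

```sql
-- Hypothetical copy-activity source query.
-- Oracle / Amazon RDS for Oracle version 1.0 tolerated a trailing semicolon:
SELECT EMP_ID, EMP_NAME FROM HR.EMPLOYEES;

-- Version 2.0 expects the same query without the trailing semicolon:
SELECT EMP_ID, EMP_NAME FROM HR.EMPLOYEES

-- Snowflake V1 accepted a multi-statement script like this in a Script or
-- Lookup activity; Snowflake V2 does not:
TRUNCATE TABLE STAGING.ORDERS;
INSERT INTO STAGING.ORDERS SELECT * FROM RAW.ORDERS;
```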