articles/data-factory/connector-upgrade-guidance.md (2 additions & 2 deletions)
@@ -107,12 +107,12 @@ You can find more details from the table below on the connector list that is pla
| Connector | Scenario |
|------------------|----------|
-|[Amazon RDS for Oracle](connector-amazon-rds-for-oracle.md)| Scenario that does not rely on capability below in Oracle (version 1.0):<br><br>• Use procedureRetResults, batchFailureReturnsError, truststore and truststorepassword as connection properties.<br>• Use query ends with the semicolon.<br><br>If your pipeline runs on self-hosted integration runtime, it requires SHIR version 5.56 or above. |
+|[Amazon RDS for Oracle](connector-amazon-rds-for-oracle.md)| Scenario that doesn't rely on the capabilities below in Oracle (version 1.0):<br><br>• Use procedureRetResults, batchFailureReturnsError, truststore, and truststorepassword as connection properties.<br>• Use a query that ends with a semicolon.<br><br>If your pipeline runs on a self-hosted integration runtime, it requires SHIR version 5.56 or above. |
|[Amazon Redshift](connector-amazon-redshift.md)| Scenario that doesn't rely on the capabilities below in Amazon Redshift (version 1.0):<br><br>• Linked service that uses the Azure integration runtime.<br>• Use [UNLOAD](connector-amazon-redshift.md#use-unload-to-copy-data-from-amazon-redshift).<br><br>Automatic upgrade is only applicable when the driver is installed on the machine that hosts the self-hosted integration runtime.<br><br>For more information, go to [Install Amazon Redshift ODBC driver for the version 2.0](connector-amazon-redshift.md#install-amazon-redshift-odbc-driver-for-the-version-20).|
|[Google BigQuery](connector-google-bigquery.md)| Scenario that doesn't rely on the capabilities below in Google BigQuery V1:<br><br> • Use the `trustedCertsPath`, `additionalProjects`, or `requestgoogledrivescope` connection properties.<br> • Set the `useSystemTrustStore` connection property to `false`.<br> • Use **STRUCT** and **ARRAY** data types.<br><br>If your pipeline runs on a self-hosted integration runtime, it requires SHIR version 5.55 or above. |
|[Hive](connector-hive.md)| Scenario that doesn't rely on the capabilities below in Hive (version 1.0):<br><br>• Authentication types:<br> • Username<br>• Thrift transport protocol:<br> • HiveServer1<br>• Service discovery mode: True<br>• Use native query: True<br><br>If your pipeline runs on a self-hosted integration runtime, it requires SHIR version 5.55 or above.|
|[Impala](connector-impala.md)| Scenario that doesn't rely on the capabilities below in Impala (version 1.0):<br><br>• Authentication types:<br> • SASL Username<br><br>If your pipeline runs on a self-hosted integration runtime, it requires SHIR version 5.55 or above. |
-|[Oracle](connector-oracle.md)| Scenario that does not rely on capability below in Oracle (version 1.0):<br><br>• Use procedureRetResults, batchFailureReturnsError, truststore and truststorepassword as connection properties.<br>• Use Oracle connector as sink.<br>• Use query ends with the semicolon.<br>• Use PL/SQL command in Script activity<br>• Use script parameters in Script activity<br><br>If your pipeline runs on self-hosted integration runtime, it requires SHIR version 5.56 or above. |
+|[Oracle](connector-oracle.md)| Scenario that doesn't rely on the capabilities below in Oracle (version 1.0):<br><br>• Use procedureRetResults, batchFailureReturnsError, truststore, and truststorepassword as connection properties.<br>• Use the Oracle connector as a sink.<br>• Use a query that ends with a semicolon.<br>• Use PL/SQL commands in the Script activity.<br>• Use script parameters in the Script activity.<br><br>If your pipeline runs on a self-hosted integration runtime, it requires SHIR version 5.56 or above. |
|[Salesforce](connector-salesforce.md)| Scenario that doesn't rely on the capabilities below in Salesforce V1:<br><br>• SOQL queries that use:<br> • TYPEOF clauses<br> • Compound address/geolocation fields<br>• All SQL-92 queries<br>• Report queries {call "\<report name>"}<br>• Use of the self-hosted integration runtime (to be supported) |
|[Salesforce Service Cloud](connector-salesforce-service-cloud.md)| Scenario that doesn't rely on the capabilities below in Salesforce Service Cloud V1:<br><br>• SOQL queries that use:<br> • TYPEOF clauses<br> • Compound address/geolocation fields<br>• All SQL-92 queries<br>• Report queries {call "\<report name>"}<br>• Use of the self-hosted integration runtime (to be supported) |
|[Spark](connector-spark.md)| Scenario that doesn't rely on the capabilities below in Spark (version 1.0):<br><br>• Authentication types:<br> • Username<br>• Thrift transport protocol:<br> • SASL<br> • Binary<br>• Server type:<br> • SharkServer<br> • SharkServer2<br><br>If your pipeline runs on a self-hosted integration runtime, it requires SHIR version 5.55 or above.|
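
Several rows above describe scenarios in terms of linked-service connection properties and the connector version they depend on. To make that concrete, here is a minimal sketch of a linked-service payload that pins a connector version; it assumes the Oracle connector selects version 2.0 through a `version` property and that the property names and placeholder values shown (server, authenticationType, the integration runtime reference) are illustrative rather than confirmed by this diff.

```json
{
    "name": "OracleLinkedService",
    "properties": {
        "type": "Oracle",
        "version": "2.0",
        "typeProperties": {
            "server": "<host>:<port>/<service name>",
            "authenticationType": "Basic",
            "username": "<username>",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            }
        },
        "connectVia": {
            "referenceName": "<self-hosted integration runtime, version 5.56 or above>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```

If a workload still depends on one of the version 1.0-only behaviors listed in the table (for example, queries that end with a semicolon), staying on version 1.0 until the workload is adjusted is the conservative choice.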