
Commit f7b3c50

Clare Zheng (Shanghai Wicresoft Co Ltd) authored and committed
Update according to comments
1 parent 195bf39 · commit f7b3c50

5 files changed: +36 -11 lines changed

articles/data-factory/connector-deprecation-frequently-asked-questions.md

Lines changed: 3 additions & 1 deletion
```diff
@@ -6,13 +6,15 @@ ms.author: krirukm
 ms.service: azure-data-factory
 ms.subservice: data-movement
 ms.topic: concept-article
-ms.date: 05/27/2025
+ms.date: 07/11/2025
 ms.custom:
 - build-2025
 ---

 # Connector upgrade FAQ

+[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
+
 This article provides answers to frequently asked questions about connector upgrade.

 ## Why does Azure Data Factory (ADF) release new connectors and ask users to upgrade their existing connectors?
```

articles/data-factory/connector-lifecycle-overview.md

Lines changed: 24 additions & 7 deletions
````diff
@@ -14,6 +14,8 @@ ms.date: 07/10/2025

 # Connector lifecycle overview

+[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
+
 In Azure Data Factory, the introduction of the connector lifecycle ensures that customers always have access to the most reliable, secure, and feature-rich connectors. With the structured lifecycle stages, major connector upgrade evolves through distinct lifecycle stages, from preview to general availability and end of support, providing clear expectations for stability, support, and future enhancements. This lifecycle framework guarantees that users can seamlessly adopt new connectors with confidence, benefit from regular performance and security updates, and prepare in advance for any phase-out of older versions. By utilizing versioning within the connector lifecycle, the service empowers users with a predictable, transparent, and future-proof integration experience, reducing operational risks and enhancing overall workload reliability.

 ## Release rhythm
@@ -80,21 +82,36 @@ These auto-upgraded workloads are not affected by the announced removal date of

 You can identify which activities have been automatically upgraded by inspecting the activity output, where relevant upgrade information is recorded.

+**Example:**
+
+Copy activity output
+
+```json
+"source": {
+    "type": "AmazonS3",
+    "autoUpgrade": "true"
+}
+
+"sink": {
+    "type": "AmazonS3",
+    "autoUpgrade": "true"
+}
+```
+
 > [!NOTE]
 > While compatibility mode offers flexibility, we strongly encourage users to upgrade to the latest GA version as soon as possible to benefit from ongoing improvements, optimizations, and full support.

 You can find more details from the table below on the connector list that is planned for the automatic upgrade.

 | Connector | Scenario |
 |------------------|----------|
-| [Google BigQuery](connector-google-bigquery.md) | Scenario that does not rely on below capability in Google BigQuery V1:<br><br> • Use `trustedCertsPath`, `additionalProjects`, `requestgoogledrivescope` connection properties.<br> • Set `useSystemTrustStore` connection property as `false`.<br> • Use **STRUCT** and **ARRAY** data types. |
-| [Teradata](connector-teradata.md) | Scenario that does not rely on below capability in Teradata (version 1.0):<br><br> • Set below value for **CharacterSet**:<br>&nbsp;&nbsp;• BIG5 (TCHBIG5_1R0)<br>&nbsp;&nbsp;• EUC (Unix compatible, KANJIEC_0U)<br>&nbsp;&nbsp;• GB (SCHGB2312_1T0)<br>&nbsp;&nbsp;• IBM Mainframe (KANJIEBCDIC5035_0I)<br>&nbsp;&nbsp;• NetworkKorean (HANGULKSC5601_2R4)<br>&nbsp;&nbsp;• Shift-JIS (Windows, DOS compatible, KANJISJIS_0S)|
-| [Spark](connector-spark.md) | Scenario that does not rely on below capability in Spark (version 1.0):<br><br>• Authentication types:<br>&nbsp;&nbsp;• Username<br>• Thrift transport protocol:<br>&nbsp;&nbsp;• SASL<br>&nbsp;&nbsp;• Binary<br>• Thrift transport protocol:<br>&nbsp;&nbsp;• SharkServer<br>&nbsp;&nbsp;• SharkServer2 |
-| [Impala](connector-impala.md) | Scenario that does not rely on below capability in Impala (version 1.0):<br><br>• Authentication types:<br>&nbsp;&nbsp;• SASL Username |
+| [Amazon Redshift](connector-amazon-redshift.md) | Scenario that does not rely on below capability in Amazon Redshift (version 1.0):<br><br>• Linked service that uses Azure integration runtime.<br>• Use [UNLOAD](connector-amazon-redshift.md#use-unload-to-copy-data-from-amazon-redshift).<br><br>Automatic upgrade is only applicable when the driver is installed in your machine that installs the self-hosted integration runtime (version 5.56 or above).<br><br> For more information, go to [Install Amazon Redshift ODBC driver for the version 2.0](connector-amazon-redshift.md#install-amazon-redshift-odbc-driver-for-the-version-20).|
+| [Google BigQuery](connector-google-bigquery.md) | Scenario that does not rely on below capability in Google BigQuery V1:<br><br> • Use `trustedCertsPath`, `additionalProjects`, `requestgoogledrivescope` connection properties.<br> • Set `useSystemTrustStore` connection property as `false`.<br> • Use **STRUCT** and **ARRAY** data types. <br><br>If your pipeline runs on self-hosted integration runtime, it requires SHIR version 5.55 or above. |
 | [Hive](connector-hive.md) | Scenario that does not rely on below capability in Hive (version 1.0):<br><br>• Authentication types:<br>&nbsp;&nbsp;• Username<br>• Thrift transport protocol:<br>&nbsp;&nbsp;• HiveServer1<br>• Service discovery mode: True<br>• Use native query: True |
-| [Vertica](connector-vertica.md) | Scenario that does not rely on below capability in Vertica (version 1.0):<br><br>• Linked service that uses Azure integration runtime.<br><br>Automatic upgrade is only applicable when the driver is installed in your machine that installs the self-hosted integration runtime.<br><br> For more information, go to [Install Vertica ODBC driver for the version 2.0](connector-vertica.md#install-vertica-odbc-driver-for-the-version-20). |
-| [Amazon Redshift](connector-amazon-redshift.md) | Scenario that does not rely on below capability in Amazon Redshift (version 1.0):<br><br>• Linked service that uses Azure integration runtime.<br>• Use [UNLOAD](connector-amazon-redshift.md#use-unload-to-copy-data-from-amazon-redshift).<br><br>Automatic upgrade is only applicable when the driver is installed in your machine that installs the self-hosted integration runtime.|
-
+| [Impala](connector-impala.md) | Scenario that does not rely on below capability in Impala (version 1.0):<br><br>• Authentication types:<br>&nbsp;&nbsp;• SASL Username<br><br>If your pipeline runs on self-hosted integration runtime, it requires SHIR version 5.55 or above. |
+| [Spark](connector-spark.md) | Scenario that does not rely on below capability in Spark (version 1.0):<br><br>• Authentication types:<br>&nbsp;&nbsp;• Username<br>• Thrift transport protocol:<br>&nbsp;&nbsp;• SASL<br>&nbsp;&nbsp;• Binary<br>• Thrift transport protocol:<br>&nbsp;&nbsp;• SharkServer<br>&nbsp;&nbsp;• SharkServer2<br><br>If your pipeline runs on self-hosted integration runtime, it requires SHIR version 5.55 or above.|
+| [Teradata](connector-teradata.md) | Scenario that does not rely on below capability in Teradata (version 1.0):<br><br> • Set below value for **CharacterSet**:<br>&nbsp;&nbsp;• BIG5 (TCHBIG5_1R0)<br>&nbsp;&nbsp;• EUC (Unix compatible, KANJIEC_0U)<br>&nbsp;&nbsp;• GB (SCHGB2312_1T0)<br>&nbsp;&nbsp;• IBM Mainframe (KANJIEBCDIC5035_0I)<br>&nbsp;&nbsp;• NetworkKorean (HANGULKSC5601_2R4)<br>&nbsp;&nbsp;• Shift-JIS (Windows, DOS compatible, KANJISJIS_0S)<br><br> If your pipeline runs on self-hosted integration runtime, it requires SHIR version 5.56 or above.|
+| [Vertica](connector-vertica.md) | Scenario that does not rely on below capability in Vertica (version 1.0):<br><br>• Linked service that uses Azure integration runtime.<br><br>Automatic upgrade is only applicable when the driver is installed in your machine that installs the self-hosted integration runtime (version 5.55 or above).<br><br> For more information, go to [Install Vertica ODBC driver for the version 2.0](connector-vertica.md#install-vertica-odbc-driver-for-the-version-20). |


 ## Related content
````
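
If you need to audit a batch of pipeline runs for this flag, the sample output added above lends itself to a small script. The sketch below is illustrative only and is not part of the committed docs: it assumes each copy activity output is available as a JSON document whose `source` and `sink` objects carry `type` and `autoUpgrade` exactly as in the example, and it leaves out how the output itself is retrieved.

```python
import json

def find_auto_upgraded(activity_output: dict) -> list[str]:
    """Return which endpoints of a copy activity were auto-upgraded,
    based on the autoUpgrade flag shown in the sample output above."""
    upgraded = []
    for endpoint in ("source", "sink"):
        details = activity_output.get(endpoint) or {}
        # The flag appears as the string "true" in the sample output.
        if str(details.get("autoUpgrade", "")).lower() == "true":
            upgraded.append(f"{endpoint} ({details.get('type', 'unknown')})")
    return upgraded

# Hypothetical output document, shaped like the example in the diff above.
sample_output = json.loads("""
{
  "source": { "type": "AmazonS3", "autoUpgrade": "true" },
  "sink":   { "type": "AmazonS3", "autoUpgrade": "true" }
}
""")

print(find_auto_upgraded(sample_output))  # ['source (AmazonS3)', 'sink (AmazonS3)']
```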

articles/data-factory/connector-release-stages-and-timelines.md

Lines changed: 3 additions & 1 deletion
```diff
@@ -7,11 +7,13 @@ ms.service: azure-data-factory
 ms.subservice: data-movement
 ms.topic: concept-article
 ms.custom: references_regions
-ms.date: 06/17/2025
+ms.date: 07/11/2025
 ---

 # Connector release stages and timelines

+[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
+
 This article provides an overview of the release stages and timelines for each connector available in Azure Data Factory.
 For comprehensive details on support levels and recommended usage at each stage, please see [this article](connector-lifecycle-overview.md#release-rhythm).
```

articles/data-factory/connector-upgrade-advisor.md

Lines changed: 3 additions & 1 deletion
```diff
@@ -9,11 +9,13 @@ ms.topic: concept-article
 ms.custom:
 - references_regions
 - build-2025
-ms.date: 06/30/2025
+ms.date: 07/11/2025
 ---

 # Connector upgrade advisor in Azure Data Factory and Azure Synapse Analytics

+[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
+
 This article describes Connector upgrade advisor in Azure Data Factory and Azure Synapse Analytics.

 To learn more, see [Upgrade plan for Azure Data Factory connectors](connector-deprecation-plan.md).
```

articles/data-factory/connector-upgrade-guidance.md

Lines changed: 3 additions & 1 deletion
```diff
@@ -9,11 +9,13 @@ ms.topic: concept-article
 ms.custom:
 - references_regions
 - build-2025
-ms.date: 06/06/2025
+ms.date: 07/11/2025
 ---

 # Connector upgrade guidance

+[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
+
 This article provides guidance for upgrading connectors in Azure Data Factory.

 ## How to receive notifications in Azure Service Health portal
```

0 commit comments
